What are the performance parameters of a network? Explain in brief.

Network Performance Parameters

  • Network performance parameters show the Quality of Service (QoS) provided by the network from the customer's point of view.
  • The performance of a network can be measured using various standards, but it generally depends on the type, design, and nature of the network.
  • Measuring network performance using different parameters is both a qualitative and a quantitative process.

Some of the major parameters used to measure Network Performance are as follows:

  • Bandwidth
  • Throughput
  • Latency or Delay
  • Packet Loss
  • Retransmission
  • Jitter
  • Error Rate

Bandwidth

  • The amount of data or information that can be transmitted in a given amount of time is referred to as bandwidth.
  • It is often described as the speed of a network, i.e., how quickly information can be moved across it.
  • More bandwidth does not necessarily mean more speed; it only indicates the maximum data transmission rate possible on the network (see the small example after this list).
  • The bandwidth of digital devices is measured in bits per second (bps) or bytes per second (Bps).
  • The bandwidth of analog devices is measured in cycles per second, or Hertz (Hz).
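A rough back-of-the-envelope sketch in Python, using made-up link figures, of why bandwidth is an upper bound on transfer rather than a guaranteed speed:

```python
# Hypothetical 100 Mbps link: bandwidth bounds how much data can cross
# the link in a given time; actual transfers are usually slower.
link_bandwidth_bps = 100 * 10**6   # assumed 100 Mbps link
transfer_time_s = 6                # seconds

max_bits = link_bandwidth_bps * transfer_time_s
max_megabytes = max_bits / 8 / 10**6
print(f"At most {max_megabytes:.1f} MB can cross the link in {transfer_time_s} s")  # 75.0 MB
```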

Throughput

  • The number of data packets delivered successfully per unit time is called throughput.
  • The time unit refers to the time frame used to calculate the throughput.
  • Throughput is calculated starting from the arrival of the first bit of data at the receiver.
  • Throughput measures the network's actual data transmission rate.
  • Throughput depends on the available bandwidth and is affected by many factors such as the signal-to-noise ratio, the available processing power of system components, end-user behavior, and device limitations.
  • Therefore, the maximum throughput of a network is always higher than the actual throughput that is observed.
  • A low throughput may indicate that many packets are failing or being dropped and need to be sent again (a small calculation is sketched below this list).
  • It is measured in units of bits per second (bps), bytes per second (Bps), kilobytes per second (KBps), megabytes per second (MBps), and gigabytes per second (GBps).
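A minimal Python sketch, with hypothetical packet counts, showing that throughput is simply the data actually delivered per unit time:

```python
# Made-up measurement over a 10-second window on a nominal 100 Mbps link.
packets_delivered = 80_000
packet_size_bytes = 1_500
elapsed_seconds = 10

throughput_bps = packets_delivered * packet_size_bytes * 8 / elapsed_seconds
print(f"Throughput: {throughput_bps / 10**6:.1f} Mbps")  # 96.0 Mbps, below the 100 Mbps bandwidth
```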

Latency or Delay

  • Latency, also called delay, is the total time taken for a complete data packet to arrive at the destination: it starts when the first bit of the packet is sent out from the source and ends when the last bit of the packet is delivered at the destination.
  • In other words, it is the delay between a node or device requesting data and that data being fully delivered.
  • High latency indicates a major performance issue in the network (a simple delay breakdown is sketched below).
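A small sketch, with assumed link figures, using the common breakdown of one-way delay into transmission, propagation, queuing, and processing components:

```python
# Hypothetical 10 Mbps link spanning 2,000 km; queuing and processing
# delays are assumed values for illustration only.
packet_bits = 1_500 * 8
link_rate_bps = 10 * 10**6
distance_m = 2_000_000
propagation_speed_mps = 2 * 10**8   # roughly 2/3 the speed of light in cable/fibre
queuing_s = 0.002
processing_s = 0.001

transmission_s = packet_bits / link_rate_bps
propagation_s = distance_m / propagation_speed_mps
latency_ms = (transmission_s + propagation_s + queuing_s + processing_s) * 1000
print(f"Latency: {latency_ms:.2f} ms")  # 14.20 ms
```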

Packet Loss

  • Packet loss is the number of data packets that are lost while travelling from the source to the destination.
  • It is measured by recording data traffic at both the sender and the receiver sides (see the sketch after this list).
  • Several factors such as network congestion, router performance, and software problems can cause packet loss.
  • The more data packets are lost, the longer it takes to fulfill the data request.
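A simple Python sketch with made-up sender and receiver counters, comparing what was sent with what actually arrived:

```python
# Hypothetical counters recorded at sender and receiver.
packets_sent = 10_000
packets_received = 9_870

loss_rate_percent = (packets_sent - packets_received) / packets_sent * 100
print(f"Packet loss: {loss_rate_percent:.2f}%")  # 1.30%
```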

Retransmission

  • When packets are lost, the network has to retransmit those data packets to fulfill the data request.
  • Therefore, the retransmission rate also shows how often data packets are being dropped in the network (a simple calculation follows this list).
  • This is sometimes an indication of congestion on the network.
  • Also, the retransmission delay, i.e., how long it takes for a dropped packet to be retransmitted, indicates how long the network takes to recover from packet loss.
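An illustrative sketch with made-up counters (for example, as a TCP stack might report them), expressing the retransmission rate as the share of sent segments that had to be resent:

```python
# Hypothetical transport-layer counters.
segments_sent = 50_000
segments_retransmitted = 400

retransmission_rate = segments_retransmitted / segments_sent * 100
print(f"Retransmission rate: {retransmission_rate:.2f}%")  # 0.80%
```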

Jitter

  • This is sometimes also called Packet Delay Variation.
  • Jitter is the variation in time delay for data packets carried over a network.
  • It happens when different data packets experience different delays in the network, which disrupts the expected sequencing of the packets.
  • Jitter produces increased or uneven latency between data packets, which can degrade network performance and cause packet loss and congestion.
  • Some jitter is expected and can typically be tolerated, but quantifying network jitter is an integral part of measuring overall network performance (see the sketch below this list).
  • Jitter is measured in milliseconds (ms).
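A minimal sketch, assuming a list of hypothetical per-packet delays in milliseconds; one common way to estimate jitter is the average absolute difference between consecutive packet delays:

```python
# Made-up one-way delays measured for five consecutive packets (ms).
delays_ms = [20.1, 22.4, 19.8, 25.0, 21.3]

diffs = [abs(b - a) for a, b in zip(delays_ms, delays_ms[1:])]
jitter_ms = sum(diffs) / len(diffs)
print(f"Jitter: {jitter_ms:.2f} ms")  # 3.45 ms
```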

Error Rate

  • The error rate is measured in networks where digital transmission is carried out.
  • It is the number of bits received in error relative to the total number of bits received from the data stream over the communication channel.
  • It can be affected by noise, interference, distortion, or bit synchronization errors.
  • It is expressed as the Bit Error Rate (BER), which is the number of bit errors divided by the total number of bits transferred during a given time interval (see the small example below).
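A tiny sketch with made-up bit streams, counting how many received bits differ from the transmitted ones to get the BER:

```python
# Hypothetical 16-bit transmission with two bit errors.
transmitted = "1011001110001011"
received    = "1011001010001111"

bit_errors = sum(t != r for t, r in zip(transmitted, received))
ber = bit_errors / len(transmitted)
print(f"BER: {ber:.4f}")  # 2 errors out of 16 bits -> 0.1250
```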