Semester of Graduation
Electrical and Computer Engineering
First Major Professor
Dr. Hongwei Zhang
Master of Science (MS)
Jitter is defined as the variation in the delay of received packets, i.e., the varying component of latency on a packet flow between two systems, arising when some packets take longer than expected to travel from one system to the other. At the transmitting side, packets are typically sent in a continuous stream, spaced equally apart. Due to network congestion, improper queuing, channel properties, or configuration errors, this steady stream can become uneven: the delay between successive packets varies instead of remaining constant, which results in jitter. Jitter reduction is a major concern in mission-critical applications, which require stringent timing guarantees with jitter bounds on the order of 1 μs. Extensive research has been devoted to reducing jitter, and one widely adopted approach is the use of optimal scheduling algorithms for packet scheduling. In this report, I review several scheduling algorithms that have been used to reduce jitter and compare their results.
Geetika-Singh, FNU, "Analysis of jitter control using real time scheduling" (2020). Creative Components. 498.