Jitter


What is Jitter?

Jitter is distortion in a transmitted signal that occurs when the signal drifts from its reference position. It can be caused by variations in the timing or phase of the signal on an analog or digital transmission line.
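To give a concrete sense of how "drift from a reference position" is quantified, the following minimal Python sketch compares observed signal-edge times with ideally spaced reference edges and reports peak-to-peak and RMS deviation. The function name, sample edge times, and nominal period are illustrative assumptions, not part of this definition.

    import math

    def timing_jitter(edge_times, nominal_period):
        """Return (peak-to-peak, RMS) jitter of edge_times relative to an ideal clock."""
        # Ideal edge positions: evenly spaced, starting at the first observed edge.
        t0 = edge_times[0]
        deviations = [t - (t0 + i * nominal_period) for i, t in enumerate(edge_times)]
        peak_to_peak = max(deviations) - min(deviations)
        rms = math.sqrt(sum(d * d for d in deviations) / len(deviations))
        return peak_to_peak, rms

    # Example: edges of a 1 MHz clock (1 microsecond period) with small timing errors.
    edges_us = [0.00, 1.02, 1.98, 3.01, 4.00, 4.97, 6.03]
    pp, rms = timing_jitter(edges_us, nominal_period=1.0)
    print(f"peak-to-peak jitter: {pp:.3f} us, RMS jitter: {rms:.3f} us")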

Jitter typically results in a loss of data because of synchronization problems between the communicating stations, especially in high-speed transmissions.

Jitter is inherent in all forms of communication because of the finite response time of electrical circuitry to the rise and fall of signal voltages.

An ideal digital signal would have instantaneous rises and falls in voltage and would appear as a square wave on an oscilloscope.

The actual output of a digital signaling device has finite rise and fall times, so the signal appears rounded on the oscilloscope. These finite transition times produce phase variation that can cause a loss of synchronization between communicating devices.
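A rough, assumed illustration of how rounded edges translate into phase variation: if the rising edge is modeled as a linear ramp, any noise or offset on the receiver's switching threshold shifts the threshold-crossing time in proportion to the rise time. The formula and numbers below are a back-of-the-envelope sketch, not figures from this entry.

    def threshold_crossing_shift(t_rise, v_swing, delta_v):
        """Timing shift caused by a threshold/noise offset on a linear rising edge."""
        # For a linear ramp from 0 to v_swing over t_rise, a threshold offset of
        # delta_v moves the crossing time by t_rise * delta_v / v_swing.
        return t_rise * (delta_v / v_swing)

    # Example: 2 ns rise time, 3.3 V swing, 100 mV of noise on the threshold.
    shift = threshold_crossing_shift(t_rise=2e-9, v_swing=3.3, delta_v=0.1)
    print(f"edge timing shifts by about {shift * 1e12:.0f} ps")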

The goal in designing a transmission device is to ensure that jitter remains small enough that it does not cause appreciable data loss.
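In practice, that acceptable range is often expressed as a fraction of the bit period, or unit interval (UI). The sketch below shows that kind of check; the 0.3 UI limit is an illustrative assumption, not a quoted specification.

    def within_jitter_budget(jitter_pp_s, bit_rate_bps, max_ui_fraction=0.3):
        """True if peak-to-peak jitter stays below the allowed fraction of one bit period."""
        unit_interval = 1.0 / bit_rate_bps  # duration of one bit in seconds
        return (jitter_pp_s / unit_interval) <= max_ui_fraction

    # Example: 80 ps of peak-to-peak jitter on a 1 Gbit/s link (UI = 1 ns) is 0.08 UI.
    print(within_jitter_budget(jitter_pp_s=80e-12, bit_rate_bps=1e9))  # True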