jitter
A flicker or fluctuation in a transmission signal or display image. The term is used in several ways, but it always refers to some deviation in time or position from the norm. For example, in a network transmission, jitter would be a bit arriving either ahead of or behind a standard clock cycle or, more generally, the variable arrival of packets. In computer graphics, to "jitter a pixel" means to offset it from its normal placement by some small random amount in order to achieve a more natural antialiasing effect.
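For the graphics sense, here is a minimal Python sketch of jittering sample positions inside a pixel; the function name and parameters are illustrative only, not part of any particular renderer:

```python
import random

def jittered_samples(px, py, n=4, seed=0):
    """Return n sample positions inside pixel (px, py), each offset from the
    pixel's corner by a random sub-pixel amount instead of sitting at the
    exact center.  Averaging the colors sampled at these positions turns
    regular aliasing artifacts into less visible noise."""
    rng = random.Random(seed)
    return [(px + rng.random(), py + rng.random()) for _ in range(n)]

# Four jittered sample points for the pixel at (10, 20).
print(jittered_samples(10, 20))
```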

For clock jitter, there are three commonly used metrics: absolute jitter, period jitter, and cycle-to-cycle jitter.

Absolute jitter is the absolute difference between the position of a clock's edge and where it would ideally be if the clock's frequency were perfectly constant. The absolute jitter metric is important in systems where a large number of clock sources must pass data to one another (e.g., SONET).

Period jitter (also called cycle jitter) is the difference between any one clock period and the ideal clock period. Accordingly, it can be thought of as the discrete-time derivative of absolute jitter. Period jitter tends to be important in synchronous circuitry such as digital state machines, where error-free operation is limited by the shortest possible clock period and performance is limited by the average clock period. Hence, synchronous circuitry benefits from minimizing period jitter, so that the shortest clock period approaches the average clock period.
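As an illustration of how these metrics relate (with cycle-to-cycle jitter being the change from one clock period to the next), here is a small Python sketch that computes all three from a list of measured edge timestamps. The edge values and function name are made up purely for the example:

```python
def jitter_metrics(edges, nominal_period):
    """Compute the three clock-jitter metrics from measured rising-edge
    timestamps and the nominal (ideal) clock period."""
    # Absolute jitter: deviation of each edge from where an ideal clock,
    # starting at the same instant with a perfectly constant frequency,
    # would place it.
    absolute = [t - (edges[0] + i * nominal_period) for i, t in enumerate(edges)]

    # Period jitter: each measured period minus the nominal period --
    # the discrete-time derivative of absolute jitter.
    periods = [b - a for a, b in zip(edges, edges[1:])]
    period = [p - nominal_period for p in periods]

    # Cycle-to-cycle jitter: the change from one period to the next.
    cycle_to_cycle = [b - a for a, b in zip(periods, periods[1:])]
    return absolute, period, cycle_to_cycle

# Example: a clock with a nominal 1.0 s period and slightly irregular edges.
edges = [0.00, 1.02, 1.98, 3.01, 4.00]
abs_j, per_j, c2c_j = jitter_metrics(edges, 1.0)
print([round(x, 2) for x in abs_j])   # [0.0, 0.02, -0.02, 0.01, 0.0]
print([round(x, 2) for x in per_j])   # [0.02, -0.04, 0.03, -0.01]
print([round(x, 2) for x in c2c_j])   # [-0.06, 0.07, -0.04]
```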

To really explain what is happening, we need to take a look at what digital audio really is. Analog sound is a wave. I'll assume everyone has seen graphs of sound waves and knows what one looks like. Essentially, a sound wave can be thought of as a signal which OVER TIME can vary continuously in amplitude.

To get a digital representation of this signal, samples are taken at regular intervals along the wave. For CD audio this sampling is done at 44,100 samples a second with 16 bits of resolution. DVD audio ranges from 96,000 to 192,000 samples a second with up to 24 bits of resolution. What this means is that we are taking timed samplings of the original analog sound wave and making a digital approximation of it. There is of course much discussion about how much sampling is needed to make the reconstruction of the original analog sound indistinguishable from the original, but that has nothing to do with jitter.
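As a rough illustration of the sampling step, here is a toy Python sketch that samples a 1 kHz sine tone at CD parameters and quantizes it to 16 bits; the constants and function are illustrative only:

```python
import math

SAMPLE_RATE = 44_100                      # samples per second (CD audio)
BIT_DEPTH = 16                            # bits of resolution per sample
FULL_SCALE = 2 ** (BIT_DEPTH - 1) - 1     # 32767 for signed 16-bit samples

def sample_sine(freq_hz, duration_s):
    """Sample a pure sine tone at CD parameters: evenly spaced sample
    instants, each value quantized to a 16-bit integer."""
    n_samples = int(SAMPLE_RATE * duration_s)
    samples = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE               # the ideal, perfectly regular sample time
        value = math.sin(2 * math.pi * freq_hz * t)
        samples.append(int(round(value * FULL_SCALE)))
    return samples

# One millisecond of a 1 kHz tone: one full cycle spread over ~44 samples.
print(len(sample_sine(1000, 0.001)))      # 44
```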

What a DAC does is take this digital rendering of a sound wave and reassemble the analog sound wave from it. To do this properly it needs an accurate clock so it can reproduce the samples at the correct, evenly spaced intervals.

Think of it like a plotter. The timing is the rate at which the paper moves under the pen. If the paper isn't moving at a constant, known speed, you are not going to reproduce a proper graph: the wave will end up drawn over either a longer or a shorter duration than the original.

This timing instability is what causes jitter. Without a proper clock, the sound wave can't be reproduced with accuracy. This is why all DACs and CD players have a crystal oscillator.
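To make the plotter analogy concrete, here is a rough Python model of my own (not a description of any real DAC) showing how random error in the sample clock pulls the reproduced values away from the original wave; the jitter figure is just an assumed number for illustration:

```python
import math
import random

SAMPLE_RATE = 44_100       # CD sample rate
FREQ = 10_000              # a 10 kHz tone -- high frequencies suffer most
JITTER_RMS = 1e-8          # assumed 10 ns RMS clock jitter, for illustration

rng = random.Random(0)
worst_error = 0.0
for n in range(SAMPLE_RATE // 100):                  # simulate ~10 ms of audio
    ideal_t = n / SAMPLE_RATE                        # when the sample *should* be played
    actual_t = ideal_t + rng.gauss(0, JITTER_RMS)    # when the jittery clock plays it
    ideal_value = math.sin(2 * math.pi * FREQ * ideal_t)
    jittered = math.sin(2 * math.pi * FREQ * actual_t)
    worst_error = max(worst_error, abs(jittered - ideal_value))

# The error grows with both the jitter and the signal frequency
# (roughly 2*pi*f*dt for a small timing error dt).
print(f"worst-case amplitude error: {worst_error:.6f} of full scale")
```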

My VoIP example was a rather extreme illustration of the timing issue. Jitter in VoIP represents the same problem; it just happens for a different reason and on a different scale (large chunks of the wave are transmitted in each packet, so it isn't individual samples that fall out of time sync). Some packets arrive quickly, some slowly, and some don't make it and have to be resent, but essentially the sound can't be reproduced with proper timing accuracy.
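For the packet case, here is a small Python sketch of the kind of smoothed interarrival-jitter estimate RTP stacks use (the formula described in RFC 3550); the timestamps are invented for the example:

```python
def interarrival_jitter(send_times, recv_times):
    """Running, smoothed estimate of packet interarrival jitter in seconds,
    in the style of the RTP formula from RFC 3550."""
    jitter = 0.0
    for i in range(1, len(send_times)):
        # How much the transit delay changed between consecutive packets.
        d = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
        jitter += (abs(d) - jitter) / 16.0
    return jitter

# Packets sent every 20 ms but arriving irregularly (all times in seconds).
send = [0.000, 0.020, 0.040, 0.060, 0.080]
recv = [0.050, 0.072, 0.089, 0.114, 0.130]
print(f"estimated jitter: {interarrival_jitter(send, recv) * 1000:.2f} ms")
```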
