A digital-to-analog converter (DAC) transforms a sequence of samples (digital values) into a sequence of analog voltage levels. Ideally, the time distance between samples should be constant; it is defined by the clock. However, clocking is not ideal, and the restored signal is distorted, like the one shown in the picture.
Jitter is a deviation of the time between samples (a deviation of the sampling rate).
In the upper-left part of the picture, we can see the original musical signal (green) captured in digital form.
In the bottom-left part of the picture, we can see the restoration of the captured digital signal back to analog form.
The samples (vertical lines with dots) have unstable time positions (jitter) on the horizontal axis, and the restored signal is distorted. We can see this in the right part of the picture.
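The effect described above can be sketched in a few lines of code. This is a toy model, not a DAC simulation: the sample rate, test-tone frequency, and (deliberately exaggerated) jitter magnitude are all assumed values chosen for illustration.

```python
import math
import random

random.seed(0)

SAMPLE_RATE = 48_000   # Hz (assumed)
TONE_FREQ = 1_000      # Hz test sine (assumed)
JITTER_RMS = 5e-6      # seconds; exaggerated so the effect is visible
N = 64

period = 1.0 / SAMPLE_RATE

# Ideal sample instants vs. jittered instants of the conversion clock.
ideal_times = [n * period for n in range(N)]
jittered_times = [t + random.gauss(0.0, JITTER_RMS) for t in ideal_times]

# The DAC outputs each stored sample value at the wrong instant; to the
# listener this is equivalent to evaluating the sine at the wrong time.
ideal_out = [math.sin(2 * math.pi * TONE_FREQ * t) for t in ideal_times]
jittered_out = [math.sin(2 * math.pi * TONE_FREQ * t) for t in jittered_times]

errors = [abs(a - b) for a, b in zip(ideal_out, jittered_out)]
print(f"max amplitude error: {max(errors):.6f}")
```

With realistic picosecond-level jitter the errors would be far smaller; the exaggeration only makes the distortion mechanism easy to see.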
Jitter is a non-linear distortion. Theoretically, we can artificially boost the jitter deviation and listen to the increased noise and artifacts.
In real life, jitter always impacts the analog signal at the DAC output, but in modern systems it is very small. In the author's opinion, it is impossible or almost impossible to hear jitter, because jitter distortions compete with quantization noise, non-linear distortions, and the electronic components' own noise. I'm not sure that we can separate real-system jitter noise from this other noise.
Jitter is a sample clock issue (clock deviation).
Jitter appears at the moment when the digital signal, in its analog (electrical) form at the input, is transformed back into a binary sequence.
To transmit music data through a line (cable), the bit sequence is converted to an electrical form. The coding in electrical form may be implemented in different ways (voltage-level values or otherwise).
Here we consider simple amplitude coding:
The line transmitter device converts binary data to electrical levels. The line receiver device converts the electrical levels back into the binary sequence.
The clock reference moments are the points where the voltage level rises above the threshold.
The line transmitter generates a signal close to rectangular.
In the line, the signal "loses" its form due to noise, frequency response, and non-linear distortions.
Thus, when the clock reference moments are detected in the line receiver, an offset from the initial time position (the time deviation in the picture) is probable.
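The time deviation at the receiver can be illustrated with a toy model: a rising edge with a finite rise time plus additive noise crosses the decision threshold at a shifted instant. The edge shape, rise time, and noise level are assumptions for illustration only.

```python
import random

random.seed(1)

THRESHOLD = 0.5      # receiver decision level, normalized (assumed)
RISE_TIME = 10e-9    # seconds for the edge to go 0 -> 1 (assumed)
NOISE_RMS = 0.02     # additive line noise, normalized units (assumed)

def crossing_time(noise: float) -> float:
    """Instant when a linear rising edge plus a noise offset crosses the threshold.

    Edge model: v(t) = t / RISE_TIME for 0 <= t <= RISE_TIME.
    Noise shifts the apparent level, so the crossing time shifts too:
    v(t) + noise = THRESHOLD  =>  t = (THRESHOLD - noise) * RISE_TIME.
    """
    return (THRESHOLD - noise) * RISE_TIME

ideal = crossing_time(0.0)
deviations = [crossing_time(random.gauss(0.0, NOISE_RMS)) - ideal
              for _ in range(1000)]

worst = max(abs(d) for d in deviations)
print(f"ideal crossing: {ideal:.2e} s, worst deviation: {worst:.2e} s")
```

The sketch shows why a slower edge (longer rise time) or a noisier line produces a larger timing deviation of the recovered clock.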
Below we will consider where jitter penetrates the audio system.
Jitter is a non-linear distortion. To measure jitter, we need to feed a pure sine wave to the input of the system under test.
At the system output, we check for artifacts (harmonics) and noise.
Jitter products (artifacts and noise) depend on the input signal. To check the dependency, we can take the spectrum for different input signal levels and frequencies.
The picture does not show a real spectrum; it is only an illustration for better understanding.
We can measure:
Measurements #2 and #3 allow separating the errors of the different system parts.
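The sine-test idea can be sketched numerically: a jittered sine spreads energy out of its spectral bin into artifacts and noise, which a spectrum measurement can sum up. The DFT size, tone frequency, and jitter level below are assumptions chosen so the tone sits exactly on one bin.

```python
import cmath
import math
import random

random.seed(2)

N = 256
SAMPLE_RATE = 48_000
BIN = 16                              # put the tone exactly on a DFT bin
TONE_FREQ = BIN * SAMPLE_RATE / N     # 3000 Hz
JITTER_RMS = 2e-6                     # seconds; exaggerated for visibility

period = 1.0 / SAMPLE_RATE

def dft_magnitudes(x):
    """Naive DFT, normalized magnitudes of the first N/2 bins."""
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / len(x))
                    for n in range(len(x)))) / len(x)
            for k in range(len(x) // 2)]

clean = [math.sin(2 * math.pi * TONE_FREQ * n * period) for n in range(N)]
jittered = [math.sin(2 * math.pi * TONE_FREQ *
                     (n * period + random.gauss(0.0, JITTER_RMS)))
            for n in range(N)]

def noise_floor(mags):
    # Everything except the tone bin counts as artifacts/noise.
    return sum(m for k, m in enumerate(mags) if k != BIN)

print(f"clean residue:    {noise_floor(dft_magnitudes(clean)):.3e}")
print(f"jittered residue: {noise_floor(dft_magnitudes(jittered)):.3e}")
```

Repeating this for different input levels and frequencies is exactly the dependency check described above.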
Jitter is a deviation of the time between samples. Delay is a shift of the whole signal waveform that keeps the time distance between samples.
Latency is the delay of processing inside an audio device (read below about the FIFO buffer). Delay is constant.
Above, we considered clock deviation as an effect of distortions of the digital signal.
Several factors impact clock deviation:
The clock generator contains electronic elements that define its frequency and the frequency's deviations.
Changes in the DC supply voltage can cause frequency deviations in the clock generator.
A power supply unit can also inject noise into the electrical lines. This noise can modulate the generated clock pulses.
The modified clock signal causes the time deviation (see the picture).
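As a sketch of this modulation mechanism: suppose power-supply ripple shifts each clock edge proportionally to the instantaneous ripple voltage. The clock frequency, ripple frequency, ripple amplitude, and volts-to-seconds sensitivity below are all assumed numbers for illustration.

```python
import math

CLOCK_FREQ = 24_576_000    # Hz, a common audio master clock (assumed)
RIPPLE_FREQ = 100          # Hz, rectifier ripple from the PSU (assumed)
SENSITIVITY = 50e-12       # seconds of edge shift per volt of ripple (assumed)
RIPPLE_AMPLITUDE = 0.01    # volts of ripple reaching the oscillator (assumed)

period = 1.0 / CLOCK_FREQ

def edge_time(n: int) -> float:
    """Edge n of the modulated clock: nominal instant plus a ripple-driven shift."""
    nominal = n * period
    shift = SENSITIVITY * RIPPLE_AMPLITUDE * math.sin(
        2 * math.pi * RIPPLE_FREQ * nominal)
    return nominal + shift

# Sample every 1000th edge over about 20 ms (two ripple cycles).
deviations = [edge_time(n) - n * period for n in range(0, 500_000, 1000)]
peak = max(abs(d) for d in deviations)
print(f"peak edge deviation: {peak:.2e} s")
```

The edge deviation follows the ripple waveform, which is exactly a periodic (deterministic) jitter component riding on the clock.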
Noise that impacts the digital audio signal line penetrates from electrical circuits and through the air.
Unfortunately, jitter is not purely a DAC issue. When we use recorded material, we already have a jittered sequence of digital samples.
To fully understand the jitter issue, we need to study the whole recording-playback system.
Music capturing is a periodic measurement of the analog signal. If the periods vary, this causes time distortions in the recorded audio samples.
In the upper-left part of the picture, a signal is captured without jitter.
In the lower-left part of the picture, a signal is captured with jitter.
We can see that samples for both these cases have different values.
I.e., the signals captured with and without jitter produce different analog waveforms after restoration to analog form (playback).
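On the capture side the same mechanism applies, but with a crucial difference: the wrong values are what gets stored. A toy sketch (sample rate, tone, and exaggerated ADC aperture jitter are assumed values):

```python
import math
import random

random.seed(3)

SAMPLE_RATE = 44_100   # Hz (assumed)
TONE_FREQ = 5_000      # Hz (assumed)
JITTER_RMS = 1e-6      # seconds; exaggerated aperture jitter

period = 1.0 / SAMPLE_RATE

def signal(t: float) -> float:
    return math.sin(2 * math.pi * TONE_FREQ * t)

clean_capture = [signal(n * period) for n in range(100)]
jittered_capture = [signal(n * period + random.gauss(0.0, JITTER_RMS))
                    for n in range(100)]

# The stored sample values already differ, so no later processing can
# recover the original waveform: the timing error is baked into the data.
diff = max(abs(a - b) for a, b in zip(clean_capture, jittered_capture))
print(f"largest stored-value difference: {diff:.6f}")
```

Because the jitter realization is random and unknown, the difference cannot be reconstructed or subtracted afterwards, which is the point made in the next paragraph.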
If music was recorded with jitter errors, we can't compensate for them later, because jitter is random.
As a rule, recording studios use professional equipment, including dedicated clock sources (for instance, Word Clock). Therefore, studio engineers try to minimize losses due to bad clocking.
Let's look at point #2 (the digital signal passes through the digital audio system). Once the signal is placed into the digital domain (the signal is in digital form), jitter does not matter, because the time positions of samples in binary form are pure mathematical values with infinite precision.
Any delays in processing between samples do not matter for the restored signal.
However, processing delays in the digital domain can cause real-time interruptions of the data stream that feeds the DAC. But that is not a jitter issue. Read the details below.
Now let's look at point #3 (the signal comes to the DAC from the purely digital part of the audio system).
Noise that causes jitter penetrates into the digital signal in several ways, from:
However, FIFO buffers provide "jitter isolation" between any "jittered" segments.
A FIFO ("first in, first out") is a kind of buffer (an array of samples) from which samples leave in the order they arrived.
In this article, the putting and getting of samples to/from the buffer are asynchronous: the clocks for buffer writing and reading are independent.
A FIFO buffer causes a time delay between sample input and output (latency). Latency is measured in seconds, milliseconds, or microseconds.
Short buffers cause a small delay. Short latency is important for real-time audio systems used for live music performance and production. For home audio, a big latency value almost doesn't matter, except for real-time sound adjustment.
Jitter before the FIFO buffer doesn't impact the clock after the FIFO, because the buffer is asynchronous.
Therefore, for the scheme considered in the picture above, there is no reason to worry about jitter before the FIFO inside the DAC.
For an audio system, before suggesting effective jitter suppression, studying the system's scheme is recommended.
In most cases, an asynchronous FIFO buffer allows fixing interruptions in a binary data stream. Of course, significant interruptions can't be fixed by the FIFO. The unfixed interruptions cause pauses, clicks, or other audible damage. This happens when the buffer is empty and no binary data is ready for conversion to analog form.
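The FIFO behavior described above can be sketched as a toy class. This is not a hardware model: the class name, capacity, and the burst/drain scenario are illustrative assumptions. The key point it demonstrates is that output timing belongs entirely to the reader's clock, while an empty buffer produces an underrun (an audible interruption), not jitter.

```python
from collections import deque

class AsyncFifo:
    """Toy FIFO between a jittered writer and a stable reader clock.

    Timing irregularity on the write side only changes the momentary
    fill level; the read side outputs on its own steady clock.
    """

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.buf = deque()
        self.underruns = 0

    def write(self, sample: float) -> bool:
        if len(self.buf) >= self.capacity:
            return False            # overflow: writer faster than reader
        self.buf.append(sample)
        return True

    def read(self) -> float:
        if not self.buf:
            self.underruns += 1     # interruption the FIFO cannot hide
            return 0.0              # silence instead of a valid sample
        return self.buf.popleft()

fifo = AsyncFifo(capacity=8)
for s in range(4):                  # writer bursts 4 samples, irregularly timed
    fifo.write(float(s))
out = [fifo.read() for _ in range(6)]   # reader ticks steadily: drains, then underruns
print(out, "underruns:", fifo.underruns)
```

The latency of such a buffer is simply its average fill level divided by the sample rate; a larger buffer tolerates longer write-side interruptions at the cost of more delay.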
DAC clock synchronization has 3 options:
Below we consider cases for DAC with internal FIFO of input audio data.
Synchronization by the digital interface has these jitter sources:
Synchronization by the DAC's internal clock generator has these primary jitter sources:
Synchronization by an external dedicated clock generator has these jitter sources:
For home applications, I'd recommend using the DAC's internal clock generator as the simplest option. It may be chosen in the DAC's settings.
However, it does not guarantee the best result. Each setup should be analyzed (measured) to choose the minimal jitter configuration.
Jitter noise is noise that results from the time instability of the clock (ADC or DAC clock). Read details...
USB can't cause jitter in transmitted audio data. Audio data acquires jitter only at the border between the analog and digital domains. Read details...