Sound quality of digital coaxial vs. optical
"Sander deWaal" wrote in message
...
"mc" said:
Indeed, because of the elimination of EMI with optical, if anything, Sony
should be recommending the use of the optical input for critical
adjustments.
I was wondering the same thing. If it's digital, why isn't it absolutely
bit-for-bit identical both ways? The optical input would be immune to
electromagnetic noise, and that should be the only difference. Normally
they should be indistinguishable, because electromagnetic noise strong
enough to disrupt a digital signal is rare.
A common mistake.
The S/PDIF signal is analog in nature.
Just as with RF signals, an incorrect termination might cause
reflections (a "bad SWR") which, in turn, are said to cause jitter.
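For a sense of scale, here is a minimal Python sketch of that termination
point. The load impedances below are my own illustrative examples of
mis-termination, not values from this thread; 75 ohm is the nominal S/PDIF
impedance.

# Reflection from an impedance mismatch on an S/PDIF coax run.
# Part of each edge reflects off the load and arrives back superimposed
# on later transitions, smearing them.

def reflection_coefficient(z_load, z0=75.0):
    """Voltage reflection coefficient at the load."""
    return (z_load - z0) / (z_load + z0)

def swr(gamma):
    """Standing wave ratio from the reflection magnitude."""
    g = abs(gamma)
    return (1 + g) / (1 - g)

# 50 ohm (video/RF cable) and 110 ohm (AES/EBU gear) are plausible
# accidental mismatches; 75 ohm is the correct termination.
for z_load in (75.0, 50.0, 110.0):
    gamma = reflection_coefficient(z_load)
    print(f"Z_load = {z_load:5.1f} ohm: gamma = {gamma:+.3f}, "
          f"SWR = {swr(gamma):.2f}")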
The optical S/PDIF connection (TOSLINK) is an early method, using a crude
plastic fiber that actually has quite limited bandwidth. The resulting
fuzziness of the transitions creates more timing uncertainty for the input
receiver chip. In a typical input receiver, built around a single
phase-locked loop, the additional uncertainty causes additional jitter, on
top of the jitter inherent in recovering the clock from the biphase-mark
encoding scheme.
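To see why slow edges turn amplitude noise into timing noise, here is a
rough Python sketch. The rise times, signal swing, and noise level are my
own assumed numbers, chosen only to illustrate the mechanism: at a fixed
comparator threshold, timing error is roughly voltage noise divided by
edge slew rate, so a slower band-limited edge yields proportionally more
jitter.

# Rough slew-rate-to-jitter conversion at a receiver's input comparator:
# sigma_t ~= sigma_v / (dV/dt). A slower (band-limited) edge gives the
# same voltage noise more leverage over the threshold-crossing time.

rise_time_coax_ns = 5.0      # hypothetical fast coax edge (10-90%)
rise_time_toslink_ns = 25.0  # hypothetical slow plastic-fiber edge
swing_v = 0.5                # received signal swing, volts (assumed)
noise_rms_mv = 10.0          # RMS noise at the comparator input (assumed)

for name, tr in (("coax", rise_time_coax_ns),
                 ("TOSLINK", rise_time_toslink_ns)):
    slew_v_per_ns = 0.8 * swing_v / tr            # approx. 10-90% slew rate
    jitter_ps = (noise_rms_mv / 1000.0) / slew_v_per_ns * 1000.0
    print(f"{name}: slew {slew_v_per_ns * 1000:.0f} mV/ns -> "
          f"~{jitter_ps:.0f} ps RMS threshold jitter")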
Because the plastic fiber is a large-diameter multimode type, the optical
path length is actually sensitive to distortion of the fiber by mechanical
vibration. No such artifact occurs with coaxial cable, which is modeless
at the frequencies under consideration.
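As a back-of-envelope check (my own numbers, for scale only): a
path-length change dL in a fiber of refractive index n shifts the arrival
time by dt = n * dL / c.

# Delay modulation from a mechanically induced path-length change.
# Assumed values: n ~ 1.49 for a PMMA plastic-fiber core, and a
# hypothetical 1 mm of effective path-length modulation.

C = 299_792_458.0        # speed of light in vacuum, m/s
n_pmma = 1.49            # refractive index of PMMA core (approx.)
delta_len_m = 1e-3       # assumed 1 mm effective path-length change

delta_t_s = n_pmma * delta_len_m / C
print(f"1 mm path change -> {delta_t_s * 1e12:.1f} ps of delay shift")
# ~5 ps; whether that matters is exactly what this thread is arguing about.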
Jitter doesn't have to be a problem per se.
When the incoming signal in e.g. a DAC is reclocked, the jitter must be
very extreme to have any effect at all.
Yes, but reclocking is still not done as a matter of course.
If that extreme is reached (not likely), the result will be silence,
not degraded audio.
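A minimal sketch of the reclocking idea (a toy model of my own, not any
product's design): samples are written into a FIFO at the jittery incoming
rate and read out on a clean local clock. Incoming jitter only wiggles the
FIFO fill level, so it vanishes from the output timing; only a sustained
rate mismatch can drain the buffer, at which point a real receiver would
mute rather than output degraded audio.

# Toy FIFO reclocker: jittery writes, clean reads. Output ticks are set
# entirely by the local clock. Only a rate mismatch can cause underrun.
import bisect
import random

random.seed(0)
N = 100_000
PERIOD = 1.0          # nominal sample period, arbitrary units
JITTER = 0.4          # deliberately huge +/-40% phase jitter (assumed)
PRIME = 32            # reads start once 32 samples are buffered (assumed)
READ_PERIOD = PERIOD  # set to 0.999 * PERIOD to watch the mute path fire

# Jittery but rate-correct arrival times (bounded phase wander).
writes = [(i + 1) * PERIOD + random.uniform(-JITTER, JITTER)
          for i in range(N)]

read_start = writes[PRIME - 1]
fills = []
for k in range(1, N - PRIME):
    read_time = read_start + k * READ_PERIOD   # ideally regular output tick
    arrived = bisect.bisect_right(writes, read_time)
    fill = arrived - k
    if fill <= 0:
        print(f"underrun at read {k}: receiver mutes "
              "(silence, not degraded audio)")
        break
    fills.append(fill)
else:
    print(f"no underrun: fill level stayed within {min(fills)}..{max(fills)}")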
Yes, and with any input receiver consisting of a single PLL, the designer
must choose a time constant that is a compromise between low jitter and
the possible failure to lock.
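To make that compromise concrete, here is a first-order sketch (my own toy
model; the corner frequencies and the 10 kHz jitter tone are assumptions).
The PLL's jitter transfer is a low-pass filter, so a narrower loop
bandwidth attenuates incoming jitter more strongly but lengthens the loop
time constant, which is what makes acquisition and lock-holding harder.

# The single-PLL compromise in numbers: jitter rejection vs. loop speed.
import math

JITTER_FREQ_HZ = 10_000.0   # hypothetical jitter component on the input

for loop_bw_hz in (100.0, 1_000.0, 10_000.0):
    # First-order jitter transfer magnitude |H| = 1/sqrt(1 + (f/fc)^2).
    h = 1.0 / math.sqrt(1.0 + (JITTER_FREQ_HZ / loop_bw_hz) ** 2)
    atten_db = 20.0 * math.log10(h)
    tau_ms = 1e3 / (2.0 * math.pi * loop_bw_hz)  # ~one loop time constant
    print(f"loop BW {loop_bw_hz:7.0f} Hz: 10 kHz jitter {atten_db:6.1f} dB, "
          f"time constant ~{tau_ms:.3f} ms")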