Sound quality of digital coaxial vs. optical
mc wrote:
"Arny Krueger" wrote in message
...
In the service manuals for certain digital preamps, Sony
instructs that the adjustment for harmonic distortion
must be performed with a signal source delivered by the
coaxial input. This shows that in some cases the digital
input receiver can measurably suffer when fed from an
optical connection.
No excuse for this kind of flaw at all.
Indeed, because of the elimination of EMI with optical, if anything Sony
should be recommending the use of the optical input for critical
adjustments.
I was wondering the same thing. If it's digital, why isn't it absolutely
bit-for-bit identical both ways? The optical input would be immune to
electromagnetic noise, and that should be the only difference. Normally
they should be indistinguishable because electromagnetic noise strong enough
to disrupt a digital signal is rare.
And if the digital signal *does* get disrupted, the effect is sudden, huge, and
very evident.
There's no 'gradual degradation' with digital signals, like losing some HF or
adding a bit of hum. Either it works or it doesn't. So comparisons of sound
quality between the two are *almost* entirely bogus from first principles.
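
If anyone wants to test the 'bit-for-bit identical' claim directly, a quick-and-dirty
comparison of two captures is easy enough. This is only a sketch: the file names below
are made up, and you'd have to time-align the two captures to the same starting sample
before comparing.

import wave
import numpy as np

def read_samples(path):
    # Load all frames of a WAV capture as raw bytes.
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

# Hypothetical captures of the same source via the coaxial and optical inputs.
coax = np.frombuffer(read_samples("capture_coax.wav"), dtype=np.uint8)
opt  = np.frombuffer(read_samples("capture_optical.wav"), dtype=np.uint8)

n = min(len(coax), len(opt))
diffs = np.count_nonzero(coax[:n] != opt[:n])
print(f"Compared {n} bytes, {diffs} differ")

If that count comes back zero, then whatever difference people claim to hear isn't
in the bits themselves.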
The exception, as I understand it, is jitter on the digital signal. If the clock
recovery is poorly implemented, I believe this can degrade the audio a bit.
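
For what it's worth, here's a rough back-of-the-envelope sketch of how clock jitter
turns into an amplitude error. The 1 kHz tone and the 5 ns RMS jitter figure are my
own assumptions for illustration, not measurements of any particular receiver.

import numpy as np

fs = 48_000          # sample rate, Hz
f = 1_000.0          # test tone frequency, Hz
n = fs               # one second of samples
rms_jitter = 5e-9    # assumed RMS clock jitter, seconds (illustrative value)

t = np.arange(n) / fs
ideal = np.sin(2 * np.pi * f * t)

# Perturb each sample instant by Gaussian timing jitter and evaluate the sine there,
# as if the DAC clock edges were wandering around their ideal positions.
jittered_t = t + np.random.normal(0.0, rms_jitter, size=n)
jittered = np.sin(2 * np.pi * f * jittered_t)

error_rms = np.sqrt(np.mean((jittered - ideal) ** 2))
snr_db = 20 * np.log10(np.sqrt(0.5) / error_rms)   # RMS of a unit sine is 1/sqrt(2)
print(f"Jitter-induced SNR: {snr_db:.1f} dB")

With a few nanoseconds of jitter the error sits roughly 90 dB below the signal, i.e.
around the 16-bit noise floor, so the clock recovery has to be pretty sloppy before
it degrades anything audibly.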
Graham