Posted to rec.audio.tech,rec.audio.opinion
Arny Krueger
Default Sound quality of digital coaxial vs. optical


"mc" wrote in message
. ..

"Arny Krueger" wrote in message
...

In the service manuals for certain digital preamps, Sony
instructs that the adjustment for harmonic distortion
must be performed with a signal source delivered by the
coaxial input. This shows that in some cases, the digital
input receiver can suffer with an optical connection, in
a measurable way.


No excuse for this kind of flaw at all.

Indeed. Since optical eliminates EMI, if anything Sony should be recommending
the optical input for critical adjustments.


I was wondering the same thing. If it's digital, why isn't it absolutely
bit-for-bit identical both ways?


Every digital signal arrives at the receiver as an analog waveform. Recovering
the bits from that waveform is where difficulties can arise. An ideal digital
receiver is immune to noise and timing problems on its input signals, but
nothing's perfect.
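To illustrate the point, here is a minimal sketch (my own toy model, not any
actual S/PDIF receiver design) of how a receiver recovers bits by comparing
the incoming analog level against a decision threshold. With small noise the
bits come back intact; once the noise approaches the signal amplitude, slicing
errors appear. The amplitude and noise figures are arbitrary for illustration.

```python
import random

def slice_bits(levels, threshold=0.0):
    """Recover bits by comparing each received analog level to a threshold."""
    return [1 if v > threshold else 0 for v in levels]

def simulate(bits, amplitude, noise_sigma, seed=1):
    """Send bits as +/-amplitude levels, add Gaussian noise, re-slice,
    and count how many bits were flipped by the noise."""
    rng = random.Random(seed)
    received = [(amplitude if b else -amplitude) + rng.gauss(0, noise_sigma)
                for b in bits]
    recovered = slice_bits(received)
    return sum(a != b for a, b in zip(bits, recovered))

bits = [random.Random(0).randrange(2) for _ in range(10_000)]
print(simulate(bits, amplitude=0.5, noise_sigma=0.05))  # small noise: no errors
print(simulate(bits, amplitude=0.5, noise_sigma=0.5))   # noise ~ signal: many errors
```

The same idea applies to timing: noise that shifts the instant a waveform
crosses the threshold becomes jitter, which is why a non-ideal receiver can
measure differently on two electrically different inputs.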

The optical input would be immune to electromagnetic noise, and that
should be the only difference.


Agreed. In fact, any grounding problems that exist can be exaggerated by
common kinds of tests that are done on power amps.

Normally they should be indistinguishable because electromagnetic noise
strong enough to disrupt a digital signal is rare.


The digital signal in question is not as robust as it might be. It's in the
1-2 volt peak-to-peak range. Really bad grounding problems can add noise in
the same voltage range.
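As a rough back-of-the-envelope check (my own arithmetic, assuming a simple
slicer with its threshold at the signal midpoint): the noise margin is about
half the peak-to-peak swing, so ground noise approaching that margin can push
individual samples across the threshold.

```python
def noise_margin(v_pp):
    """For a symmetric signal sliced at its midpoint, the margin to the
    decision threshold is half the peak-to-peak amplitude."""
    return v_pp / 2.0

# For the 1-2 V p-p range mentioned above, the margin is only 0.5-1.0 V,
# which is within reach of a bad ground loop.
print(noise_margin(1.0))  # 0.5
print(noise_margin(2.0))  # 1.0
```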