Posted to rec.audio.tech,rec.audio.opinion
AZ Nomad
Sound quality of digital coaxial vs. optical

On Tue, 10 Jan 2006 18:48:01 +0100, Sander deWaal wrote:


"mc" said:


Indeed, because of the elimination of EMI with optical, if anything Sony
should be recommending the use of the optical input for critical
adjustments.



>>I was wondering the same thing. If it's digital, why isn't it absolutely
>>bit-for-bit identical both ways? The optical input would be immune to
>>electromagnetic noise, and that should be the only difference. Normally
>>they should be indistinguishable, because electromagnetic noise strong
>>enough to disrupt a digital signal is rare.



>A common mistake.
>The S/PDIF signal is analog in nature.

By that definition, there are no digital signals. All signals are
analog in nature. What makes a signal "digital" is that it can be in
only a very limited number of states: for example, a positive voltage
being a 1 and a negative voltage being a 0. It doesn't matter that the
signal is analog, or that an infinite number of voltage values all
translate to the same 1.
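To make that concrete, here's a minimal sketch (illustrative thresholds and sample values, not actual S/PDIF decoding): a "digital" receiver just slices the analog waveform against a threshold, so wildly different voltages decode to identical bits.

```python
# Sketch: a digital receiver slices an analog waveform against a threshold.
# Infinitely many voltage values map to the same bit.
def slice_bits(voltages, threshold=0.0):
    """Decide each sample: above threshold -> 1, at or below -> 0."""
    return [1 if v > threshold else 0 for v in voltages]

# Noisy analog samples: very different voltages, identical decoded bits.
noisy = [0.9, 1.3, -0.7, 0.4, -1.1, -0.2]
clean = [1.0, 1.0, -1.0, 1.0, -1.0, -1.0]

print(slice_bits(noisy))                       # [1, 1, 0, 1, 0, 0]
print(slice_bits(noisy) == slice_bits(clean))  # True
```

As long as the noise never pushes a sample across the threshold, the recovered bitstream is exactly the same.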


>Just as with RF signals, an incorrect termination might cause
>reflections ("a bad SWR") which, in turn, are said to cause jitter.


>Jitter doesn't have to be a problem per se.
>When the incoming signal in, e.g., a DAC is reclocked, the
>jitter must be very extreme to have any effect at all.
>If that extreme is reached (not likely), the result will be silence,
>not degraded audio.
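A toy simulation of why reclocking tolerates modest jitter (illustrative numbers, not S/PDIF-exact timing): a reclocking receiver samples each bit cell near its center, so the bits come out identical as long as the timing error stays under half a bit period.

```python
import random

# Sketch: sample each bit at its nominal center plus a random timing error.
# With jitter amplitude below half a bit period, every sample still lands
# in the correct bit cell, so the decoded stream is bit-for-bit identical.
random.seed(0)

BIT_PERIOD = 1.0
bits = [1, 0, 1, 1, 0, 0, 1, 0]

def sample_with_jitter(bits, jitter_amplitude):
    """Decode by sampling each bit cell at its jittered center time."""
    decoded = []
    for i in range(len(bits)):
        t = (i + 0.5) * BIT_PERIOD + random.uniform(-jitter_amplitude,
                                                    jitter_amplitude)
        cell = int(t // BIT_PERIOD)              # which bit cell we landed in
        cell = min(max(cell, 0), len(bits) - 1)  # clamp at stream edges
        decoded.append(bits[cell])
    return decoded

# Jitter well under half a bit period: identical output, every time.
print(sample_with_jitter(bits, 0.3) == bits)  # True
```

Only when the jitter grows large enough to push samples into the wrong cell does decoding break, and at that point the receiver typically loses lock entirely rather than producing subtly "worse" audio.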