Thread: D/A Converters
Stewart Pinkerton
 

On 2 Jul 2004 03:36:33 GMT, (Marcus) wrote:

> Oversampling, often combined with noise shaping, attenuates quantization
> noise. It does not improve linearity.


It improves linearity in the sense that a 'raw' 1-bit converter which
is only 20% linear between its two states can produce an output which
is linear to better than 20 bits (about 0.0001%), given a sufficient
oversampling rate. While you are semantically correct, the practical
result is somewhat different.
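To illustrate the point with a minimal sketch (an idealized first-order loop, not any particular chip): a 1-bit delta-sigma modulator has only two output levels, yet plain averaging of its bitstream recovers the input to a precision that improves roughly as 1/n with the number of oversampled bits:

```python
def dsm1(x, n):
    """First-order 1-bit delta-sigma modulator (idealized sketch).

    The quantizer has only two output levels (+1/-1), yet the running
    average of the bitstream tracks x with error on the order of 1/n,
    because the loop integrates and feeds back the quantization error.
    """
    s = 0.0                           # integrator state
    bits = []
    for _ in range(n):
        s += x                        # accumulate the input
        y = 1.0 if s >= 0 else -1.0   # crude 1-bit quantizer
        s -= y                        # feedback removes the quantized output
        bits.append(y)
    return bits

bits = dsm1(0.3, 100_000)
avg = sum(bits) / len(bits)           # decimation by plain averaging
# avg sits within about 1/n of 0.3, far finer than the 2-level quantizer
```

Since a 1-bit quantizer has only two codes, any pair of levels lies on a straight line, which is the sense in which the core converter is inherently linear.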

> If the core DAC is one-bit in resolution, it is infinitely linear,
> ignoring minor secondary effects.


As noted above, that rather depends on how you define linearity, and
in practice jitter is the dominant factor determining the linearity of
a 1-bit DAC.

> But it would take too much oversampling
> to achieve 20- to 24-bit resolution.


Obviously not true, since the majority of available '24/192' DACs
*are* 1-bit designs.
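For what it's worth, the textbook figure backs this up. The standard ideal expression for the in-band SQNR of an oversampled, noise-shaped converter grows quickly with loop order and oversampling ratio, even for a 1-bit quantizer. A quick back-of-the-envelope (ideal formula only; real modulators give up a fair number of dB to stability constraints):

```python
import math

def ideal_sqnr_db(n_bits, order, osr):
    """Ideal in-band SQNR (dB) of an oversampled, noise-shaped converter
    (standard textbook expression; real modulators fall short of this
    because high-order loops must trade gain for stability)."""
    return (6.02 * n_bits + 1.76
            + 10 * math.log10((2 * order + 1) / math.pi ** (2 * order))
            + 10 * (2 * order + 1) * math.log10(osr))

# 1-bit quantizer, 3rd-order loop, 64x oversampling:
print(ideal_sqnr_db(1, 3, 64))   # roughly 113 dB, i.e. ~18 bits ENOB
```

So a 1-bit core with a high-order loop and a realistic oversampling ratio is comfortably in '20-bit' territory on paper, which is why '24/192' parts can be 1-bit designs.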

> If a multi-bit DAC is the core, a practical oversampling ratio is enough.


What is that supposed to mean? The first 16-bit players used no
oversampling, while the best current multi-bit converters are 20-bit
designs with 8x oversampling. Both are 'practical'.

> But the linearity of the core DAC becomes an issue. Usually the core DAC
> is built with unit elements (still not perfectly matched), and the
> elements are used randomly (hence scrambling, etc.). It effectively
> converts nonlinearity (distortion) into noise.
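A toy illustration of that distortion-to-noise conversion (the 8-element DAC and 1% mismatch here are made-up numbers, not any real part's figures):

```python
import random

random.seed(1)
# Hypothetical 8-element unit DAC with ~1% random element mismatch.
elements = [1.0 + random.uniform(-0.01, 0.01) for _ in range(8)]

def dac_fixed(code):
    """Always switch on elements 0..code-1: the mismatch error is a fixed
    function of the code, so it shows up as harmonic distortion."""
    return sum(elements[:code])

def dac_scrambled(code):
    """Switch on a random set of 'code' elements: the same mismatch energy
    is decorrelated from the signal and appears as noise instead."""
    return sum(random.sample(elements, code))

# The fixed DAC repeats the same wrong value for a given code forever;
# the scrambled DAC averages out to code * (mean element value).
avg = sum(dac_scrambled(5) for _ in range(50_000)) / 50_000
ideal = 5 * sum(elements) / len(elements)
# avg converges toward 'ideal'; the residual error is noise, not distortion.
```

Total error energy is unchanged; scrambling just moves it from signal-correlated distortion products into a benign noise floor.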


As you noted yourself, the linearity of the core DAC is *only* an
issue with multi-bit DACs - one reason why they are now very rare.

--

Stewart Pinkerton | Music is Art - Audio is Engineering