Chris Hornbeck
 
Subject: 16 bit vs 24 bit, 44.1 kHz vs 48 kHz <-- please explain

On 18 Nov 2003 04:32:34 GMT, (Garthrr) wrote:

>In article , "Arny Krueger"
>writes:
>
>>24 bits also adds resolution in any region between -144 dB and full scale.
>
>For me, with my limited understanding (or misunderstanding, perhaps) of digital
>theory, the sentence above cuts to the heart of the matter. If I understand
>what Scott Dorsey and others have said, then the change from 16 to 24 bits only
>adds downward dynamic range and does not increase resolution of signals in the
>relatively high ranges close to full scale. Maybe I misunderstand, but that's
>what they seem to be saying. On an intuitive level that seems wrong to me, and
>it seems as though resolution even at -10 dB should increase (due to less
>quantization error?). I infer that that's what Arny is saying in the quote
>above. Do I have that right? If so, is this the crux of the discussion?
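The -10 dB question can be checked numerically. A minimal sketch (assuming NumPy, a 997 Hz test tone, and a full-scale range of [-1, 1); the levels and the `quantize` helper are just illustrative choices) quantizes the same tone to 16-bit and 24-bit grids and measures the error at each level:

```python
import numpy as np

def quantize(x, bits):
    """Round x (full scale = [-1, 1)) to the nearest step of a signed n-bit grid."""
    step = 2.0 / (2 ** bits)           # one LSB
    return np.round(x / step) * step

fs = 48000
t = np.arange(fs) / fs                 # one second of samples
for level_db in (-1.0, -10.0, -60.0):  # tones well above the 16-bit noise floor
    sig = 10 ** (level_db / 20) * np.sin(2 * np.pi * 997 * t)
    for bits in (16, 24):
        err = quantize(sig, bits) - sig
        rms_db = 20 * np.log10(np.sqrt(np.mean(err ** 2)))
        print(f"{level_db:6.1f} dBFS tone, {bits} bits: RMS error {rms_db:7.1f} dBFS")
```

The step size of the grid, and hence the error, is the same near full scale as at -10 dB, so the eight extra bits shrink the error by roughly the same factor at every level above the noise floor — which is the sense in which resolution increases everywhere, not only at the bottom of the range.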


Greater word length reduces the ambiguity inherent in quantization.
Some call this ambiguity "noise" and say that it is equivalent in
effect to dither. This is closest to being true for large and slow
signals.
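That qualifier matters: for a small signal the undithered "ambiguity" is not noise at all but harmonic distortion, correlated with the signal, and only actual dither turns it into noise. A rough sketch of this (assuming NumPy and a 16-bit grid; the 100 Hz tone at 1.5 LSB, the RNG seed, and standard TPDF dither are all illustrative assumptions, not anything from the thread):

```python
import numpy as np

fs = 48000
step = 2.0 / 2 ** 16                   # one 16-bit LSB for a full-scale range of [-1, 1)
t = np.arange(fs) / fs                 # one second: exactly 100 cycles of a 100 Hz tone
sig = 1.5 * step * np.sin(2 * np.pi * 100 * t)   # a very small, slow signal

def quantize(x):
    return np.round(x / step) * step

# TPDF dither: the sum of two uniform variables, each +/- half an LSB
rng = np.random.default_rng(1)
dither = (rng.uniform(-step / 2, step / 2, t.size)
          + rng.uniform(-step / 2, step / 2, t.size))

plain = quantize(sig)                  # undithered: error tracks the signal (distortion)
dithered = quantize(sig + dither)      # dithered: error becomes signal-independent noise

# Compare the error spectra at the 3rd harmonic of the tone (FFT bin 300).
spec_plain = np.abs(np.fft.rfft(plain - sig))
spec_dith = np.abs(np.fft.rfft(dithered - sig))
print("3rd-harmonic error, undithered:", spec_plain[300])
print("3rd-harmonic error, dithered  :", spec_dith[300])
```

Without dither the quantizer turns the tiny sine into a stepped wave with strong odd harmonics; with TPDF dither the same bin holds only broadband noise. So the "equivalent in effect to dither" claim holds only for signals large and busy enough to exercise many steps.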

I would say your point is exactly the crux of the discussion, as
is Jay Frigoletto's leading question about the 1-bit system.

Thanks,

Chris Hornbeck
new email address

"That is my theory, and what it is too."
Anne Elk