Thread: CD vs. SACD
Stewart Pinkerton
 
On 16 Apr 2005 16:47:58 -0700, wrote:


>>> Why just highbit digital signals, Cal?


>> Isn't it true that increasing the bitrate doesn't necessarily increase
>> precision, particularly if the increased bitrate comes from an increased
>> sample rate and the signal being digitized is band-limited?

> If the data were synchronous that would be so, but by definition
> analog is async. Bicycle spokes and popsicle sticks and all that.


What the hell are you drivelling about? Synchronicity has nothing to
do with it. Increasing the bitrate simply does *not* increase
precision, given that the input bandwidth is limited to less than half
the base sampling rate. The sampling theorem guarantees that such a
band-limited signal is captured *completely* at the base rate; faster
sampling merely adds redundant samples, not information.
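
Here's a quick numerical sketch of that point, for anyone who wants to
check it rather than argue about popsicle sticks. It assumes Python
with NumPy, and the block length and test-tone bins below are arbitrary
choices of mine: a signal band-limited well below 22.05 kHz and sampled
at 44.1 kHz is reconstructed onto a 4x denser grid to machine
precision, so the extra samples of a higher rate add nothing.

import numpy as np

fs = 44100                       # base sample rate
n = 4096                         # samples in one analysis block
t = np.arange(n) / fs

# Band-limited test signal: tones far below the 22.05 kHz Nyquist limit,
# placed on exact FFT bins so the block is periodic and FFT resampling
# is mathematically exact (no window/edge effects to argue about).
bins = [100, 700, 1500]                        # arbitrary bin choices
freqs = [b * fs / n for b in bins]
x = sum(np.sin(2 * np.pi * f * t) for f in freqs)

# "High-rate" reference: the same signal sampled at 4 x 44.1 kHz.
t_hi = np.arange(4 * n) / (4 * fs)
x_hi = sum(np.sin(2 * np.pi * f * t_hi) for f in freqs)

# Reconstruct the high-rate samples purely from the 44.1 kHz samples by
# zero-padding the spectrum (ideal band-limited interpolation).
X = np.fft.rfft(x)
X_pad = np.zeros(2 * n + 1, dtype=complex)
X_pad[: n // 2 + 1] = X
x_rec = np.fft.irfft(X_pad, 4 * n) * 4         # scale for the longer IFFT

print("max reconstruction error:", np.max(np.abs(x_rec - x_hi)))

The printed error is down at floating-point rounding level (on the
order of 1e-12), i.e. there is no 'lost detail' for a higher sample
rate to recover.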

If OTOH you were arguing for 'highbit' in the sense of bit depth, i.e.
24 rather than 16, then the same argument applies. Increasing the bit
depth does *not* inherently increase precision, given that the input
dynamic range does not exceed that of the base bit depth. In the case
of audio, we know that there are no *master* tapes with more than
80-85 dB of dynamic range, while 16-bit already gives roughly 96 dB of
theoretical range, so 24-bit offers *no* advantage in precision or
'low level detail' over 16-bit.
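
The bit-depth side is simple arithmetic, using the standard
6.02N + 1.76 dB figure for an ideal N-bit quantiser driven by a
full-scale sine (my back-of-the-envelope sketch, same Python as above):

def dynamic_range_db(bits):
    # Theoretical SNR of an ideal N-bit quantiser, full-scale sine input.
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.1f} dB")

# 16-bit: ~98.1 dB    24-bit: ~146.2 dB
# Both sit well above the 80-85 dB of an analogue master tape, so the
# extra 8 bits merely digitise the tape's own noise floor more finely.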
--

Stewart Pinkerton | Music is Art - Audio is Engineering