dan lavry
 
16 bit vs 24 bit, 44.1 kHz vs 48 kHz <-- please explain

(Mike Rivers) wrote in message

You're doing better than most, if not all. However, the term that we
throw around with reckless abandon is "24-bit" and not "24 bits of
resolution." Nobody said that those lowest order bits had to actually
carry information, they just have to be there so that a 24-bit
receiver will recognize the format.


Well, perhaps you are correct for most cases. I will try to explain
my angle. I am not a recording or mastering engineer. I am an
equipment designer and manufacturer, so it does make a difference to
me when someone says: you sell 24 bits, your competition does too.
Both are 24 bits, and you cost more. It is the same thing with that
192 kHz crock. I actually did lose a big sale because I did not want
to join the "king has no clothes" parade and do 192 kHz.

I do not know the percentage of folks that would be influenced by such
nonsense. But there are some out there who would buy a 192 kHz, 24-bit
machine that has over 1 ns of jitter, 103 dB dynamic range, bad
distortion, and 10 ms recovery from clipping (overdrive)... and not
take the time to look at a 96 kHz, 20-bit machine with 110 dB dynamic
range, low distortion, 20 ps jitter, and 2 us recovery... for about
the same price. After all, one machine does 192/24; the other is only
96/20.
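
To put those numbers in perspective, here is a rough back-of-the-envelope sketch in Python. It only uses the textbook formulas (ideal quantization SNR of a full-scale sine, and the jitter-limited SNR ceiling for a full-scale sine); the figures plugged in are just the ones quoted above, not measurements of any particular box.

import math

def ideal_snr_db(bits):
    # Ideal quantization SNR for a full-scale sine: 6.02*N + 1.76 dB
    return 6.02 * bits + 1.76

def effective_bits(dynamic_range_db):
    # Invert the same formula to get an effective number of bits
    return (dynamic_range_db - 1.76) / 6.02

def jitter_limited_snr_db(signal_freq_hz, jitter_s):
    # SNR ceiling set by sampling jitter on a full-scale sine:
    # SNR = -20*log10(2*pi*f*tj)
    return -20 * math.log10(2 * math.pi * signal_freq_hz * jitter_s)

print(ideal_snr_db(24))                     # ~146 dB -- what "24 bits" promises on paper
print(effective_bits(103))                  # ~16.8 bits -- the 103 dB machine
print(effective_bits(110))                  # ~18.0 bits -- the 110 dB machine
print(jitter_limited_snr_db(20e3, 1e-9))    # ~78 dB ceiling at 20 kHz with 1 ns jitter
print(jitter_limited_snr_db(20e3, 20e-12))  # ~112 dB ceiling at 20 kHz with 20 ps jitter

In other words, the dynamic range and jitter numbers decide how many of the advertised bits actually mean anything.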

As far as I know, digital receivers can handle various data lengths up
to 24 bits just fine. They assume "a bunch of trailing zeros," so a
16-bit word with 8 trailing zeros looks like a 24-bit word. Yes, there
is the coding on the information side (declaring sample rate, bit
depth, emphasis...) and I do believe it defaults to 24 bits. It is not
wise to assume it. Most companies do not pay too much attention to the
information bits, certainly not on the D/A side.
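
As a minimal sketch of that trailing-zero padding (just Python showing the word layout; the channel-status/information bits are left out entirely):

def pad_16_to_24(sample_16):
    # Left-justify a signed 16-bit sample in a 24-bit slot by appending
    # 8 zero LSBs. The value is unchanged apart from scale, so a 24-bit
    # receiver sees a 24-bit word whose lowest 8 bits carry no information.
    return sample_16 << 8

def truncate_24_to_16(sample_24):
    # The reverse: drop the 8 LSBs (a real converter path would dither first).
    return sample_24 >> 8

assert pad_16_to_24(0x7FFF) == 0x7FFF00
assert truncate_24_to_16(pad_16_to_24(-12345)) == -12345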

I've just stirred up a discussion over on the Pro-Audio mailing list
about a related subject. How does someone who thinks that the
difference between a line level and mic level input is the kind of
connector used compare the gain and noise performance of a preamp
which has only a digital output (integrated A/D converter with no
user-adjustable calibration) with a straight analog preamp and an A/D
converter of unknown full-scale input sensitivity (and, needless to
say, unknown noise performance)? You can compare volts out to
volts in and get gain, or volts out for no volts in and get dB of
quiescent noise. But how do you relate volts in to dBFS on your DAW's
meter or headroom indicator? It's a different ball game, but trying to
explain that you have to think differently about these things requires
more learning than some people (who buy by looking at spec sheets)
want to bother with.


Good luck. I understand what you are saying. When you make a mic pre,
you must specify analog in and analog out. When you make an A/D, you
must specify the A/D... and so on. For an overall system one needs to
include the mic as well. Going for a high-end system, one can disable
the mic pickup and still account for the mic noise by replacing the
mic with a noise-equivalent physical resistor. One can measure the
whole system, or take the various parts and figure it out. But yes,
once past the A/D, you have to look at the digital outcome. I use
FFTs and digital distortion meters...
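
For what it is worth, relating volts in to dBFS is simple arithmetic once you know (or measure) the converter's full-scale input voltage. Here is a rough Python sketch; the +18 dBu full scale, 150 ohm source resistance, 20 kHz bandwidth and 60 dB of preamp gain are illustrative assumptions only, not the specs of any particular box.

import math

K_BOLTZMANN = 1.380649e-23   # J/K
DBU_REF_V   = 0.7746         # 0 dBu is about 0.775 V rms

def volts_to_dbfs(v_rms, full_scale_v_rms):
    # Level of an input voltage relative to converter full scale
    return 20 * math.log10(v_rms / full_scale_v_rms)

def resistor_noise_v(r_ohms, bandwidth_hz, temp_k=290.0):
    # Thermal (Johnson) noise of a resistor: sqrt(4*k*T*R*B), in volts rms
    return math.sqrt(4 * K_BOLTZMANN * temp_k * r_ohms * bandwidth_hz)

full_scale   = DBU_REF_V * 10 ** (18 / 20)   # assume the A/D clips at +18 dBu (~6.15 V rms)
source_noise = resistor_noise_v(150, 20e3)   # mic replaced by a 150 ohm resistor (~0.22 uV rms)
gain_db      = 60                            # assumed preamp gain ahead of the A/D

print(volts_to_dbfs(1.228, full_scale))                   # a +4 dBu tone reads about -14 dBFS
print(volts_to_dbfs(source_noise, full_scale) + gain_db)  # resistor noise at the A/D, about -89 dBFS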

I tend to be blessed with customers that know what they are doing,
both in terms of gear and the musical ear and artistic taste. So I
should not complain too much. But often, the big-time customers can
plain and simple do what it takes by using the best gear money can
buy. That is one way to go, if you have deep pockets, like the movie
industry. But there are a lot of folks that cannot go up against the
stops on everything. At that point, it is wise to look for the
performance bottleneck and improve things one at a time... That is
when that knowledge can come in handy...

BR

Dan L