Ultra-High Sample Rate Discussion
"Eeyore" wrote in
message
Arny Krueger wrote:
"TT" wrote in message
From my limited understanding I gained the impression
that 192 kHz was losing the 24-bit resolution.
In fact there are no practical converters operating at
any sample rate appropriate for audio that also deliver
true 24-bit resolution.
None deliver true 24 bits, for sure. The reason for 24-bit
converters is to ensure that the bits 'really doing the
work' are accurate. 20 accurate bits is hunky dory.
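To put numbers on that: the ideal signal-to-noise ratio of an N-bit
converter is about 6.02 * N + 1.76 dB, so 20 accurate bits already buys
roughly 122 dB. A quick back-of-the-envelope check in Python (just the
textbook formula, nothing converter-specific):

    # Ideal quantization SNR for an N-bit converter driven by a
    # full-scale sine wave: SNR = 6.02 * N + 1.76 dB
    for bits in (16, 20, 24):
        snr_db = 6.02 * bits + 1.76
        print(f"{bits} bits -> ~{snr_db:.1f} dB ideal SNR")

That 122 dB figure for 20 bits already exceeds the noise floor of any
practical analog chain in front of the converter.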
Older (16-bit) converters typically had serious
non-linearity problems with the bottom few bits, which
were clearly audible (and measurable).
Actually, some of the older converters had nonlinearity problems at mid
levels and even high levels.
In the early days, converters were based on networks of discrete resistors
whose temperatures didn't necessarily track perfectly. If they had tracked,
then changes in temperature would result in just a change in the overall
scale factor. But since the discrete resistors might drift separately, there
would be missing steps and steps that were too high. These kinds of errors
might be more likely for bits with low absolute magnitude, but they could
occur at high levels as well. Monolithic resistor networks initially lacked
the precision needed for the finest converters.
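To see why independent drift produces missing and oversized steps rather
than a simple gain change, here is a toy model (a sketch in Python of a
hypothetical binary-weighted DAC, not any particular product): each bit's
resistor weight drifts by a small random factor. If all the weights drifted
together, the transfer curve would just rescale; with per-bit drift, the
worst errors pile up at major code transitions.

    import random

    BITS = 8
    # Ideal binary weights: 128, 64, ..., 1
    ideal = [2 ** (BITS - 1 - i) for i in range(BITS)]
    # Each "resistor" drifts independently by up to +/-1%
    drifted = [w * (1 + random.uniform(-0.01, 0.01)) for w in ideal]

    def dac(code, weights):
        # Sum the weights of whichever bits are set in the code
        return sum(w for i, w in enumerate(weights)
                   if code & (1 << (BITS - 1 - i)))

    # Every step should be exactly 1 LSB; look for outliers
    steps = [dac(c + 1, drifted) - dac(c, drifted) for c in range(255)]
    print("min step:", round(min(steps), 3),
          "max step:", round(max(steps), 3))
    # The worst steps land at major transitions such as 127 -> 128,
    # where the MSB switches in and all lower bits switch out at once.

Run it a few times: with 1% per-bit drift, the 127 -> 128 step can come out
near zero or above 2 LSBs, i.e. a missing step or a step that is too high,
exactly as described above.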
One of the most common early digital recorders was made by 3M and had
field-adjustable converters. They were field-adjustable because they tended
to drift. If they weren't kept properly adjusted, the results were pretty
predictable: missing steps and wrong-sized steps. This is the
recorder of "Bop 'Till You Drop" infamy.
A number of early converters, including the converters in the CDP-101, were
based on 8-bit converters. The converter would do two conversions per
sample. The first conversion would be fed the 8 low-order bits and be
attenuated by a factor of 256. The second conversion would be based on the 8
high-order bits, but it would not be attenuated. Both conversions were
stored in a sort of sample-and-hold circuit that effectively added them,
held them, and then clocked out the correct voltage when both conversions
were complete. This system had the potential for larger errors at 256-step
intervals.
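As a sketch of the arithmetic (my reconstruction in Python of the scheme as
described above, not Sony's actual circuit): the summed output amounts to
the high byte plus the low byte divided by 256, so any error in the 256:1
attenuator repeats at every 256-code boundary.

    def two_pass_dac(code16, low_atten=256.0):
        # Split the 16-bit code into two 8-bit conversions
        high = (code16 >> 8) & 0xFF    # second pass, not attenuated
        low = code16 & 0xFF            # first pass, attenuated 256:1
        # The sample/hold effectively adds the two conversions;
        # the result here is expressed in 16-bit LSBs.
        return high * 256 + low * (256.0 / low_atten)

    # With a perfect 256:1 attenuator the transfer is exact:
    assert two_pass_dac(0x1234) == 0x1234

    # A 0.5% attenuator error puts a jump at every 256-step interval:
    errors = [two_pass_dac(c, low_atten=256 * 1.005) - c
              for c in range(0x0100, 0x0300)]
    print(min(errors), max(errors))  # worst just below each xx00 boundary

With the 0.5% error, the output sags by a little over one LSB as the low
byte climbs toward 255, then snaps back at the next 256-step boundary,
which is the kind of periodic error described above.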
However, highly effective converters have been available since the first
days of the CD format. In 1972 I worked with a hybrid computer that had
16-bit converters that were accurate down to the LSB and ran at something
like 200,000 conversions per second.
So-called Sigma-Delta converters became popular in the early 1990s. They are
inherently incapable of having missing codes, or steps that are
significantly oversized or undersized. They manifest their inaccuracies in a
form that looks more like random noise. IME these converters have very
little sample-to-sample variation. They are designed to have a certain
amount of resolution, and that's what they all deliver.
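A first-order Sigma-Delta modulator is simple enough to sketch (a toy
Python model, not any shipping converter): a 1-bit quantizer inside a
feedback loop that integrates the error. Because the output is just a
two-level stream whose density of ones tracks the input, there are no
per-code resistor weights to mismatch, hence no missing codes; the
quantization error gets spread out as noise instead.

    def sigma_delta(samples):
        # First-order sigma-delta: integrate the difference between
        # the input and the fed-back 1-bit output, then quantize.
        integ = 0.0
        fb = 0.0                 # previous 1-bit output as +/-1
        bits = []
        for x in samples:        # input assumed in [-1.0, +1.0]
            integ += x - fb
            fb = 1.0 if integ >= 0 else -1.0
            bits.append(fb)
        return bits

    # A steady input of 0.25 should yield about 62.5% ones,
    # since 2 * 0.625 - 1 = 0.25:
    stream = sigma_delta([0.25] * 10000)
    print(sum(1 for b in stream if b > 0) / len(stream))

Averaging (decimating) that bit stream recovers the input to whatever
resolution the oversampling ratio supports, which is why these converters
deliver the resolution they're designed for, sample after sample.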