  #1
Posted to comp.os.linux.advocacy,rec.audio.tech
chrisv
 
Digital Rules

I'll summarize my position below. Note that I'm limiting my summary
to high-fidelity music recording, delivery, and playback. Note also
that I don't mean to imply that anyone in cola is ignorant or is
repeating all the fallacies that I list below. My point is that
"many" repeat the fallacies out of their ignorance regarding digital
audio.

Regarding the matter of digital audio, consumers should be wary of
anyone who derides it due to alleged deficiencies such as
"quantization distortion" or "digital artifacts". Be especially wary
of (read: disregard anything they say) anyone who talks of "stair
steps" in digital audio, or who claims that digital, being only 1's
and 0's, "misses something" or "loses something" in the
record/playback process that analog processes do not.

It's not surprising that many people cannot understand how you can
sample, for example, a 15kHz sine wave at CD's 44.1kHz rate, taking
fewer than 3 samples per cycle, and later use that data to reproduce
the wave so well that the distortion is not even measurable, much
less audible. But it can be done, and it is. Keep in mind also that
CD is kind of "worst case" in the arena of high-fidelity,
uncompressed digital audio. Digital audio is recorded and mastered
at much higher sampling rates and bit depths.
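
If you'd rather see it than take my word for it, here's a quick
Python/numpy sketch (my own toy example, nothing canonical): sample a
15kHz sine at 44.1kHz and rebuild the waveform between the samples
with the standard Whittaker-Shannon (sinc) interpolation. What little
error remains in the middle of the block comes from truncating the
infinite sum, not from the sampling itself.

import numpy as np

fs = 44100.0                          # CD sampling rate, Hz
f0 = 15000.0                          # test tone, Hz (< 3 samples per cycle)
n = np.arange(2048)                   # sample indices
x = np.sin(2 * np.pi * f0 * n / fs)   # the sampled 15kHz sine

# Whittaker-Shannon reconstruction on a 16x finer grid:
#   x(t) = sum_k x[k] * sinc(t/T - k), with t measured in sample periods.
t = np.arange(0.0, len(n), 1.0 / 16.0)
recon = np.array([np.sum(x * np.sinc(ti - n)) for ti in t])
ideal = np.sin(2 * np.pi * f0 * t / fs)

# Compare away from the block edges, where truncating the (in principle
# infinite) sinc sum is what limits accuracy, not the sampling.
mid = slice(len(t) // 4, 3 * len(t) // 4)
print("peak reconstruction error: %.2e"
      % np.max(np.abs(recon[mid] - ideal[mid])))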

The simplistic and erroneous analyses that result in proclamations of
"quantization distortion" or "stair stepping" come from people who do
not understand that these are solved issues. Dither, used properly in
the record/mastering process, _eliminates_ quantization distortion,
for the small trade-off of about 3dB of noise - noise that, even on
lowly CD, is so far below the peak output level that it's never
audible, unless you have your volume cranked to extremely high levels.
(I'll leave it as an exercise for the reader to figure out how loud
the musical peaks have to be before 93dB-down noise becomes audible
over your amplifier's hiss, your refrigerator, or your own breathing,
for that matter.)
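
If you want to see what dither actually does, here's another quick
numpy sketch (again just my own toy example, using an arbitrary
-80dBFS test tone): quantize a very low-level 1kHz sine to 16 bits
with and without TPDF dither and look at the odd-harmonic bins.
Undithered, the error is correlated with the signal and shows up as
harmonic distortion; dithered, those harmonics sink into a benign
noise floor.

import numpy as np

fs = 44100
n = np.arange(fs)                                 # one second of samples
x = 0.0001 * np.sin(2 * np.pi * 1000 * n / fs)    # ~-80 dBFS 1kHz tone
lsb = 2.0 / 2**16                                 # one 16-bit step (full scale +/-1)

plain = np.round(x / lsb) * lsb                   # undithered quantization
tpdf = (np.random.rand(len(x)) - np.random.rand(len(x))) * lsb  # TPDF dither
dithered = np.round((x + tpdf) / lsb) * lsb

def bin_db(y, freq):
    """Windowed spectrum level (dB re full scale) at the given frequency."""
    w = np.hanning(len(y))
    s = np.abs(np.fft.rfft(y * w)) / (np.sum(w) / 2)
    f = np.fft.rfftfreq(len(y), 1.0 / fs)
    return 20 * np.log10(s[np.argmin(np.abs(f - freq))] + 1e-20)

for name, y in (("undithered", plain), ("dithered  ", dithered)):
    print(name, "3rd/5th/7th harmonics (dBFS):",
          ["%.1f" % bin_db(y, k * 1000) for k in (3, 5, 7)])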

As for the "problem" of the small number of samples available for
capturing high frequencies, as in my 15kHz example above; after your
CD player low-passes the signal, it's a nice, clean, 15kHz sine wave.
Believe it, as it is a fact beyond dispute. This digital audio stuff
works, man. It's figured-out, even if not by you and me. 8)

Can digital audio be messed up, whether by negligence or by design?
Of course it can. Many early CDs sounded terrible, and it's a darn
shame that we have a music-delivery system (CD) that is ubiquitous and
boasts a massive 93dB of dynamic range, yet most popular CDs are
mastered with the dynamics of the music compressed into the top few
dB available.

The bottom line is that recording, processing, and delivering audio
digitally _far_ outperforms analog methods. Distortion and noise are
_greatly_ reduced over the best that is possible using analog methods.
Reduced so much, in fact, that added noise and distortion are not only
_not_ audible, but AFAIK are not even measurable by the most sensitive
of instruments. In an experiment with a microphone running directly
into an amplifier and speaker compared to the same setup but with a
state-of-the-art 192kHz/24b A/D-D/A process in-between, there would be
_no way_ that any listening human could tell the difference. Anyone
who claims that they could is "full of it". Even "downgrading" the
sound to CD levels would be imperceptible to the vast majority of
people.

Is digital audio "perfect"? Sorry, nothing, that I am aware of, is
perfect. Even Victoria's Secret models have flaws, but a man should
not complain if he managed to obtain one for himself. 8)

Digital audio is not intuitive. Even technical people often do not
"get it". I'm far from an expert myself, but I am an electrical
engineer and audio enthusiast who is very interested in this topic,
and who has done a reasonable amount of research on it. The result of
my research, which includes witnessing many debates, is that I've
reached some conclusions, listed above and below, that I'm fairly
certain are correct.

One of these conclusions is that, while I personally do not know
everything there is to know about digital audio, a lot of really smart
people have gotten it all figured out, and it works damn well.

Another conclusion that I have reached is that CD, as a music-delivery
system, is actually quite good - excellent, IMO, and so I have no
desire for either turntables or SACD. If I could, I would snap my
fingers and give CD a 96kHz sampling rate so that we could put the
bandwidth issue to bed, but I'm highly skeptical that anyone over the
age of 12 would notice the difference. On the other hand, CD's 16
bits is way plenty. The extra headroom of 24 bits is useful in the
recording and studio processes, but in the music-delivery system (the
silver disk), _all_ it gets you is more dynamic range (lower noise
floor), and CD already has more than enough at 93dB effective. No,
going to 24 bits here does _not_ get you more accuracy by "making the
stair-steps smaller".

At the end of the line, at the output of your CD player, there are
_no_ "stair steps". Assuming proper mastering of the CD, there are
_no_ audible "digital artifacts". So everyone should just go out and
buy themselves a great pair of speakers and enjoy their CDs. Feel
free to disagree with any or all of the above, but don't expect me to
read or respond to any responses to this message (unless a quick
glance reveals a congratulatory response 8). I think that I have the
issue pretty well figured out (and my CD-based stereo system sounds
fabulous).

P.S. I also know a thing or two about room acoustics, and I have over
200 square-feet of custom-made acoustical treatments in my main
music/home-theater room. No, I currently have no desire to discuss
this issue further, either. 8)

  #2
Posted to comp.os.linux.advocacy,rec.audio.tech
Digital Rules


chrisv wrote:
> Regarding the matter of digital audio, consumers should be wary of
> anyone who derides it due to alleged deficiencies such as
> "quantization distortion" or "digital artifacts". Be especially wary
> of (read: disregard anything they say) anyone who talks of "stair
> steps" in digital audio, or who claims that digital, being only 1's
> and 0's, "misses something" or "loses something" in the
> record/playback process that analog processes do not.


This is all old news, having been covered numerous times by any
number of people from as far back as 1978 (cf. Blesser, JAES) and
even by myself over the last decade or so.

The problem is that any number of people have simplified their
lives by not letting facts intrude on their agendas.

> It's not surprising that many people cannot understand how you can
> sample, for example, a 15kHz sine wave at CD's 44.1kHz rate, taking
> fewer than 3 samples per cycle, and later use that data to reproduce
> the wave so well that the distortion is not even measurable, much
> less audible. But it can be done, and it is. Keep in mind also that
> CD is kind of "worst case" in the arena of high-fidelity,
> uncompressed digital audio. Digital audio is recorded and mastered
> at much higher sampling rates and bit depths.

> The simplistic and erroneous analyses that result in proclamations of
> "quantization distortion" or "stair stepping" come from people who do
> not understand that these are solved issues. Dither, used properly in
> the record/mastering process, _eliminates_ quantization distortion,
> for the small trade-off of about 3dB of noise - noise that, even on
> lowly CD, is so far below the peak output level that it's never
> audible, unless you have your volume cranked to extremely high levels.


Dither is the simplest of such processes, and now much if not
most audio uses one variety or another of high-order noise
shaping.
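
For anyone who wants to see the idea, the simplest possible form of
it looks like this (a first-order, error-feedback sketch of my own;
the shapers actually used in mastering are higher-order and
psychoacoustically weighted):

import numpy as np

def requantize_noise_shaped(x, bits=16):
    """Requantize x (floats, full scale +/-1) to `bits` with TPDF dither
    and first-order error feedback, pushing the quantization noise up in
    frequency, away from where the ear is most sensitive."""
    lsb = 2.0 / 2**bits
    out = np.empty_like(x)
    err = 0.0
    for i, s in enumerate(x):
        v = s - err                                      # feed back last error
        d = (np.random.rand() - np.random.rand()) * lsb  # TPDF dither
        q = np.round((v + d) / lsb) * lsb                # quantize to target word
        err = q - v                                      # error for next sample
        out[i] = q
    return out

The total error (output minus input) is the first difference of the
per-sample quantization error, which is exactly the high-pass shaping
being described.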

As for the "problem" of the small number of samples available for
capturing high frequencies, as in my 15kHz example above; after your
CD player low-passes the signal, it's a nice, clean, 15kHz sine wave.
Believe it, as it is a fact beyond dispute. This digital audio stuff
works, man. It's figured-out, even if not by you and me. 8)


It's actually very simple: A digital audio stream consists of the
original signal, or "base band" (the part up to half the sampling
rate) and ALL images of that baseband from half the sampling
rate to infinite frequency. That's the intrinsic nature of time-
sampled audio.

IF you looked at the sum of the baseband signal and all of its
images, indeed, you WOULD see these steps. However, the
purpose of the "anti-imaging" filter at the output of the DAC is
simply to eliminate all the images and pass only the baseband audio.

Look at it another way: for these "steps" to happen requires VERY
quick transitions, i.e., infinitesimally short rise times. Those
signal transitions ALL occur OUTSIDE of the baseband signal, i.e.,
they are all above half the sampling frequency. Eliminate everything
above half the sampling frequency, and you remove all those
components which make up the steps.

And look at it yet another way: Because the signal is filtered to
a bandwidth less than half the sampling rate BEFORE it's sampled,
there is NO "missing data" or "holes" between samples. The fact
that the bandwidth is set means that the signal can take one and
only one trajectory (based on the previous and following samples)
between two samples. Any other path would represent spurious added
information not in the original band-limited signal. The anti-
imaging filter, having essentially the same bandwidth as the
original antialiasing filter, eliminates all these spurious
components.
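
Here is a numerical illustration of exactly that (my own sketch; the
16x oversampled zero-order hold stands in for the "staircase", and
the particular filter is an arbitrary choice): the 15kHz tone's first
images at 29.1kHz and 59.1kHz are plainly present in the staircase
and gone after the anti-imaging low-pass, while the baseband tone
sails through. (The zero-order hold also droops the 15kHz level
slightly; real DACs compensate for that, which is beside the point
here.)

import numpy as np
from scipy.signal import firwin, filtfilt

fs, f0, ratio = 44100, 15000, 16        # CD rate, test tone, oversampling factor
n = np.arange(8192)
x = np.sin(2 * np.pi * f0 * n / fs)     # the sampled 15kHz sine
stair = np.repeat(x, ratio)             # zero-order-hold "staircase"

# Anti-imaging low-pass at half the ORIGINAL sampling rate, applied on
# the oversampled grid (forward-backward, so no phase shift).
taps = firwin(1023, (fs / 2.0) / (fs * ratio / 2.0), window="blackmanharris")
smooth = filtfilt(taps, [1.0], stair)

def level_db(sig, freq):
    """Windowed spectrum level (dB) at the given frequency."""
    w = np.hanning(len(sig))
    s = np.abs(np.fft.rfft(sig * w)) / (np.sum(w) / 2)
    f = np.fft.rfftfreq(len(sig), 1.0 / (fs * ratio))
    return 20 * np.log10(s[np.argmin(np.abs(f - freq))] + 1e-20)

for freq in (f0, fs - f0, fs + f0):     # baseband tone and its first images
    print("%6d Hz: staircase %8.1f dB   filtered %8.1f dB"
          % (freq, level_db(stair, freq), level_db(smooth, freq)))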

> The bottom line is that recording, processing, and delivering audio
> digitally _far_ outperforms analog methods. Distortion and noise are
> _greatly_ reduced over the best that is possible using analog methods.
> Reduced so much, in fact, that added noise and distortion are not only
> _not_ audible, but AFAIK are not even measurable by the most sensitive
> of instruments.


Actually, they are. I do it routinely.

> In an experiment with a microphone running directly
> into an amplifier and speaker compared to the same setup but with a
> state-of-the-art 192kHz/24b A/D-D/A process in-between, there would be
> _no way_ that any listening human could tell the difference.


That might be, but that's not your claim. You stated that the
difference is "not even measurable by the most sensitive of
instruments," and, even given state-of-the-art 192 kHz/24-bit
systems, the differences ARE measurable.
Now, having got that all off your chest, how do you feel?

More importantly, with the already enormous amount of research
and material out there, and the fact that many of the people who
are making some of the more outlandish anti-digital claims have
been led gently to these materials with no apparent success, do
you think you're going to change their minds?

Nope: agendas are things often blissfully immune to fact. Audio
is hardly unique in that respect.
