Joseph Meditz

A seminal paper from MIT shows that distortion related to sampling
must consider both the sample rate and the target word size. For
today's CDs, that is 16 bits. Thus, according to this paper, a minimum
of 8X frequency is required; 10X is better. Working backwards, that
means that CD technology can only reproduce, at best, 5.5 kHz before
distortion starts to enter in. This is independent of the construction
of filters and assumes a boxcar filter (impossible in real life).



Please cite the paper, as this is contrary to current theory - and
more importantly, to current measurements, which demonstrate that
44.1 kHz sampling is adequate for *perfect* capture of any waveform
within a 22 kHz bandwidth.
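
As a quick numerical check of that point (a sketch only; the 15 kHz
test tone, the 512-sample window, and NumPy are my own arbitrary
choices), sample a tone at 44.1 kHz and rebuild it *between* the
samples with Whittaker-Shannon sinc interpolation. The only residual
comes from truncating the sum to a finite window, not from the
sampling itself:

import numpy as np

fs = 44100.0                      # CD sample rate, Hz
f0 = 15000.0                      # test tone: far above 5.5 kHz, below fs/2
n = np.arange(512)                # sample indices
x = np.sin(2 * np.pi * f0 * n / fs)   # the 44.1 kHz samples

# Reconstruct on a 10x finer time grid using only those samples.
t = np.arange(100, 412, 0.1)      # interior points, in units of the sample period
x_rec = np.sinc(t[:, None] - n[None, :]) @ x

# Compare against the ideal continuous-time tone at the same instants.
x_true = np.sin(2 * np.pi * f0 * t / fs)
print(f"max interior error: {np.max(np.abs(x_rec - x_true)):.1e}")
# The residual is due only to truncating the sinc sum to a finite window;
# it shrinks as the window grows, consistent with exact capture below fs/2.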



Other solutions have worked hard to reduce this problem by
oversampling, adding bits, etc. All these solutions smooth the
distortion created by the original system, but they can not add
information back in that is lost. What they can do is create better
sounding music by smoothing out the jaggies in the distortion.



There is *no* distortion. Cite the paper, or cite *any* measurements
which can demonstrate such distortion. Otherwise go away, troll.

--

Although the OP is tangled up in his own underwear, I think that he's
alluding to the relationship between sampling rate and quantization
noise.
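
To put a number on it: with 16 bits the quantization-error power sits
roughly 6.02*16 + 1.76 = 98 dB below a full-scale sine no matter how
fast you sample; a higher sample rate only spreads that fixed error
over a wider band so more of it can be filtered back out. A small
sketch along those lines (my own illustration, assuming NumPy; the
997 Hz tone, the TPDF dither, and the 4x factor are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
bits = 16
step = 2.0 / (2 ** bits)            # quantizer step for a +/-1 full-scale range

def quantize(x):
    # Non-subtractive TPDF dither (+/-1 LSB), the usual way to keep the
    # quantization error uncorrelated with the signal (i.e. white).
    dither = (rng.random(x.size) - rng.random(x.size)) * step
    return np.round((x + dither) / step) * step

fs_base = 44100.0                   # CD rate
m = 4                               # hypothetical oversampling factor
f0 = 997.0                          # full-scale test tone, Hz
N = 1 << 18                         # samples at the oversampled rate

t = np.arange(N) / (fs_base * m)
x = np.sin(2 * np.pi * f0 * t)
e = quantize(x) - x                 # quantization error at the 4x rate

# Split the error power at the 22.05 kHz band edge of the base rate.
E = np.fft.rfft(e)
freqs = np.fft.rfftfreq(N, d=1.0 / (fs_base * m))
p_total = np.sum(np.abs(E) ** 2)
p_inband = np.sum(np.abs(E[freqs <= fs_base / 2]) ** 2)

print(f"error below full-scale sine : {10 * np.log10(0.5 / np.mean(e ** 2)):.1f} dB")
print(f"share of error left in band : {10 * np.log10(p_inband / p_total):.1f} dB")
# Expect roughly 93 dB (the 98 dB textbook figure less ~5 dB spent on dither),
# and about -10*log10(4) = -6 dB in band: three quarters of the noise power
# now sits above 22.05 kHz, where a lowpass can remove it.

So oversampling and added bits do not "smooth jaggies" in the
waveform; they lower how much quantization noise is left in the audio
band.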

Joe