Posted to rec.audio.pro
From: Scott Dorsey
Subject: A/D converter - replace crystal unit?

> Due to the low cost of both the soundcards and the crystals, it would
> be pretty easy to set up a comparative test. If you have more than one
> card, you normally sync them up via SPDIF and choose one as the
> "master clock". You could install the new crystal in one card, then
> run a parallel A/D and/or D/A test, specifying a different card as the
> master clock for each run. Sound good?


Odds are he's doing more than just changing the crystal to clean up the
clock.

I do question, though, the wisdom of spending money upgrading a cheap soundcard
to improve its sound quality, when there are known-good A/D boxes available
off the shelf for a little more money. You upgrade this, then upgrade
that, and the next thing you know you could have bought the Benchmark with
what you spent.

> Great! But how do I analyze the results? Arny, I know you're familiar
> with these cards -- any recommendations on how I can test for jitter?
> (Sorry to single you out, but I've read your posts on the 1010LT, and
> I figured you'd sympathize with a fellow user.)


Problem is that you need a signal source that is really good. For example,
if you record a 1 KC sine wave, clock jitter will show up as little sidebands
around the main peak when you do an FFT. But if you record a 1 KC sine wave
from an HP 200CD oscillator like the one on my bench, you'll see lots of little
sidebands that came from the oscillator itself and aren't artifacts of the
conversion. In order to measure the converter quality, you need accurate
signal sources and analysis tools that are an order of magnitude tighter
than what you're trying to measure. That's why this is so hard.
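
A rough sketch of that kind of FFT check, assuming a Python/NumPy setup and a
capture saved as "capture.wav" (the filename and the 900-1100 cycle window are
just illustrative placeholders):

# Rough FFT check for jitter sidebands around a recorded 1 KC tone.
# Assumes a capture saved as "capture.wav"; the name is a placeholder.
# The source tone must be far cleaner than the converter, or you end up
# measuring the oscillator, not the card.
import numpy as np
from scipy.io import wavfile

rate, data = wavfile.read("capture.wav")
x = data.astype(np.float64)
if x.ndim > 1:
    x = x[:, 0]                 # keep one channel if the capture is stereo
peak = np.max(np.abs(x))
if peak > 0:
    x = x / peak                # normalize so levels read relative to the peak

# A window keeps spectral leakage from burying the low-level sidebands.
w = np.blackman(len(x))
spec = np.abs(np.fft.rfft(x * w))
freqs = np.fft.rfftfreq(len(x), 1.0 / rate)
mag_db = 20.0 * np.log10(spec / spec.max() + 1e-12)

# Print everything within 100 cycles of the 1 KC peak; anything there that
# is not in the source itself is a candidate for clock jitter or noise.
band = (freqs > 900.0) & (freqs < 1100.0)
for f, m in zip(freqs[band], mag_db[band]):
    print("%8.1f Hz  %7.1f dB" % (f, m))

The same caveat applies to whatever is driving the input: close-in sidebands
can come from the oscillator or the analysis chain just as easily as from
the converter clock.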
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."