Posted to rec.audio.tubes
Amplifier Burst Testing


John Stewart reminds us that:

"A paper by Anthony New published in Electronics World makes a very good
argument that THD testing is a waste of time."

But with most tube amps used for hi-fi, which operate in class A1 for nearly all the power required for home listening, measuring THD is actually not such a bad way to assess or compare amps. RDH4 gives approximate factors for estimating IMD once the THD has been measured. This assumes only two test frequencies are applied at the amp input, say 1 Vrms of 80 Hz and 0.25 Vrms of 5 kHz. The 80 Hz may then be fairly easily filtered out of the sample voltage at the amp output, leaving just the 5 kHz, which may be examined on a CRO to estimate the percentage of amplitude modulation caused by the presence of the 80 Hz.
The 5 kHz wave may be passed through a diode plus R-C detector circuit to measure low levels of AM, which can be difficult to see on a CRO if the modulation is below 1%.
This saves the home constructor the effort of building a special filter capable of rejecting both 80 Hz and 5 kHz while passing just the sidebands at 4,920 Hz and 5,080 Hz. Nobody would bother with such an effort now, because software and a PC make it +90 dB easier, even though much is learnt by building one's own analog gear, which is what we used before PCs and digitalia came along.
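To show what the software-and-PC route looks like, here is a minimal Python sketch: synthesize the 4:1 two-tone test (1 Vrms of 80 Hz plus 0.25 Vrms of 5 kHz), push it through a toy nonlinearity, and read the 4,920 Hz and 5,080 Hz sidebands off an FFT. The 0.2% second-order term, sample rate and levels are all assumptions standing in for a real amp's transfer curve, not measurements of any actual amplifier.

```python
# Two-tone IMD by FFT: a toy stand-in for the analog sideband filter.
import numpy as np

fs = 96000                        # sample rate (Hz); 1 s of signal gives 1 Hz bins
t = np.arange(fs) / fs
x = np.sqrt(2) * 1.00 * np.sin(2 * np.pi * 80 * t) \
  + np.sqrt(2) * 0.25 * np.sin(2 * np.pi * 5000 * t)

y = x + 0.002 * x**2              # toy asymmetric nonlinearity (assumed, not measured)

spec = np.abs(np.fft.rfft(y * np.hanning(len(y))))

def peak(f):                      # spectrum magnitude near frequency f
    i = int(round(f))             # 1 Hz bins, so bin index equals frequency
    return spec[i - 2:i + 3].max()

# IMD as sideband sum relative to the 5 kHz carrier
imd = 100 * (peak(4920) + peak(5080)) / peak(5000)
print("IMD = %.3f %%" % imd)      # -> about 0.566 %
```

Because both tones land exactly on FFT bins, the window affects carrier and sidebands identically and the ratio comes out clean; with real sampled audio you would pick the largest bin in a small window around each expected frequency, as `peak()` does.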
In fact I made a BPF tunable between 1.4 kHz and 11 kHz, with Q = 50, to pick out any frequency between 2 kHz and 10 kHz, including IMD products. Only three op-amps are needed, with some overlapping feedback paths plus a good pot. Some schematics are online, or in old books.
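For anyone wanting to reproduce such a filter, the core arithmetic of the usual three-op-amp two-integrator (state-variable) topology is simple: the two integrators set the center frequency at f0 = 1/(2*pi*R*C), and the damping network sets Q. The 10 nF capacitor and the pot resistances below are illustrative assumptions, not the values from my actual unit.

```python
# Rough design arithmetic for a state-variable bandpass filter.
import math

C = 10e-9                              # integrator capacitors (assumed 10 nF)

def f0(R):
    """Center frequency of the two-integrator loop: f0 = 1/(2*pi*R*C)."""
    return 1.0 / (2 * math.pi * R * C)

def bandwidth(R, Q=50.0):
    """-3 dB bandwidth of the bandpass output for a given Q."""
    return f0(R) / Q

# Rough ends of a dual-gang tuning pot's travel (assumed values):
for R in (1.45e3, 11.4e3):
    print("R = %7.0f ohm -> f0 = %7.1f Hz, BW = %5.1f Hz" % (R, f0(R), bandwidth(R)))
```

Note the inverse relationship: the low-resistance end of the pot gives the high end of the tuning range (about 11 kHz here) and vice versa, and at Q = 50 the passband is only a few tens of hertz wide, narrow enough to separate the 4,920 Hz and 5,080 Hz sidebands from the 5 kHz carrier.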
It was assumed that the "standard" IMD test with a 4:1 ratio of LF to HF was the best test for any tube amp, because the OPT iron behaviour, the tubes and the RC couplings cause their worst nonlinearity at bass frequencies. That is, a large bass signal of 80 Hz, or worse, 30 Hz, will upset the clean production of 5 kHz more than two signals of, say, 1 kHz and 5 kHz would, even if the 1 kHz is four times the level of the 5 kHz.


So, at the end of the day, if the tube amp is used at levels where nothing ever clips, the THD is around 0.1% based on 1 kHz testing, and the open-loop bandwidth (with no global NFB) is say 30 Hz to 40 kHz, then the music is usually judged subjectively pleasing by most people. If one then compares it with a Halcro SS amp capable of undetectable THD levels at 1 watt, few would hear any difference in a blind A-B test where they are asked to identify which has less distortion. If asked which sounds best, maybe the class A triode amp will be chosen more often, but this depends on the source and recordings etc, and the 1,001 other things that have been endlessly discussed at rec.audio.hi-end, with moderators having to weed out flamers.

I myself have conducted blind comparison tests between amps making THD of 0.1% and 0.0002%, and found the listeners concluded that "the same ******* designed and built both amps, which were both OK". And indeed the 2 x 300W class AB mosfet amp with 6 mosfets was undetectable when substituted by a "mystery switch" for a 2 x 65W class AB tube amp with 4 x 6550 per channel.

So once I was able to routinely make tube amps with 0.1% THD at 1 kHz at 50 watts, and with wide open-loop BW, I didn't need to check IMD; it was always low enough. Other factors such as noise, Rout, BW and stability were always important.

I also found that someone could make the world's best amp in a back shed and nobody really wanted to pay the world price; they would always go to some brand name, which so often was a technical POS, but with perceived "investment value". So I concluded that many people assess the sound of the music by the price tag of the amp, or the bling factor, and just as likely hardly know what real music is.

I'm now happily retired and I won't be an underpaid volunteer expert any more.
Patrick Turner.