Posted to rec.audio.tech
Trevor Wilson[_2_]

Stereo receivers: THD of .04 vs. .08


"Arny Krueger" wrote in message
...
"Eeyore" wrote in
message
JANA wrote:

With the distortion you mentioned, there will be no
difference that you can hear. Even at 0.1% THD
distortion, you cannot hear that.


The distortion spec is measured at close to full power.


The real distortion at lower powers where most of the
listening is done may be as much as TEN times higher with
typical Class AB bipolar transistor designs.


No way with modern amps. Any big rises at low levels that may be shown in
published tests is due to noise. The truth outs if someone actually
publishes a spectral analysis.

For one thing, modern output transistors have Ft up in the megahertz
range. Back in the 70s people were still struggling with output devices
with Ft in the KHz range.


**Nope. The Japanese had a goodly number of complementary devices with
decent voltage, current and SOA ratings, and with fT figures extending
well into the tens of MHz. By 1974, Hi-Rel had true triple-diffused
TO-3 devices with a 20MHz fT and a 20 Amp current rating. Exceptionally
linear and very potent devices. I recall them vividly, because at the
time Motorola had announced their intention to commit to epitaxial-base
devices (fT ca. 2MHz), on the grounds that (according to Motorola)
triple-diffused devices were impossible to produce. Other manufacturers
also had some impressive devices back then. The Marantz 2325 receiver
used 2SA747/2SC1116 devices. These possessed an fT of 15MHz.

Having said all that, good gain linearity over a wide current range is
a relatively recent development.
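
Regarding Arny's spectral analysis point above: here is a minimal
sketch, in Python with NumPy, of how an FFT lets you separate the
harmonic distortion products from the noise floor. The tone frequency,
harmonic levels and noise level are made-up illustrative numbers, not
measurements of any particular amplifier.

# A rough sketch, not anyone's published test procedure: FFT-based
# separation of harmonic distortion from the noise floor. All signal
# levels are hypothetical, chosen only for illustration.
import numpy as np

fs = 96_000                       # sample rate, Hz
n = 1 << 16                       # FFT length (65536 samples)
f0 = fs * 1024 / n                # 1.5 kHz test tone, on an exact FFT bin
t = np.arange(n) / fs
rng = np.random.default_rng(0)

# Simulated amplifier output: fundamental plus small 2nd and 3rd
# harmonics (roughly -70 dB and -74 dB) and broadband noise.
x = (np.sin(2 * np.pi * f0 * t)
     + 0.0003 * np.sin(2 * np.pi * 2 * f0 * t)
     + 0.0002 * np.sin(2 * np.pi * 3 * f0 * t)
     + rng.normal(0.0, 0.001, n))

# Windowed magnitude spectrum.
w = np.hanning(n)
spec = np.abs(np.fft.rfft(x * w))
freqs = np.fft.rfftfreq(n, 1 / fs)

def band_power(f_target, width=5):
    # Sum the power in a few bins around a target frequency.
    k = int(np.argmin(np.abs(freqs - f_target)))
    return float(np.sum(spec[k - width:k + width + 1] ** 2))

p_fund = band_power(f0)                                 # fundamental
p_harm = sum(band_power(k * f0) for k in range(2, 10))  # harmonics 2-9
p_band = float(np.sum(spec[(freqs >= 20) & (freqs <= 20_000)] ** 2))

thd = np.sqrt(p_harm / p_fund)                 # harmonics only
thd_n = np.sqrt((p_band - p_fund) / p_fund)    # everything but the tone

print(f"THD   ~ {100 * thd:.3f} %")
print(f"THD+N ~ {100 * thd_n:.3f} %")

With these made-up numbers, the harmonics-only figure comes out at a
few hundredths of a percent, while the THD+N figure reads roughly three
times higher because the broadband noise dominates. That is the gap a
published spectral plot makes obvious, and why a THD+N number taken at
low power can climb without the amplifier actually becoming less
linear.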

Trevor Wilson