Arny Krueger
 
Subject: old solid state circa '70s-'80s

Nice post Trevor. FWIW, I agree with *every* point raised therein.


"Trevor Wilson" wrote in message ...

"Bob Morein" wrote in message ...


"Chad Williams" wrote in message ...
In researching solid state integrated amps/receivers I've come across
several proponents of old receivers, circa '75 to early '80s, who say
that these solid state systems are every bit as good as anything being
made now. While I don't disbelieve this statement, I'd like to
understand why this is the case. Was the build quality simply better
back then? Are the transformers higher quality?

I notice on ebay that a lot of these "vintage" models aren't even that
cheap. Some sell for well over $100. For another $100-200 you could
have something new, with new technology and a remote. So why would you
buy old?


It's not true.
If you're into vintage equipment, and you want that sound, or that memory,
then, of course, it might be worth it to you.
However, in terms of quality of the amplifier, it is definitely false,
for several reasons:

1. Designs in the 70's suffered from transient intermodulation
distortion (TIM). Around 1979, this was discovered and eliminated.


**Not all designs from the 1970s suffered from TIM.


2. Bipolar transistors suffer from "thermal runaway", which occurs when
a small area of the junction heats up locally and becomes more active
than the rest of the transistor. Once it starts, the transistor is
quickly destroyed. The only solution available in the 1970's was
brick-wall current limiting. However, amplifiers which use this kind of
protection cannot handle the dynamic range of a CD at greater than low
volume.


**Nope. Some 1970s amplifiers incorporated several mechanisms to prevent
such damage, without resorting to severe forms of current limiting (the
worst being 'foldback limiting').
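The runaway mechanism Bob describes is a positive feedback loop: dissipation heats the junction, heat raises collector current, and the higher current dissipates still more. A toy model makes it concrete - every device number below is an illustrative assumption, not data for any real transistor:

```python
# Toy model of BJT thermal runaway (all figures are illustrative assumptions).
# At a fixed V_BE, collector current roughly doubles for every 10 C rise in
# junction temperature; dissipation heats the junction through a thermal
# resistance, closing the feedback loop.
VCE = 40.0     # volts across the output device (assumed operating point)
THETA = 2.0    # junction-to-ambient thermal resistance, C/W (assumed)
T_AMB = 25.0   # ambient temperature, C
T_MAX = 150.0  # typical silicon junction limit, C
IC0 = 0.5      # quiescent collector current at T_AMB, amps (assumed)

ic, t_j, steps = IC0, T_AMB, 0
while t_j < T_MAX and steps < 100:
    steps += 1
    t_j = T_AMB + THETA * VCE * ic          # dissipation heats the junction
    ic = IC0 * 2 ** ((t_j - T_AMB) / 10.0)  # ~doubles per 10 C at fixed V_BE
# With these numbers the junction blows past T_MAX within a few iterations.
```

Current limiting breaks the loop by clamping the current, which is exactly why the crude brick-wall form also chokes high-level transients - the trade-off the two posters are arguing about.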


3. The noise figure of bipolar transistors dropped about 10 dB around
1980. Prior to that, equipment had an S/N ratio of around 70 dB. After
1980, S/N ratios of 90 dB and greater became the norm.


**Really? Here are some Marantz models I am familiar with, their
approximate release dates and their high level S/N figures:

Model 18 - 1968 - 80dB
Model 3800 - 1972 - 100dB
Model 500 - 1972 - 106dB
Model 250M - 1975 - 106dB

The S/N ratio of any amplifier using modern-style silicon transistors can
easily exceed 90dB. Most important is layout and construction, not the
actual devices used. This truism applies to pretty much any amplifier
manufactured since 1965 or so. What DID become important by the early
1980s was that with the emerging digital media, amplifier noise was going
to be an important issue. Manufacturers had to place more emphasis on
proper techniques. The same techniques which Marantz employed with the
Model 500 and Model 3800 units.
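To put those dB figures in linear terms, here is a quick sketch using the standard voltage-ratio definition of S/N; the 1 V reference level is an assumption for illustration:

```python
def sn_db_to_noise(signal_volts, sn_db):
    """Noise voltage implied by an S/N ratio in dB (20*log10 voltage convention)."""
    return signal_volts / (10 ** (sn_db / 20.0))

# For an assumed 1 V output signal:
noise_70 = sn_db_to_noise(1.0, 70.0)    # ~316 microvolts of residual noise
noise_90 = sn_db_to_noise(1.0, 90.0)    # ~32 microvolts
noise_106 = sn_db_to_noise(1.0, 106.0)  # ~5 microvolts
```

So the jump from a 70dB amplifier to a Model 500-class 106dB one is roughly a 60-fold drop in residual noise at the same output level - large on paper, though as Trevor notes it is layout and construction, not the transistors, that gets you there.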


4. In 1981, David Hafler was the first to build an audio amplifier with
Hitachi's new "power MOSFET." This was a major watershed in amplifier
design. Concurrently, new methods of protecting bipolar transistors were
implemented. Both Hafler's and Strickland's MOSFET designs had
interesting qualities that raised the bar for bipolar designers. The
result was an informal competition which led to rapid advances in
amplifier design. This continued up until about 1991. Since 1991, high
end amplification has shown no significant advances, although variations
occur from time to time.


**Pure sophistry. The Hafler did not raise any kind of bar. The Hafler
design was merely interesting, in that it used MOSFETs. MOSFETs were not
then (and certainly are not now) a major advance in any particular area,
for a whole bunch of reasons:
* MOSFETs were very expensive, per peak Amp delivered. They still are, if
you value complementary designs.
* MOSFETs are inherently robust and self-protecting, but that was their
only real advantage.
* To implement similar levels of protection in a 1980 vintage BJT
amplifier would have cost around US$2.00. A cost which was dramatically
eclipsed by the far higher cost of MOSFETs.
* If MOSFETs were so compelling an answer for audio applications, they
would be more ubiquitous. They're not.

High end amplification has not improved significantly since the mid
1970s. What has happened is that costs have fallen and the decent amps
are much less expensive.




Home theater has had a negative impact on amplifier design. Nevertheless,
there is one company, Pioneer, which makes MOSFET receivers of heavy
construction that are notable for reproduction of music.


**Pure sophistry. Whilst MOSFET amplifiers are not bad, per se, BJT
amplifiers can provide higher levels of performance, at lower costs, than
MOSFET amplifiers. Home Cinema has not had a negative effect on amplifier
design. There are some very crappy Home Cinema amplifiers. There are (or
were) some crappy stereo amplifiers.


The key years were 1981-1982. The CD propelled an advance in the state of
the art.


**In amplifier design? Not really. Very good amplifiers existed long
before 16/44 digital became available. What 16/44 digital did do was
force crappy amplifier manufacturers to lift their game, vis-a-vis S/N
ratios.


--
Trevor Wilson
www.rageaudio.com.au