#1
[snip]

> I notice on ebay that a lot of these "vintage" models aren't even that cheap. Some sell for well over $100.

With models such as Pioneer's SX-1980 fetching $1500+, for example...

Talk to the guys at: http://www.audiokarma.org
They are heavily into "vintage" gear.

Jeff
#2
"Chad Williams" wrote in message om...

> In researching solid state integrated amps/receivers I've come across several proponents of old receivers, circa '75-early 80's, who say that these solid state systems are every bit as good as anything being made now. While I don't disbelieve this statement, I'd like to understand why this is the case. Was the build quality simply better back then? Are the transformers higher quality???
>
> I notice on ebay that a lot of these "vintage" models aren't even that cheap. Some sell for well over $100. For another $100-200 you could have something new, with new technology and a remote. So why would you buy old?

It's not true. If you're into vintage equipment, and you want that sound, or that memory, then, of course, it might be worth it to you. However, in terms of quality of the amplifier, it is definitely false, for several reasons:

1. Designs in the 70's suffered from transient intermodulation distortion. Around 1979, this was discovered and eliminated.

2. Bipolar transistors suffer from "thermal runaway", which occurs when a small area of the junction heats up locally and becomes more active than the rest of the transistor. Once it starts, the transistor is quickly destroyed. The only solution available in the 1970's was brick-wall current limiting. However, amplifiers which use this kind of protection cannot handle the dynamic range of a CD at greater than low volume.

3. The noise figure of bipolar transistors dropped about 10 dB around 1980. Prior to that, equipment had an S/N ratio of around 70 dB. After 1980, S/N ratios of 90 dB and greater became the norm.

4. In 1981, David Hafler was the first to build an audio amplifier with Hitachi's new "power MOSFET." This was a major watershed in amplifier design. Concurrently, new methods of protecting bipolar transistors were implemented. Both Hafler's and Strickland's MOSFET designs had interesting qualities that raised the bar for bipolar designers.

The result was an informal competition which led to rapid advances in amplifier design. This continued up until about 1991. Since 1991, high end amplification has shown no significant advances, although variations occur from time to time. Home theater has had a negative impact on amplifier design. Nevertheless, there is one company, Pioneer, which makes MOSFET receivers of heavy construction that are notable for reproduction of music.

The key years were 1981-1982. The CD propelled an advance in the state of the art.
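The TIM claim above is closely tied to slew-rate limiting: an amplifier free of slew-induced distortion must be able to swing its full output voltage at the top of the audio band. A minimal sketch of that textbook calculation (the 100 W / 8 ohm amplifier is an assumed example for illustration, not a specific model from this thread):

```python
import math

def required_slew_rate(power_w: float, load_ohms: float, f_hz: float) -> float:
    """Minimum slew rate (V/us) to reproduce a full-power sine at f_hz.

    For v(t) = Vpeak * sin(2*pi*f*t), the maximum dv/dt is 2*pi*f*Vpeak.
    """
    v_peak = math.sqrt(2 * power_w * load_ohms)  # peak voltage at rated power
    return 2 * math.pi * f_hz * v_peak / 1e6     # convert V/s -> V/us

# Example: a hypothetical 100 W into 8 ohm amplifier at 20 kHz
print(round(required_slew_rate(100, 8, 20_000), 2))  # 5.03 V/us
```

Many early-70s designs had slew rates of only a few V/us, which is why symptoms attributed to "TIM" appeared mainly at high frequencies and high power.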
#3
"Bob Morein" wrote in message
> "Chad Williams" wrote in message om...
>
>> In researching solid state integrated amps/receivers I've come across several proponents of old receivers, circa '75-early 80's, who say that these solid state systems are every bit as good as anything being made now.

Certainly by the mid-late 1970s and early 1980s really good-sounding SS power amplifiers had started to really proliferate.

>> While I don't disbelieve this statement, I'd like to understand why this is the case. Was the build quality simply better back then? Are the transformers higher quality???

No, the power transformers were arguably worse. There have been some changes in power transformer design that have led to smaller, lighter power transformers with greater power handling capacity, or equal-sized transformers with superior capacity. However, the power transformer is just one more thing that has to be designed and specified, and a wide range of options are available, and have been available and exercised for decades.

>> I notice on ebay that a lot of these "vintage" models aren't even that cheap. Some sell for well over $100. For another $100-200 you could have something new, with new technology and a remote. So why would you buy old?

Sentimentality.

> It's not true. If you're into vintage equipment, and you want that sound, or that memory, then, of course, it might be worth it to you.

Right, the sentimentality issue.

> However, in terms of quality of the amplifier, it is definitely false, for several reasons:
>
> 1. Designs in the 70's suffered from transient intermodulation distortion. Around 1979, this was discovered and eliminated.

TIM is just excess nonlinear distortion at high frequencies. In the early days of SS amps there was a tendency to give power ratings for power amps that were too close to the actual capabilities of the equipment.

Most SS amps have output decoupling networks that improve amplifier stability, but also cause losses that decrease the maximum undistorted power output of the amp at 20 kHz by 0.5 to 1 dB. This decrease in power output is audibly insignificant, but can cause a substantial increase in distortion to be measured if the amp's power rating is too close to the actual maximum capabilities of the amp. Thus the appearance of what seemed to be TIM showed up in many power amplifier tests in the late 60s and 70s.

The date information presented above is grossly inaccurate. A check of the AES database for technical papers shows that Otala's first TIM article was published in 1972, and that by 1979 Otala was publishing some of the last papers in which he vainly attempted to defend the audible significance of his earlier claims. Yes, Otala failed to adequately defend his theories, and only ignorant people place any credence in them.

Note that by 1982 Clark had published his first ABX listening test paper. On the 1982 AES convention exhibits floor Clark demonstrated ABX listening tests of a TIM simulator that was originally designed and promoted by a well-known TIM advocate; I seem to recall it was Walter Jung. These listening tests, and all successive reliable listening tests done since, show conclusively that for TIM to be an audible problem, it has to be a far more severe problem than had ever been observed in real-world solid state amplifiers of even mediocre quality levels. The TIM myth was effectively deconstructed at that time.

> 2. Bipolar transistors suffer from "thermal runaway", which occurs when a small area of the junction heats up locally and becomes more active than the rest of the transistor. Once it starts, the transistor is quickly destroyed.

Again this is false. Thermal runaway is a large-scale effect that involves the entire transistor junction. It has been known and managed about as long as there have been bipolar transistors, or more than 50 years. Permanent damage of transistors due to small-scale localized heating is instead known as "secondary breakdown".

> The only solution available in the 1970's was brick-wall current limiting.

Again this is false. In the 1960s and to this day, SOA protection circuits were widely used, but these circuits monitored both the voltage being dropped across the output devices and the current flowing through them. If one finds audio transistor design manuals from the 1960s as well as modern manuals, SOA protection circuits that are controlled by both voltage and current are described.

> However, amplifiers which use this kind of protection cannot handle the dynamic range of a CD at greater than low volume.

This is also false. SOA limiting parameters are determined by the SOA limits of the amplifier's output devices. Early germanium devices didn't have much SOA capacity and failed often. The early mass-market silicon output devices (the 2N3055 family) had adequate SOA for building power amps in the 20-30 watt power range, but were marginal for building larger power amps unless used in multiples, which is expensive.

SOA circuits tend to be activated by real-world speaker loads, but are less likely to be activated in resistive load testing. Therefore, some power amps with relatively high power ratings, such as the original Crown DC-300, were sold that arguably lacked sufficient SOA for handling loudspeakers that had the deadly combination of low impedance, high reactance, and low efficiency at frequencies where music tends to have a lot of energy.

SOA limiting was often observed while playing LPs, not just CDs as stated above. For example, the "Some Amplifiers Sound Different" article that I co-authored, which appeared in High Fidelity News and Record Review, is basically a story about an expensive (Audio Research) solid state power amp that had just been highly reviewed by TAS, but in fact had SOA issues with certain (Acoustat) speakers. These listening tests were, as I recall, based on playing an LP of the Eagles' "Hotel California" at a pretty high level. Back the volume off a dB to a more modest but still loud level, and the problem went away.

> 3. The noise figure of bipolar transistors dropped about 10 dB around 1980. Prior to that, equipment had an S/N ratio of around 70 dB. After 1980, S/N ratios of 90 dB and greater became the norm.

This is a meaningless statement, because SNR is only meaningful when referenced to an operating level. If one measures the SNR of amps and preamps made in the 1960s, 1970s or even 1990s, the line level circuits tend to have SNRs that are better than tubed equipment and are in the 90+ dB range. If one measures their phono inputs, the SNRs are more like 70 dB and up, which is logical because the operating voltage levels are lower. There have not been any difficulties with well-designed transistor amplifiers being excessively noisy since no later than the late 1950s. So not only is it a meaningless statement, it's just plain wrong.

> 4. In 1981, David Hafler was the first to build an audio amplifier with Hitachi's new "power MOSFET."

Again the date is all wrong. Trivial searching shows that the DH-200 was introduced in 1979. However, the DH-200 was not the first Hitachi-MOSFET power amp, just the first popular-priced kit.

> This was a major watershed in amplifier design.

While the DH-200 and many successive MOSFET amplifiers are fine amplifiers, in fact they don't as a rule sound better than competitive, well-designed bipolar designs. MOSFETs have long been popular, but have never dominated the marketplace for high quality power amplifiers. They have their advantages and their disadvantages...

> Concurrently, new methods of protecting bipolar transistors were implemented.

Previously debunked, and false. What has happened is that the SOA of SOTA power transistors and ICs has undergone steady improvement. Therefore, it's possible to build an effective high powered amplifier with fewer and physically smaller output devices. For example, I have an upgraded Dyna 400 that equals or exceeds the reactive-load handling capabilities of the stock Dyna 416 with half as many, but far more modern, output devices.

> Both Hafler's and Strickland's MOSFET designs had interesting qualities that raised the bar for bipolar designers.

Credit needs to be given to Hitachi's engineers, who laid out many of the circuit designs and parameters for building MODFET power amps and provided them along with the devices. "Name" high end audio engineers generally don't innovate much of anything in the way of power amps; they just tune and repackage circuits that are already widely used and/or suggested by device manufacturers.

> The result was an informal competition which led to rapid advances in amplifier design. This continued up until about 1991. Since 1991, high end amplification has shown no significant advances, although variations occur from time to time.

Actually, many ca. 1990 and later power amps include no circuit refinements that were not well-known in 1980, and possibly 1970 or even 1965. The biggest tangible changes have involved the power capacity/size/cost of the output devices, and how the circuits are packaged (ICs versus discrete designs). The power level at which IC power amps are effective has continued to slowly improve and is now arguably in the range of one or more hundreds of watts. This is up from 5 or 10 watts in the early days of audio IC power amps.

> Home theater has had a negative impact on amplifier design. Nevertheless, there is one company, Pioneer, which makes MOSFET receivers of heavy construction that are notable for reproduction of music.

There are some basics in power amps. Their power supplies tend to be large, expensive and heavy. Power output stages tend to need large heat sinks. There are ways to minimize these costs, but those means are themselves costly. Power amps are now essentially commodity items. Such competition and technical innovation as exists mostly relates to power, size, and weight considerations.

> The key years were 1981-1982. The CD propelled an advance in the state of the art.
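The distinction drawn above between plain current limiting and true SOA (safe operating area) protection can be made concrete: a "brick wall" limiter watches only output current, while a VI limiter also watches the voltage across the output device, clamping harder when the device dissipates more. A minimal sketch, where the device limits are hypothetical round numbers rather than data for any transistor named in the thread:

```python
def brick_wall_ok(i_amps: float, i_max: float = 8.0) -> bool:
    """Current-only ('brick wall') limiter: trips on current alone."""
    return i_amps <= i_max

def soa_ok(v_ce: float, i_amps: float,
           i_max: float = 8.0, p_max: float = 150.0) -> bool:
    """Simplified VI limiter: enforces both a current ceiling and a
    dissipation (V * I) ceiling. Real SOA curves derate further at high
    voltage (second breakdown), omitted here for brevity."""
    return i_amps <= i_max and v_ce * i_amps <= p_max

# A reactive speaker load can demand high current while the device is
# still dropping most of the supply voltage:
print(brick_wall_ok(6.0))   # True  - current alone looks safe
print(soa_ok(40.0, 6.0))    # False - 240 W dissipation exceeds the SOA limit
print(soa_ok(10.0, 6.0))    # True  - same current into a resistive-like load
```

This is why, as noted above, SOA limiting shows up with reactive real-world speakers but often not on the resistive-load test bench.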
#4
"Bob Morein" wrote in message ...

> "Chad Williams" wrote in message om...
>
>> In researching solid state integrated amps/receivers I've come across several proponents of old receivers, circa '75-early 80's, who say that these solid state systems are every bit as good as anything being made now. While I don't disbelieve this statement, I'd like to understand why this is the case. Was the build quality simply better back then? Are the transformers higher quality???
>>
>> I notice on ebay that a lot of these "vintage" models aren't even that cheap. Some sell for well over $100. For another $100-200 you could have something new, with new technology and a remote. So why would you buy old?
>
> It's not true. If you're into vintage equipment, and you want that sound, or that memory, then, of course, it might be worth it to you. However, in terms of quality of the amplifier, it is definitely false, for several reasons:
>
> 1. Designs in the 70's suffered from transient intermodulation distortion. Around 1979, this was discovered and eliminated.

**Not all designs from the 1970s suffered from TIM.

> 2. Bipolar transistors suffer from "thermal runaway", which occurs when a small area of the junction heats up locally and becomes more active than the rest of the transistor. Once it starts, the transistor is quickly destroyed. The only solution available in the 1970's was brick-wall current limiting. However, amplifiers which use this kind of protection cannot handle the dynamic range of a CD at greater than low volume.

**Nope. Some 1970s amplifiers incorporated several mechanisms to prevent such damage, without resorting to severe forms of current limiting (the worst being 'foldback limiting').

> 3. The noise figure of bipolar transistors dropped about 10 dB around 1980. Prior to that, equipment had an S/N ratio of around 70 dB. After 1980, S/N ratios of 90 dB and greater became the norm.

**Really? Here are some Marantz models I am familiar with, their approximate release dates, and their high level S/N figures:

Model 18   - 1968 - 80 dB
Model 3800 - 1972 - 100 dB
Model 500  - 1972 - 106 dB
Model 250M - 1975 - 106 dB

The noise of any amplifier using modern-style silicon transistors can easily exceed 90 dB. Most important is layout and construction, not the actual devices used. This truism applies to pretty much any amplifier manufactured since 1965 or so. What DID become important by the early 1980s was that with the emerging digital media, amplifier noise was going to be an important issue. Manufacturers had to place more emphasis on proper techniques -- the same techniques which Marantz employed with the Model 500 and Model 3800 units.

> 4. In 1981, David Hafler was the first to build an audio amplifier with Hitachi's new "power MOSFET." This was a major watershed in amplifier design. Concurrently, new methods of protecting bipolar transistors were implemented. Both Hafler's and Strickland's MOSFET designs had interesting qualities that raised the bar for bipolar designers. The result was an informal competition which led to rapid advances in amplifier design. This continued up until about 1991. Since 1991, high end amplification has shown no significant advances, although variations occur from time to time.

**Pure sophistry. The Hafler did not raise any kind of bar. The Hafler design was merely interesting, in that it used MOSFETs. MOSFETs were not then (and certainly are not now) a major advance in any particular area, for a whole bunch of reasons:

* MOSFETs were very expensive, per peak Amp delivered. They still are, if you value complementary designs.
* MOSFETs are inherently robust and self-protecting.
* To implement similar levels of protection in a 1980 vintage BJT amplifier would have cost around US$2.00 -- a cost which was dramatically eclipsed by the far higher cost of MOSFETs.
* If MOSFETs were so compelling an answer for audio applications, they would be more ubiquitous. They're not.

High end amplification has not improved significantly since the mid 1970s. What has happened is that costs have fallen and decent amps are much less expensive.

> Home theater has had a negative impact on amplifier design. Nevertheless, there is one company, Pioneer, which makes MOSFET receivers of heavy construction that are notable for reproduction of music.

**Pure sophistry. Whilst MOSFET amplifiers are not bad, per se, BJT amplifiers can provide higher levels of performance, at lower cost, than MOSFET amplifiers. Home Cinema has not had a negative effect on amplifier design. There are some very crappy Home Cinema amplifiers. There are (or were) some crappy stereo amplifiers.

> The key years were 1981-1982. The CD propelled an advance in the state of the art.

**In amplifier design? Not really. Very good amplifiers were available long before 16/44 digital became available. What 16/44 digital did do was to force crappy amplifier manufacturers to lift their game, vis-a-vis S/N ratios.

--
Trevor Wilson
www.rageaudio.com.au
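The point that an S/N figure only means something relative to a reference level can be checked with simple arithmetic. A sketch, taking a 106 dB spec against an assumed 250 W / 8 ohm rated output (the power rating is my assumption for illustration, not a figure quoted in the thread):

```python
import math

def noise_floor_volts(rated_power_w: float, load_ohms: float,
                      snr_db: float) -> float:
    """Noise voltage implied by an SNR quoted relative to rated output."""
    v_rated = math.sqrt(rated_power_w * load_ohms)  # RMS volts at rated power
    return v_rated * 10 ** (-snr_db / 20)

# 106 dB below an assumed 250 W / 8 ohm rated output:
v_n = noise_floor_volts(250, 8, 106)
print(f"{v_n * 1e6:.0f} uV")  # 224 uV at the speaker terminals
```

The same amplifier quoted against a 1 W reference would show an S/N about 24 dB lower (10*log10(250) is roughly 24 dB), which is why comparing raw S/N numbers without their reference levels tells you very little.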
#5
"Arny Krueger" wrote in message ...

> "Bob Morein" wrote in message
>
>> "Chad Williams" wrote in message om...
>>
>>> In researching solid state integrated amps/receivers I've come across several proponents of old receivers, circa '75-early 80's, who say that these solid state systems are every bit as good as anything being made now.
>
> Certainly by the mid-late 1970s and early 1980s really good-sounding SS power amplifiers had started to really proliferate.

To your ears, perhaps. I put it at the early eighties. While the Phase Linear amplifiers had good sound, they couldn't manage thermal runaway.

>>> While I don't disbelieve this statement, I'd like to understand why this is the case. Was the build quality simply better back then? Are the transformers higher quality???

[snip]

>> However, in terms of quality of the amplifier, it is definitely false, for several reasons:
>>
>> 1. Designs in the 70's suffered from transient intermodulation distortion. Around 1979, this was discovered and eliminated.
>
> TIM is just excess nonlinear distortion at high frequencies.

No, it's not. The definition includes the word transient. It is distinguished from steady-state intermodulation distortion in that it is measured when the amplifier is not in steady state. Read here:
http://www.zero-distortion.com/techn...cations_04.htm

> In the early days of SS amps there was a tendency to give power ratings for power amps that were too close to the actual capabilities of the equipment. Most SS amps have output decoupling networks that improve amplifier stability, but also cause losses that decrease the maximum undistorted power output of the amp at 20 kHz by 0.5 to 1 dB.

This is obfuscation. The cause of TIM is excess loop gain at high frequencies, and there is a specific remedy.

[snip]

> The TIM myth was effectively deconstructed at that time.

I disagree.

>> 2. Bipolar transistors suffer from "thermal runaway", which occurs when a small area of the junction heats up locally and becomes more active than the rest of the transistor. Once it starts, the transistor is quickly destroyed.
>
> Again this is false. Thermal runaway is a large-scale effect that involves the entire transistor junction. It has been known and managed about as long as there have been bipolar transistors, or more than 50 years. Permanent damage of transistors due to small-scale localized heating is instead known as "Secondary Breakdown".

IMPORTANT***IMPORTANT***IMPORTANT:

Arny, in a BJT, THERMAL RUNAWAY AND SECONDARY BREAKDOWN ARE IDENTICAL. Secondary breakdown is merely the physical phenomenon which causes loss of current control. In this case, you are confusing the same phenomenon viewed differently. "Secondary breakdown" is the term used by a device physicist. "Thermal runaway" is the term used from the systems & control point of view.

>> The only solution available in the 1970's was brick-wall current limiting.
>
> Again this is false. In the 1960s and to this day, SOA protection circuits were widely used, but these circuits monitored both the voltage being dropped across the output devices and the current flowing through them. If one finds audio transistor design manuals from the 1960s as well as modern manuals, SOA protection circuits that are controlled by both voltage and current are described.

In my experience, I have not found an amplifier made in the 60's or 70's which did anything other than brick-wall limiting. I'd be interested in some examples.

>> However, amplifiers which use this kind of protection cannot handle the dynamic range of a CD at greater than low volume.
>
> This is also false. SOA limiting parameters are determined by the SOA limits of the amplifier's output devices.

Based upon my experience with Marantz and Heathkit, I disagree. But as I've said, I'd be interested in some counterexamples. Crown, perhaps?

> SOA circuits tend to be activated by real-world speaker loads, but are less likely to be activated in resistive load testing. Therefore, some power amps with relatively high power ratings such as the original Crown DC-300 were sold that arguably lacked sufficient SOA for handling loudspeakers that had the deadly combination of low impedance, high reactance, and low efficiency at frequencies where music tends to have a lot of energy. SOA limiting was often observed while playing LPs, not just CDs as stated above. For example the "Some Amplifiers Sound Different" article that I co-authored, which appeared in High Fidelity News and Record Review, is basically a story about an expensive (Audio Research) solid state power amp that had just been highly reviewed by TAS, but in fact had SOA issues with certain (Acoustat) speakers. These listening tests were, as I recall, based on playing an LP of the Eagles' "Hotel California" at a pretty high level. Back the volume off a dB to a more modest but still loud level, and the problem went away.
>
>> 3. The noise figure of bipolar transistors dropped about 10 dB around 1980. Prior to that, equipment had an S/N ratio of around 70 dB. After 1980, S/N ratios of 90 dB and greater became the norm.
>
> This is a meaningless statement because SNR is only meaningful when referenced to an operating level. If one measures the SNR of amps and preamps made in the 1960s, 1970s or even 1990s the line level circuits tend to have SNRs that are better than tubed equipment and are in the 90+ dB range.

Apparently, you can't hear hiss.

> If one measures their phono inputs the SNRs are more like 70 dB and up, which is logical because the operating voltage levels are lower. There have not been any difficulties with well-designed transistor amplifiers being excessively noisy since no later than the late 1950s. So not only is it a meaningless statement, it's just plain wrong.
>
>> 4. In 1981, David Hafler was the first to build an audio amplifier with Hitachi's new "power MOSFET."
>
> Again the date is all wrong. Trivial searching shows that the DH-200 was introduced in 1979.

I'll give you the date.

> However the DH-200 was not the first Hitachi-MOSFET power amp, just the first popular-priced kit.
>
>> This was a major watershed in amplifier design.
>
> While the DH-200 and many successive MOSFET amplifiers are fine amplifiers, in fact they don't as a rule sound better than competitive, well-designed bipolar designs. MOSFETs have long been popular, but have never dominated the marketplace for high quality power amplifiers. They have their advantages and their disadvantages...
>
>> Concurrently, new methods of protecting bipolar transistors were implemented.
>
> Previously debunked, and false.

No, true from personal experience. I have had several amplifiers of that era that clip hard as a result of the current limiting.

> What has happened is that the SOA of SOTA power transistors and ICs has undergone steady improvement.

True with respect to discrete devices. With respect to ICs, there has been substantial work in more sophisticated safe area protection, which I reference below.

> Therefore, it's possible to build an effective high powered amplifier with fewer and physically smaller output devices. For example I have an upgraded Dyna 400 that equals or exceeds the reactive-load handling capabilities of the stock Dyna 416 with half as many, but far more modern output devices.

That's a terrible amplifier. Your fundamental problem is a lack of hearing acuity. Arny, you're SO full of ****.

Here's a reference to a National data sheet: National Semiconductor's bipolar-output parts (Table 1) incorporate a dynamic SOA-protection mechanism, called SPiKe, which stands for self-peak instantaneous Kelvin temperature-protection circuitry. National claims the circuitry makes the ICs nearly impervious to damage from instantaneous temperature peaks and overvoltage and overcurrent conditions.

You can read the above at the EDN website:
http://www.e-insite.net/ednmag/archi...1795/17df1.htm
and the datasheet at
http://www.national.com/an/AN/AN-898.pdf#page=9
Also of interest is
http://www.national.com/an/AN/AN-261.pdf#page=2
which dates an attack on this problem to 1981.

The Philips Power Division has a relevant document:
http://www.semiconductors.philips.co...es/APPCHP7.pdf

"When a power transistor is subjected to a pulsed load, higher peak power dissipation is permitted. The materials in a power transistor have a definite thermal capacity, and thus the critical junction temperature will not be reached instantaneously, even when excessive power is being dissipated in the device. The power dissipation limit may be extended for intermittent operation. The size of the extension will depend on the duration of the operation period (that is, pulse duration) and the frequency with which operation occurs (that is, duty factor)."

and

"Conclusion: A method has been presented to allow the calculation of average and peak junction temperatures for a variety of pulse types. Several worked examples have shown calculations for various common waveforms. The method for non rectangular pulses can be applied to any wave shape, allowing temperature calculations for waveforms such as exponential and sinusoidal power pulses. For pulses such as these, care must be taken to ensure that the calculation gives the peak junction temperature, as it may not occur at the end of the pulse. In this instance several calculations must be performed with different"

IBM has a document dated 1977 that refines safe area calculation:
http://domino.watson.ibm.com/tchjr/j...c?OpenDocument

>> Both Hafler's and Strickland's MOSFET designs had interesting qualities that raised the bar for bipolar designers.
>
> Credit needs to be given to Hitachi's engineers who laid out many of the circuit designs and parameters for building MODFET power amps and provided them along with the devices. "Name" high end audio engineers generally don't innovate much of anything in the way of power amps, they just tune and repackage circuits that are already widely used and/or suggested by device manufacturers.

This is bull****. And what the hell is a MODFET?
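The Philips passage quoted above describes how the dissipation limit can be extended for pulsed loads. For a rectangular pulse train, a common first-order approximation of the effective transient thermal impedance is Zth_eff = D*Rth + (1 - D)*Zth_single, where D is the duty factor, Rth the steady-state thermal resistance, and Zth_single the single-pulse transient impedance at the pulse width in question. A sketch of that calculation; all numeric values are made-up round numbers for illustration, not data for any device mentioned here:

```python
def peak_junction_temp(t_amb_c: float, p_pulse_w: float, duty: float,
                       r_th_cw: float, z_th_single_cw: float) -> float:
    """Peak junction temperature (C) for a rectangular power pulse train.

    Uses the first-order superposition approximation:
        Zth_eff = D * Rth + (1 - D) * Zth_single(tp)
    where Rth is the steady-state thermal resistance (C/W) and
    Zth_single is the single-pulse transient impedance at pulse width tp.
    """
    z_eff = duty * r_th_cw + (1 - duty) * z_th_single_cw
    return t_amb_c + p_pulse_w * z_eff

# Hypothetical device: Rth = 2.0 C/W, Zth(1 ms single pulse) = 0.2 C/W.
# 200 W pulses at 10% duty stay cool; the same 200 W applied as DC do not.
print(round(peak_junction_temp(25, 200, 0.10, 2.0, 0.2), 1))  # 101.0
print(round(peak_junction_temp(25, 200, 1.00, 2.0, 0.2), 1))  # 425.0
```

This is the arithmetic behind dynamic (SPiKe-style) protection: a brief musical peak can legitimately exceed the DC dissipation limit because the junction's thermal capacity keeps its temperature below the critical value.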
#6
![]() "Bob Morein" wrote in message ... "Arny Krueger" wrote in message ... "Bob Morein" wrote in message "Chad Williams" wrote in message om... In researching solid state integrated amps/receivers I've come across several proponents of old receivers, circa '75-early 80's, who say that these solid state systems are every bit as good as anything being made now. Certainly by the mid-late 1970s and early 1980s really good-sounding SS power amplifiers had started to really proliferate. To your ears, perhaps. I put it early eighties. While the Phase Linear amplifiers had good sound, they couldn't manage thermal runaway. While I don't disbelive this statement, I'd like to understand why this is the case. Was the build quality simply better back then? Are the transformers higher quality??? [snip] However, in terms of quality of the amplifier, it is definitely false, for several reasons: 1. Designs in the 70's suffered from transient intermodulation distortion. Around 1979, this was discovered and eliminated. TIM is just excess nonlinear distortion at high frequencies. No, it's not. The definition includes the word transient. So what? If the problem shows up only with instantaneous, one-time transients, then it isn't a problem in audio because there are no truely instantaneous transients in audio signals (they are always band-limited) and if they are one-time events they aren't heard because there is no mechanism in the ear for perceiving extremely short-term one-time events. Make a pulse narrow enough (i.e, instanteous) and it isn't audible. It is distinguished from steady-state TIM in that it is measured when the amplifier is not in steady state. Read he http://www.zero-distortion.com/techn...cations_04.htm They get to be wrong. Just because its on a web page doesn't mean it has to be right. In the early days of SS amps there was a tendency to give power ratings for power amps that were too close to the actual capabilities of the equipment. 
Most SS amps have output decoupling networks that improve amplifier stability, but also cause losses that decrease the maximum undistorted power output of the amp at 20 KHz by 0.5 to 1 dB. This is obfuscation. No, it was a cause of misundertanding. The cause of TIM is excess loop gain at high frequencies, and there is a specific remedy. Since we can't agree on what the problem is, it is very impossible to discuss what caused it or what cures it. However, the cause of TIM was stated by Otala in Circuit Design Modifications for Minimizing Transient Intermodulation Distortion in Audio Amplifiers JAES: Volume 20 Number 5 pp. 396-399; June 1972 as not being excess over-all loop gain, but excess gain in certain parts of the loop, and/or inadequte gain in other parts of the loop. [snip] The TIM myth was effectively deconstructed at that time. I disagree. You get to be wrong, which I have documented with authoritative independent references many time, such as the example just above. 2. Bipolar transistors suffer from "thermal runaway", which occurs when a small area of the junction heats up locally and becomes more active than the rest of the transistor. Once it starts, the transistor is quickly destroyed. Again this is false. Thermal runaway is a large-scale effect that involves the entire transistor junction. It has been known and managed about as long as there have been bipolar transistors, or more than 50 years. Permanent damage of transistors due to small-scale localized heating is instead known as "Secondary Breakdown". IMPORTANT***IMPORTANT***IMPORTANT: Arny, in a BJT, THERMAL RUNAWAY AND SECONDARY BREAKDOWN ARE IDENTICAL. No they aren't. They have well-known and distinct definitions. I clairy this below using one of your referneces. I suggest that you search google and inform yourself properly, or just read your own references. Secondary breakdown is merely the physical phenomena which causes loss of current control. 
It's a specific failure mode of output transistors that happens quite rapidly due to excessive short term power dissipation. It is generally independent of transistor biasing although class A bias can lead to transistors becomeing prematurely overstressed. In contrast, classic thermal runaway is basically a failure of biasing circuits to maintian normal quiscent, long-term current levels. In this case, you are confusing the same phenomena viewed differently. "Secondary breakdown" is the term used by a device physicist. They are two different things, which both equipment designers and device designers call by the same names. "Thermal runaway" is the term used from the systems & control point of view. Nope, its defined differently as I already explained. The only solution available in the 1970's was brick-wall current limiting. Again this is false. In the 1960s and to this day, SOA protection circuits were widely used, but these circuits monitored both the voltage being dropped across the output devices and the current flowing through them. If one finds audio transistor design manuals from the 1960s as well as modern manuals, SOA protection circuits that are controlled by both voltage and current are described. In my experience, I have not found an amplifier made in the 60's or 70's which did anything other than brick-wall limiting. This sentence is irrelevant to the discussion since it does not contain the phrase "currrent limiting" which was present in the original claim and my response to it. I'd be interested in some examples. First, its time for you to start talking about the same thing you started out talking about, i.e, "current limiting" and that I responded to. However, amplifiers which use this kind of protection cannot handle the dynamic range of a CD at greater than low volume. This is also false. SOA limiting parameters are determined by the SOA limits of the amplifier's output devices. Based upon my experience with Marantz and Heathkit, I disagree. 
SOA is not just an issue relating to a couple of equipment manufacturers. However, Heathkit SS amplifier schematics are online, and all but their initial non-performing germanium-transistor instant flame-out junk used SOA limiting circuits that monitor both current and voltage. I built and owned the AA-22 (germanium junk) as well as the properly-designed silicon-based AR-15, AR-1500, and AA-1640 Heathkit amps. I also own three different Dynakits, one of which has a reputation for extreme fragility and lacks standard SOA protection. I own numerous other SS amps as well.

But as I've said, I'd be interested in some counterexamples. Crown, perhaps?

Crown amps from the DC-300 on incorporated some kind of SOA protection. In later Crown designs the output stages can be very complex and analyzing the function can be very difficult, but in the DC-300 the SOA circuits are pretty straightforward.

SOA circuits tend to be activated by real-world speaker loads, but are less likely to be activated in resistive load testing. Therefore, some power amps with relatively high power ratings, such as the original Crown DC-300, were sold that arguably lacked sufficient SOA for handling loudspeakers that had the deadly combination of low impedance, high reactance, and low efficiency at frequencies where music tends to have a lot of energy.

SOA limiting was often observed while playing LPs, not just CDs as stated above. For example, the "Some Amplifiers Sound Different" article that I co-authored, which appeared in High Fidelity News and Record Review, is basically a story about an expensive (Audio Research) solid state power amp that had just been highly reviewed by TAS, but in fact had SOA issues with certain (Acoustat) speakers. These listening tests were, as I recall, based on playing an LP of the Eagles' "Hotel California" at a pretty high level. Back the volume off a dB to a more modest but still loud level, and the problem went away.

> 3.
> The noise figure of bipolar transistors dropped about 10 dB around 1980. Prior to that, equipment had an S/N ratio of around 70 dB. After 1980, S/N ratios of 90 dB and greater became the norm.

This is a meaningless statement, because SNR is only meaningful when referenced to an operating level. If one measures the SNR of amps and preamps made in the 1960s, 1970s or even 1990s, the line-level circuits tend to have SNRs that are better than tubed equipment and are in the 90+ dB range.

Apparently, you can't hear hiss.

Gratuitous and irrelevant personal attack noted. SNR is a measured parameter. Whether or not you have audible hiss depends on many things, not just SNR in the preamp. If one measures their phono inputs, the SNRs are more like 70 dB and up, which is logical because the operating voltage levels are lower. There have not been any difficulties with well-designed transistor amplifiers being excessively noisy since no later than the late 1950s. So not only is it a meaningless statement, it's just plain wrong.

> 4. In 1981, David Hafler was the first to build an audio amplifier with Hitachi's new "power MOSFET."

Again, the date is all wrong. Trivial searching shows that the DH-200 was introduced in 1979.

I'll give you the date.

I gave you the date, which is backed by independent references I didn't bother to cite. However, the DH-200 was not the first Hitachi-MOSFET power amp, just the first popular-priced kit.

> This was a major watershed in amplifier design.

While the DH-200 and many successive MOSFET amplifiers are fine amplifiers, in fact they don't as a rule sound better than competitive, well-designed bipolar designs. MOSFETs have long been popular, but have never dominated the marketplace for high quality power amplifiers. They have their advantages and their disadvantages...

> Concurrently, new methods of protecting bipolar transistors were implemented.

Previously debunked, and false.

No, true from personal experience.

You get to be wrong, again.
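The claim that SNR is only meaningful relative to an operating level is easy to show numerically. A minimal sketch, with hypothetical but typical voltages (0.775 V RMS line level, a few millivolts from a moving-magnet phono cartridge) and the same assumed input-referred noise floor in both cases:

```python
import math

def snr_db(signal_vrms, noise_vrms):
    # SNR expressed in dB: the reference (operating-level) signal
    # relative to the noise floor.
    return 20 * math.log10(signal_vrms / noise_vrms)

noise = 10e-6          # 10 uV RMS input-referred noise (hypothetical)
line_level = 0.775     # volts RMS, the 0 dBu reference level
phono_level = 0.005    # volts RMS, typical moving-magnet cartridge output

print(round(snr_db(line_level, noise)))   # 98: "90+ dB" at line level
print(round(snr_db(phono_level, noise)))  # 54: far lower at phono level
```

The same circuit noise yields a 90+ dB figure referenced to line level and a much lower one referenced to phono level, which is why quoting an SNR without its reference level says very little.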
I have had several amplifiers of that era that clip hard as a result of the current limiting.

So what? I explain why more modern amps are less likely to be driven into SOA limiting. BTW, again, the claim that pure current limiting is the fault is false. SOA limits are based on combinations of voltage and current, not just current alone. It is possible to approximate SOA limits with pure current limiting, but this results in poor exploitation of device capabilities. Nevertheless, it is used in certain highly-overbuilt (from the standpoint of SOA) power amps such as my two QSC units. What has happened is that the SOA of SOTA power transistors and ICs has undergone steady improvement.

True with respect to discrete devices.

True with respect to all devices, discrete or IC.

With respect to ICs, there has been substantial work in more sophisticated safe-area protection, which I reference below.

This does not prove that IC SOA hasn't been improved. Certain ICs may have more sophisticated SOA protection that better exploits device properties, but the solid cure to SOA issues is simple and straightforward: output devices with more SOA. Therefore, it's possible to build an effective high-powered amplifier with fewer and physically smaller output devices. For example, I have an upgraded Dyna 400 that equals or exceeds the reactive-load handling capabilities of the stock Dyna 416 with half as many, but far more modern, output devices.

That's a terrible amplifier. Your fundamental problem is a lack of hearing acuity.

Gratuitous and irrelevant personal attack noted. Use an amp within its limits and there is no audible distortion.

Arny, you're SO full of ****.

Gratuitous and irrelevant personal attack noted.

Here's a reference to a National data sheet:

National Semiconductor's bipolar-output parts (Table 1) incorporate a dynamic SOA-protection mechanism, called SPiKe, which stands for self-peak instantaneous Kelvin temperature-protection circuitry.
National claims the circuitry makes the ICs nearly impervious to damage from instantaneous temperature peaks and overvoltage and overcurrent conditions.

Largely irrelevant, because protecting the device better does not increase its basic capability. If you want to drive a certain speaker load with an undistorted signal, the only option is to deliver a certain set of currents and voltages. Improved protection circuits don't increase basic device capabilities; they only make incremental improvements in how the existing capabilities are exploited. The major story in SS power amp performance over the past 40 years has been basic improvements in output stage SOA capabilities. Better SOA circuits provide detail-level improvements.

You can read the above at the EDN website: http://www.e-insite.net/ednmag/archi...1795/17df1.htm

Nothing there that is inconsistent with what I said. Indeed, it does a nice job of distinguishing between thermal runaway and secondary breakdown. The paragraph headed "Beware of Bipolar Burnout" covers thermal runaway and its relationship to VBE variations with temperature. Two paragraphs later comes a discussion of SOA and second breakdown.

The referenced document (AN-039) can be found at http://www-k.ext.ti.com/SRVS/CGI-BIN...00000000346801 7,K=7800,Sxi=3,Case=obj(17402) and the datasheet at http://www.national.com/an/AN/AN-898.pdf#page=9

Also of interest is http://www.national.com/an/AN/AN-261.pdf#page=2, which dates an attack on this problem to 1981.

SOA protection circuits appeared in RCA transistor manuals during the early-mid 1960s.

The Philips Power Division has a relevant document: http://www.semiconductors.philips.co...es/APPCHP7.pdf

"When a power transistor is subjected to a pulsed load, higher peak power dissipation is permitted. The materials in a power transistor have a definite thermal capacity, and thus the critical junction temperature will not be reached instantaneously, even when excessive power is being dissipated in the device.
The power dissipation limit may be extended for intermittent operation. The size of the extension will depend on the duration of the operation period (that is, pulse duration) and the frequency with which operation occurs (that is, duty factor)."

Classic; completely supports my position.

and

"Conclusion: A method has been presented to allow the calculation of average and peak junction temperatures for a variety of pulse types. Several worked examples have shown calculations for various common waveforms. The method for non-rectangular pulses can be applied to any wave shape, allowing temperature calculations for waveforms such as exponential and sinusoidal power pulses. For pulses such as these, care must be taken to ensure that the calculation gives the peak junction temperature, as it may not occur at the end of the pulse. In this instance several calculations must be performed with different"

Classic; completely supports my position.

IBM has a document dated 1977 that refines safe-area calculation: http://domino.watson.ibm.com/tchjr/j...8fcca51d85256b fa0067f80c?OpenDocument

Per the reference cited above, this was old news at the time.

> Both Hafler's and Strickland's MOSFET designs had interesting qualities that raised the bar for bipolar designers.

Credit needs to be given to Hitachi's engineers, who laid out many of the circuit designs and parameters for building MODFET power amps and provided them along with the devices. "Name" high-end audio engineers generally don't innovate much of anything in the way of power amps; they just tune and repackage circuits that are already widely used and/or suggested by device manufacturers.

This is bull****.

Sorry that the truth hurts so bad.

And what the hell is a MODFET?

MOSFET + a typo. DoooHHH!
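The Philips passage quoted above, on extending the dissipation limit for pulsed operation, corresponds to a standard transient thermal impedance calculation. Here is a minimal sketch using a single-time-constant thermal model and a common conservative approximation for repetitive rectangular pulses; the device numbers (thermal resistance, time constant) are hypothetical:

```python
import math

def zth_single(t_s, r_th, tau_s):
    # Transient thermal impedance for one pulse in a single-time-constant
    # model: short pulses see far less than the steady-state Rth because
    # the junction's thermal capacity has not yet charged up.
    return r_th * (1 - math.exp(-t_s / tau_s))

def tj_peak(p_w, t_amb, r_th, tau_s, pulse_s, duty):
    # Conservative approximation for repetitive rectangular pulses:
    # weight the steady-state Rth by the duty factor and the
    # single-pulse Zth by the remainder.
    z_eff = duty * r_th + (1 - duty) * zth_single(pulse_s, r_th, tau_s)
    return t_amb + p_w * z_eff

# Hypothetical device: Rth = 1.5 K/W, thermal time constant 50 ms.
print(round(tj_peak(100, 25, 1.5, 0.05, 1e-3, 1.0)))  # 175: 100 W continuous
print(round(tj_peak(100, 25, 1.5, 0.05, 1e-3, 0.1)))  # 43: 1 ms pulses, 10% duty
```

This is why a pulsed load permits higher peak dissipation than the DC rating, exactly as the quoted application note describes: the shorter the pulse and the lower the duty factor, the smaller the peak junction temperature rise.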
#7
Nice post Trevor. FWIW, I agree with *every* point raised therein.
"Trevor Wilson" wrote in message ...

> "Bob Morein" wrote in message ...
>
>> "Chad Williams" wrote in message om...
>>
>>> In researching solid state integrated amps/receivers I've come across several proponents of old receivers, circa '75-early 80's, who say that these solid state systems are every bit as good as anything being made now. While I don't disbelieve this statement, I'd like to understand why this is the case. Was the build quality simply better back then? Are the transformers higher quality??? I notice on ebay that a lot of these "vintage" models aren't even that cheap. Some sell for well over $100. For another $100-200 you could have something new, with new technology and a remote. So why would you buy old?
>>
>> It's not true. If you're into vintage equipment, and you want that sound, or that memory, then, of course, it might be worth it to you. However, in terms of quality of the amplifier, it is definitely false, for several reasons:
>>
>> 1. Designs in the 70's suffered from transient intermodulation distortion. Around 1979, this was discovered and eliminated.
>
> **Not all designs from the 1970s suffered from TIM.
>
>> 2. Bipolar transistors suffer from "thermal runaway", which occurs when a small area of the junction heats up locally and becomes more active than the rest of the transistor. Once it starts, the transistor is quickly destroyed. The only solution available in the 1970's was brick-wall current limiting. However, amplifiers which use this kind of protection cannot handle the dynamic range of a CD at greater than low volume.
>
> **Nope. Some 1970s amplifiers incorporated several mechanisms to prevent such damage, without resorting to severe forms of current limiting (the worst being 'foldback limiting').
>
>> 3. The noise figure of bipolar transistors dropped about 10 dB around 1980. Prior to that, equipment had an S/N ratio of around 70 dB. After 1980, S/N ratios of 90 dB and greater became the norm.
>
> **Really?
> Here are some Marantz models I am familiar with, their approximate release dates, and their high-level S/N figures:
>
> Model 18 - 1968 - 80 dB
> Model 3800 - 1972 - 100 dB
> Model 500 - 1972 - 106 dB
> Model 250M - 1975 - 106 dB
>
> The S/N of any amplifier using modern-style silicon transistors can easily exceed 90 dB. Most important is layout and construction, not the actual devices used. This truism applies to pretty much any amplifier manufactured since 1965 or so. What DID become important by the early 1980s was that, with the emerging digital media, amplifier noise was going to be an important issue. Manufacturers had to place more emphasis on proper techniques. The same techniques which Marantz employed with the Model 500 and Model 3800 units.
>
>> 4. In 1981, David Hafler was the first to build an audio amplifier with Hitachi's new "power MOSFET." This was a major watershed in amplifier design. Concurrently, new methods of protecting bipolar transistors were implemented. Both Hafler's and Strickland's MOSFET designs had interesting qualities that raised the bar for bipolar designers. The result was an informal competition which led to rapid advances in amplifier design. This continued up until about 1991. Since 1991, high end amplification has shown no significant advances, although variations occur from time to time.
>
> **Pure sophistry. The Hafler did not raise any kind of bar. The Hafler design was merely interesting, in that it used MOSFETs. MOSFETs were not then (and certainly are not now) a major advance in any particular area, for a whole bunch of reasons:
>
> * MOSFETs were very expensive, per peak Amp delivered. They still are, if you value complementary designs.
> * MOSFETs are inherently robust and self-protecting.
> * To implement similar levels of protection in a 1980-vintage BJT amplifier would have cost around US$2.00. A cost which was dramatically eclipsed by the far higher cost of MOSFETs.
> * If MOSFETs were so compelling an answer for audio applications, they would be more ubiquitous. They're not.
>
>> High end amplification has not improved significantly since the mid 1970s. What has happened is that costs have fallen and the decent amps are much less expensive. Home theater has had a negative impact on amplifier design. Nevertheless, there is one company, Pioneer, which makes MOSFET receivers of heavy construction that are notable for reproduction of music.
>
> **Pure sophistry. Whilst MOSFET amplifiers are not bad, per se, BJT amplifiers can provide higher levels of performance, at lower costs, than MOSFET amplifiers. Home Cinema has not had a negative effect on amplifier design. There are some very crappy Home Cinema amplifiers. There are (or were) some crappy stereo amplifiers.
>
>> The key years were 1981-1982. The CD propelled an advance in the state of the art.
>
> **In amplifier design? Not really. Very good amplifiers were available long before 16/44 digital became available. What 16/44 digital did do was to force crappy amplifier manufacturers to lift their game, vis-a-vis S/N ratios.
>
> --
> Trevor Wilson
> www.rageaudio.com.au