Phil Allison


"Scott Gardner"

I've noticed that solid-state amplifiers tend to have much
higher damping factors than tube amps. Is damping factor measured the
same way for both types of amps (load impedance divided by output
impedance)?



** Yep.

What's a good minimum damping factor that won't introduce
audible artifacts,



** 50 to 100 is a good range - but the whole DF issue depends on what
speakers you are using.

An ideal power amp is one that can be used with any commercial hi-fi
speaker system without audible change in the output due to variations in the
speaker load impedance with frequency.

If an amp has a specified DF of 10 ( into 8 ohms ) then that means the output
impedance is 0.8 ohms. If a speaker system ( 8 ohm nominal) is used that has a
minimum impedance of 2 ohms at some frequency - which is not that uncommon -
then there will be nearly a 3 dB loss of output level at that minimum compared
to an ideal amp. A 3 dB dip in the mid or high band is VERY audible.
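The arithmetic above can be sketched in a few lines - the amp's output
impedance and the speaker's load impedance form a voltage divider, so the
level drop follows directly from the DF. This is just an illustration of
the numbers in the paragraph above; the function name and variables are
my own, not from any standard:

```python
import math

def level_loss_db(df, z_nominal, z_load):
    """Output level drop (dB) relative to an ideal zero-impedance amp,
    for an amp whose damping factor is specified into z_nominal."""
    z_out = z_nominal / df                        # DF = Z_load / Z_out
    # Voltage divider: speaker sees z_load / (z_load + z_out) of the signal
    return -20 * math.log10(z_load / (z_load + z_out))

# DF of 10 into 8 ohms gives 0.8 ohm output impedance.
print(round(level_loss_db(10, 8.0, 2.0), 2))   # at the 2 ohm minimum: ~2.9 dB
print(round(level_loss_db(10, 8.0, 8.0), 2))   # at 8 ohms nominal: ~0.8 dB
```

The dip the listener actually hears is the difference between those two
figures - roughly 2 dB here - but relative to an ideal amp the loss at the
impedance minimum is close to 3 dB, as stated above.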

Speaker designers usually assume that a high quality amp is going to be
used - i.e. a SS amp with a high damping factor - so they do not often design
their creations to suit amps with low DF numbers like 5 or 10.

One major reason why tube amps are claimed to sound "different" is their
low DF figures.



.......... Phil