amplifier input sensitivity (rec.audio.tubes)

#1  Alexander Dyszewski

Hello,

What is the point in making the input sensitivity of a modern amplifier
as low as 250 mV?

Let's assume the amplifier needs 0.775 Vrms (= 0 dBu) for full output,
and that most digital sources produce about 2 Vrms, leaving an 8 dB
margin for quieter recordings. I found a Marantz FM tuner with 1 Vrms
output, and I think other tuners will be similar. That gives only about
a 2 dB margin, but since most FM broadcasts are modulated noise anyway,
there should be no problem in achieving full volume. High-quality
reel-to-reel tape decks will have balanced XLR connectors giving
+4 dBu, which equals 1.23 Vrms. So again, no reason for low input
sensitivity. RIAA stages might be different, but I think that a
high-quality preamplifier should be able to deliver 0 dBu even with
strangely mastered records.
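The margins above are just 20*log10(Vsource/Vsensitivity). A quick
Python sketch to check them (the function and the 0.775 V default are
only illustrative of my assumption):

import math

def margin_db(source_vrms, sensitivity_vrms=0.775):
    # dB by which a source's full output exceeds the amp's sensitivity
    return 20 * math.log10(source_vrms / sensitivity_vrms)

print(margin_db(2.0))    # ~ +8.2 dB for a 2 Vrms digital source
print(margin_db(1.0))    # ~ +2.2 dB for the 1 Vrms tuner
print(margin_db(1.228))  # ~ +4.0 dB for a +4 dBu tape deck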

Can anybody here think of any (high-quality) source that will actually
need such high sensitivity?

Sincerely,
Alexander Dyszewski
#2  Phil Allison


"Alexander Dyszewski"

where is the point in making the input sensitivity of a modern amplifier
as low as 250mv ?


** Plenty of audio signal sources deliver only that much voltage.

( snip **** )


> I found a Marantz FM tuner with 1 Vrms output


** So ****ing what ?

You are cherry picking examples to suit yourself.


> and I think other tuners will be similar.



** Fraid YOU do not think anything at all.


> High-quality reel-to-reel tape decks


** Huh ??

Utterly obsolete in a domestic sound system.


> Can anybody here think of any (high-quality) source that will actually
> need such high sensitivity?



** Nearly all AM/FM tuners, nearly all cassette decks, most TV sets
with AV outputs, and nearly all phono stages.

**** Off.

..... Phil
#3  Ian Iveson

Alexander Dyszewski wrote:

> What is the point in making the input sensitivity of a modern
> amplifier as low as 250 mV?
> [snip]
>
> Can anybody here think of any (high-quality) source that will actually
> need such high sensitivity?


If you're building for yourself, make whatever fits best in
your system.

Commercial amps are different. Partly for the obvious
reason, and partly because the average consumer expects full
power to be about 1 o'clock on the volume control. So I read
somewhere, AFAIR...possibly Menno van der Veen or Morgan
Jones.

Ian


#4  Alexander Dyszewski

flipper wrote:

>> What is the point in making the input sensitivity of a modern
>> amplifier as low as 250 mV?
>
> I don't know that I'd pick 250 mV, but the problem is things are not as
> precise as you're making out.
>
>> Let's assume the amplifier needs 0.775 Vrms (= 0 dBu) for full output
>> and most digital sources tend to produce about 2 Vrms
>
> I think you're already in trouble making that assumption, as 1 Vrms,
> which you mention below, is also rather common.
>
> Is the goal to work with something specific you already have, or to
> cover the range of what's available? If the latter, then you don't have
> the luxury of 'most sources' even if you're right about what's common.

I want to cover a reasonable range of possible sources, like "normal"
Japanese hi-fi components.

>> leaving an 8 dB margin for quieter recordings.
>
> Here you have a *wide* range, depending on what you listen to.
>
> First, the '2 Vrms' (or 1 Vrms) is digital-speak for full-scale output,
> not nominal. There is nothing, not so much as one picovolt, more than
> FS, and that means someone trying to do 'high quality' is not going to
> approach full scale, because of clipping. If I remember correctly,
> peaks (true-peak meter) 6 dB under FS is the recommended 'safe'
> setting, and on a 1 V full-scale device that's already down to 500 mV
> peaks.
>
> 'Modern' rock will compress the bejesus out of things to compete in the
> volume wars, so nominal will not be low. Classical, however, to keep
> dynamic range, will have nominal depressed along with staying away from
> full scale. Early rock recordings, from back in the days when they
> thought CD was supposed to be 'high quality', will be closer to the
> 'classical' level because they aren't compressed flat as a screaming
> cow pattie up against FS.

I disregarded classical music, because most of my CDs are metal, rock
and electronic. But you do have a point there; some CDs do have quite a
lot of dynamic range.
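flipper's headroom arithmetic in Python form; a minimal sketch (the 6 dB
figure is just the 'safe' setting he half-remembers):

def peak_after_headroom(full_scale_v, headroom_db=6.0):
    # peak level left after backing off the given headroom from full scale
    return full_scale_v * 10 ** (-headroom_db / 20)

print(peak_after_headroom(1.0))  # ~0.501 V: a 1 V FS source kept 6 dB under FS
print(peak_after_headroom(2.0))  # ~1.002 V: the same margin on a 2 V FS source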

> On top of that you have listener preference, and you seem to be
> assuming it's to never drive into clipping regardless of what nominal
> volume level that dictates, but some (many?) will turn it up till it
> hurts, which is clipping, so 'max power' is not necessarily your 'peak'
> point. This, btw, is especially true of tube gear, because they
> (usually) clip more gracefully than SS.
>
> Then, silly as it may seem, there's the volume knob itself, and people
> don't like it pegged on max because that feels like it's 'run out of
> volume', never mind whether it's usable.
>
>> I found a Marantz FM tuner with 1 Vrms output and I think other tuners
>> will be similar. This will give only about a 2 dB margin, but since
>> most FM broadcasts are modulated noise anyway there should be no
>> problem in achieving full volume.
>
> I have no idea what your reasoning there is. 'Noise' isn't going to
> help it reach max power.

My idea was that since FM broadcasts tend to be extremely compressed and
enhanced for kitchen radios, there is no need for additional gain in the
preamplifier.

>> High-quality reel-to-reel tape decks will have balanced XLR connectors
>> giving +4 dBu, which equals 1.23 Vrms. So again no reason for low
>> input sensitivity. RIAA stages might be different, but I think that a
>> high-quality preamplifier should be able to deliver 0 dBu even with
>> strangely mastered records.
>
> Consumer line level is 316 mVrms nominal.
>
>> Can anybody here think of any (high-quality) source that will actually
>> need such high sensitivity?


Thank you for the very informative post.
#5  Bret L

On Apr 20, 10:20 am, "Phil Allison" wrote:
> "Alexander Dyszewski"
>
>> Can anybody here think of any (high-quality) source that will actually
>> need such high sensitivity?
>
> ** Nearly all AM/FM tuners, nearly all cassette decks, most TV sets
> with AV outputs, and nearly all phono stages.
>
> **** Off.
>
> .... Phil


You'll have to excuse Phil. He's an autistic ****.

775 mV sensitivity at any actual impedance between 10 kOhm and 1 MOhm is
"about right", though a little higher or lower is fine.

High input impedance is more important if driving directly from a CD
player, for instance, since the manufacturers of these refuse to put in
adequate output sections, and certainly adequate volume controls. The
original MC275, which is the canonical stereo amplifier IMO, sensibly
has a knob, but there are separate ones for each input, which is
awkward.


#6  arthrnyork@webtv.net

On Apr 20, 9:01 pm, Bret L wrote:
> [snip]


Line level is not standard among consumer-grade audio. It is true that
316 mV is very common, but the input (as well as output) voltage is not
clear (meaningless) unless the impedance is specified. 0 dB is actually
~775 mV at 600 Ohms in the telecom and industrial/professional audio
environment. If the impedance (Z) is not 600 Ohms, any voltage level
stated becomes rather ambiguous and quite misleading. 0 dB (standard) is
1 mW = V^2 / Z.
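That last formula is easy to turn around; a minimal Python sketch,
solving 1 mW = V^2/Z for V (the 50 Ohm case is only an illustration):

import math

def dbm_to_vrms(dbm, z_ohms=600.0):
    # Vrms for a dBm level (0 dBm = 1 mW) into a given impedance
    p_watts = 1e-3 * 10 ** (dbm / 10)
    return math.sqrt(p_watts * z_ohms)

print(dbm_to_vrms(0))             # ~0.775 V into 600 Ohms
print(dbm_to_vrms(0, z_ohms=50))  # ~0.224 V: same 1 mW, different impedance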
#7  Bret L

Most consumer audio amplifiers, and in fact most in general (EXCEPT the
"booster" amps of yore, such as large Altecs, which were nothing more
than a single pair of medium-mu transmitting triodes with an array of
transformers for power supply, input and output), are bridging input
loads. They are not designed to accept input power, just voltage. If one
needs to drive them from a source which wants a defined load, then one
simply puts a load resistor across the input.

There is no particular reason for the input sensitivity of these to be
what it is; it's simple approximate consensus commonality. 775 mV is
"about right" because most any source will drive it to full output,
except a raw phono cartridge or a guitar. If needed you can pad it down,
which is easier than the other way. Manufacturers should be URGED to
provide convenient internal tie points to add a voltage-divider resistor
pair, if needed, in a sanitary fashion.
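Sizing such a pad is a one-line divider calculation; a minimal Python
sketch (the resistor values are only an example, not from any
particular amp):

import math

def pad_attenuation_db(r_series, r_shunt):
    # attenuation of a series/shunt divider, ignoring source and load Z
    return 20 * math.log10(r_shunt / (r_series + r_shunt))

# e.g. knocking a 2 Vrms CD player down toward 775 mV sensitivity:
print(pad_attenuation_db(47e3, 33e3))  # ~ -7.7 dB, so 2 V in -> ~0.83 V out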
#8  Bret L

> Consumer grade audio manufacturers build whatever they think is right
> and just leave it to the public to find out (by trial and error) what
> the true signal levels in their gadgets are...

Yes.

Most of the better test sets have multiple output impedances today;
standalone audio gens aren't that common in current use. The pro world
has been utterly dominated by AP for about a decade, such that you can
get HP 339s, 8903s, and even the grossly overpriced Potomac Instruments
boxes cheap.
#9  Ian Iveson



>> Actually, the decibel originated from the loss of power over 1 mile
>> of standard telephone cable, defined as "a cable having uniformly
>> distributed resistances of 88 ohms per loop mile and uniformly
>> distributed shunt capacitance of .054 microfarad per mile," or 1 MSC.


> Very good info! However, if the impedances are different, the source
> voltage will not remain the same as soon as it is hooked up to the
> load! In fact, it will drop significantly. If we take an audio signal
> generator (and most of them have nominally 600 Ohm output impedance)
> and set it at a 0.775 V level (open circuit), and then connect a load
> (a resistor of some other value, say 75) across it and measure the
> output voltage, it will NOT be 0.775 V any longer: in an average piece
> of test equipment this level WILL drop if the load is anything
> DIFFERENT than the source impedance. If the load is higher than 600 it
> may not be as pronounced. Even oscilloscope input Z is rated (typically
> 1 MOhm or higher) to make one aware of possibly misleading
> measurements. This is real life, not pure theory from textbooks. To
> keep the output level constant at different loads would require a
> constant-voltage source, which is normally not found in a typical audio
> generator. Consumer-grade audio manufacturers build whatever they think
> is right and just leave it to the public to find out (by trial and
> error) what the true signal levels in their gadgets are...

***For several related reasons, it's usual to ensure that input
impedance is at least 10 times source impedance. With that ratio, the
signal is reduced by about 9% from its open-circuit voltage. Just a rule
of thumb, but a useful guide to the limit of what can be considered a
voltage source for the purpose of audio.
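***The rule of thumb is just a resistive divider; a minimal Python
sketch (the impedance values are illustrative):

def loaded_fraction(z_source, z_load):
    # fraction of the open-circuit voltage appearing across the load
    return z_load / (z_source + z_load)

print(loaded_fraction(600, 6000))  # 10:1 ratio -> ~0.909, about a 9% drop
print(loaded_fraction(600, 75))    # the 75 Ohm example above -> ~0.111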

***The ratio used for efficient power transfer would be 1:1,
presumably. Flipper is quite right: you're hung up on rules that apply
to different circumstances, and for different purposes.

***What would be an average output impedance for a modern source, and
how much variation is there, actually, rather than in a suspicious mind?
Similarly for amplifier input impedance? I don't think you'll find much
cause for concern. For valve amps, much less so. For valve sources into
SS amps, you might need to be more careful.

***The decibel is a ratio of two quantities. Since the units of the two
quantities must be the same, the decibel ends up with no unit of its
own. Some argue that a unit remains implicit even when it's cancelled
out, but that makes no sense to my maths. It does make sense to claim
that, by convention of language, it has a few particular applications,
and therefore one of a few units is implied, depending on context. If it
is not absolutely clear from the context, then the unit should be
stated, as in "dBV". "dBm" is smart-arse jargon that stuck.

***Ian


#10  John Byrns

In article ..., Bret L wrote:

> Most consumer audio amplifiers, and in fact most in general, are
> bridging input loads. They are not designed to accept input power,
> just voltage. [snip]


I don't agree. I remember some of those PA "booster" amplifiers were
designed to accept input power; this was sometimes necessary because
some of them used transmitting tubes that were driven into grid current,
i.e. class B2.

--
Regards,

John Byrns

Surf my web pages at, http://fmamradios.com/


#11  Phil Allison


"flipper"


If the impedance (Z) is
not 600 Ohm any voltage level stated becomes rather ambiguous and
quite misleading.


No it isn't. dBV is specified relative to 1V. I.E. 0dBV is 1V, 10dBV
is 10V, etc


** Sorry - 10 dBV is 3.16 volts.


> dBu is specified relative to 0.775 V (rounded), i.e. 0 dBu is 0.775 V
> (rounded), 10 dBu is 7.746 V, etc.


** Same error again - 10 dBu is 2.449 volts
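Both scales use 20*log10 for voltage and differ only in the reference,
1 V for dBV versus 0.775 V for dBu; a quick Python sketch:

def db_to_volts(db, ref):
    # volts for a dB level relative to a given voltage reference
    return ref * 10 ** (db / 20)

print(db_to_volts(10, ref=1.0))     # 10 dBV -> ~3.162 V
print(db_to_volts(10, ref=0.7746))  # 10 dBu -> ~2.449 V
print(db_to_volts(20, ref=0.7746))  # 20 dBu -> ~7.746 V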




..... Phil


#12  Phil Allison





> However, if the impedances are different, the source voltage will not
> remain the same as soon as it is hooked up to the load! In fact, it
> will drop significantly.

** Wot absurd drivel.


> If we take an audio signal generator (and most of them have nominally
> 600 Ohm output impedance) and set it at 0.775 V level (open circuit),
> now let us connect a load (a resistor of some other value, say 75)
> across it and measure the output voltage. It will NOT be 0.775 V any
> longer: in an average piece of test equipment this level WILL drop if
> the load is anything DIFFERENT than the source impedance.

** Totally off with the fairies.

The voltage level will always DROP

NO MATTER WHAT THE LOAD IMPEDANCE IS !!


> If the load is higher than 600 it may not be as pronounced.


** The bats in this loon's belfry are having a party.


> Consumer grade audio manufacturers build whatever they think is right
> and just leave it to the public to find out (by trial and error) what
> the true signal levels in their gadgets are...


** Total crap.

The line voltage levels quoted by makers of consumer audio gear are
based on a simple assumption - i.e. that the load impedance is as
specified, or is WITHIN the range of values found in consumer audio
gear, i.e. at least 10 kohm.

The source impedance values used are low enough that the output level is not
significantly affected by such a load.

This has been ** standard practice ** for over 60 years and it still amazes
me how many fools like this posturing ASS have not caught up with it.



..... Phil




#13  arthrnyork@webtv.net

On Apr 21, 11:46 pm, flipper wrote:
> On Thu, 22 Apr 2010 13:18:40 +1000, "Phil Allison" wrote:
>
>> ** Same error again - 10 dBu is 2.449 volts
>
> Quite right. My bad. I swapped power ratios for voltage.


Everybody must be happy at this stage!
#14  Ian Iveson

Alex wrote:

>> No it isn't. dBV is specified relative to 1 V, i.e. 0 dBV is 1 V,
>> 10 dBV is 10 V, etc.
>
> This is a mistake, by the way.
> +10 dBV is approximately 3.1 V.
> +20 dBV is 10 V.
>
>> dBu is specified relative to 0.775 V (rounded), i.e. 0 dBu is 0.775 V
>> (rounded), 10 dBu is 7.746 V, etc.
>
> The same mistake.
> +20 dBu corresponds to 7.75 V.


The dB has become a dog's dinner of a non unit that causes
confusion wherever it goes. As a convention, it's perfectly
clear once you've learned it, but it isn't obvious.

In particular, if you wanted a confusing way of expressing
the voltage necessary to increase the power by a certain
proportion, given constant resistance, dBV would be a good
candidate. Then having established that form of expression
you could add the totally contrary dBm, for good measure.

Isn't this the kind of thing that Europe was supposed to
sort out?

A more consistent convention for dealing with its
subscripts...the u, V, m etc....would help maybe.

On a positive note, I've noticed that many of those here who
appear versed in audio technology are quite disciplined in
their usage of the decibel, especially with respect to
voltage. In circumstances where a voltage is significant on
its own account, I generally see volts or percentages. Where
voltage is significant by dint of its relation to power, I
generally see dB. Not a habit I've fully adopted, I must
admit.

It would be nice, I think, if the same kind of distinction
were made when talking about frequency. We don't use octaves
as much as we ought, I fear.

Ian


#15  Ian Iveson

flipper wrote:

>>> dBu is specified relative to 0.775 V (rounded), i.e. 0 dBu is 0.775 V
>>> (rounded), 10 dBu is 7.746 V, etc.
>>
>> The same mistake.
>> +20 dBu corresponds to 7.75 V.
>>
>> The dB has become a dog's dinner of a non-unit that causes confusion
>> wherever it goes. As a convention, it's perfectly clear once you've
>> learned it, but it isn't obvious.
>>
>> In particular, if you wanted a confusing way of expressing the voltage
>> necessary to increase the power by a certain proportion, given
>> constant resistance, dBV would be a good candidate.
>
> For that purpose it doesn't matter what flavor of dB you use, because a
> doubling of power is, by definition, 3 dB, is 3 dB, is 3 dB.


Well, yes. But the simple ratio of voltages is not the same as the
simple ratio of powers. Half the voltage is -6 dB; half the power is
-3 dB. Sticking the V on the end of the dB doesn't give much of a clue
that this will be the case, especially as there is no explicit
indication that the dBV has anything to do with power.

Surely you agree that many ppl find the dB at least a bit hard to grasp?
Why do you want to demonstrate that it is straightforward?
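The asymmetry is easy to show numerically; a minimal Python sketch:

import math

def db_power(ratio):
    # dB for a power ratio: 10 * log10
    return 10 * math.log10(ratio)

def db_voltage(ratio):
    # dB for a voltage (amplitude) ratio: 20 * log10
    return 20 * math.log10(ratio)

print(db_power(0.5))    # ~ -3.0 dB: half the power
print(db_voltage(0.5))  # ~ -6.0 dB: half the voltage (a quarter of the power)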

> For example, when double-checking FB on a power amp in spice I
> disconnect FB, measure output power, connect FB, measure output power,
> and verify it's 100 to 1 for 20 dB or 10 to 1 for 10 dB. Which, btw, is
> how I made the stupid mistake. In spice I can measure power 'directly'
> and reflexively know 10 dB is 10x power. A stupid mistake from not
> thinking about what I was typing.


I don't quite follow what mistake you mean, I'm afraid. One thing that
confused me for some time arises from the fact that my spice
automatically labels the y-axis in terms of current or voltage, because
there's no such thing as a power "probe". So if I multiply a current by
a voltage, and graph the result, it's not obvious that the result is
power, although, almost by fluke, it still gets the dB scale right in
the sense that the numerical values are correct. Similarly with
resistance, and even stuff like flux density, as long as I'm careful to
stick with units from the same family.

> What 'nominal' or 'reference' power do I start with or 'calibrate' to?
> Nothing. It doesn't matter. 10 dB is 10 dB.
>
> The flavor only matters if you care about an interchange reference
> level like, say, building a repeater that you need to work in a
> 'standard' telecom system that expects 0 dB to be 1 mW into 600 Ohms.


Proper units don't need all these ifs and buts. A metre or a second can
be simply defined in a jiffy.

>> Then having established that form of expression you could add the
>> totally contrary dBm, for good measure.
>
> It isn't 'contrary' and I'm afraid you have it backwards. dBm (as
> simply dB, the "Bel" in honor of Alexander Graham Bell, since it was
> telephone transmission the unit was created for) came first, and it's
> arguable that dBV was to 'simplify' things, like you opine about.


I didn't mean to imply a historical sequence, but rather the order one
might introduce the dB such as to highlight the confusion.

It is contrary. In dBV, the V is a unit. In dBm, the m is not a unit.
How can that possibly be consistent? In plain dB, there's no subscript
at all. The rules of subscripting are different in each case.

> When people don't know about something they tend to think it's
> 'arbitrary' or 'stupid', but that's rarely the case. Like, who's the
> idiot that decided there should be so many '12s' in things?
>
> Well, I don't know who the 'idiot' was, but the reason is that, back
> when most people were mathematically challenged, 12 is divisible by 2,
> 3, and 4. It made life simpler.


You invented the idiot part. Dozens are clearly useful, as are 60s and
360s. 10 is the unfortunate choice, but even then I'm sure everyone
knows how that came about.

I'm not saying the dB is arbitrary or stupid. On the contrary, it serves
a purpose very well. It's the inconsistency of subscripting that's a
dog's dinner, and that, in combination with the absence of explicit
reference to power (unavoidable, I guess), can make it hard to follow.
Transforms are a bit tricky for folk to get their heads around, and a
cluttering of historical debris doesn't make it easier for the novice.


> I tried to give a brief background on dBm for the same reason. Waaaay
> back when you have people building the first phone lines, and we all
> know the signal degrades over distance, we obviously need some way to
> express that, and the first 'unit' was the MSC (Mile [of] Standard
> Cable), i.e. how much the signal degrades over a mile of the stuff we
> use. Makes sense if you think about it, at least for a starter. Now,
> as things progress it's not really wise to tie a 'standard' to a
> particular type of cable (we use multiple types now), so in the 1920s
> the "TU" (transmission unit) was defined (as ten times the base-10
> logarithm of the ratio of measured power to reference power) and, to
> reduce confusion, intentionally made close to the MSC (= 1.056 TU).
> The TU then became the decibel.
>
> Now, I don't know why 1 mW was the preferred reference, but I assure
> you there was some practical reason and, at any rate, back when the
> MSC was done there were no radars, hi-fi stereos, AM radio, or any
> other 'electronics' as we know it to bother with, and no 'confusion'
> about what's being worked with. It was a telephone system, period.
> That was the problem being dealt with; you care about impedance
> matching, power transfer, and it works.
>
> As I previously mentioned, when you start adding electronic repeaters
> it makes perfectly good sense to use the dB(m) reference level,
> because that's the system you're putting them into. And it makes
> perfect sense that, if one is going to make a general-purpose
> amplifier, people used to making repeaters are likely to use the same
> terminology, not to mention the first ones are likely to be nothing
> more than the same repeaters used for a different purpose, and we know
> they expect 0 dBm as reference, so that's likely to be the 'reference
> level' for whatever else you stick them in, because why redesign the
> damn things? Of course, eventually you do, because not all
> applications have the same requirements.
>
> And, again of course, after time and things have multiplied a
> million-fold, so people are separated from the 'original reason',
> there's likely to be someone who figures things would be 'simpler' if
> we used '1' as a reference, instead of the 'arbitrary' and 'stupid'
> .775 V, so we get dBV. And when we get to radar reflection we get
> another use for dB, and another letter. And then we have speakers, and
> another use for dB, and so on, because, as it turns out, power is
> power and the fundamental ratio applies, but 1 mW into 600 Ohms is
> simply not a useful reference for radar and speakers.
>
> Again, I can't tell you how 'everything' came about, but I assure you
> there were practical reasons. Or, at the very least, someone thought
> there was a good practical reason, and if it persisted it probably
> was.


I haven't argued that the dB is not useful, or not valid. I'm trying to
identify why ppl find it hard to follow. It is hard to follow. Ppl have
big arguments about it and come from all sorts of directions. What's
your explanation for that? Ohm's Law, for example, doesn't attract any
debate at all. Neither do simple units like seconds or metres. Even logs
don't attract the same level of confusion. Even you have just made a
couple of errors, which you wouldn't with seconds, metres or logs.

>> Isn't this the kind of thing that Europe was supposed to sort out?
>>
>> A more consistent convention for dealing with its subscripts...the u,
>> V, m etc....would help maybe.
>
> What inconsistency are you claiming?


See above. In terms of self-consistency, V is an SI unit whereas m is a
misplaced SI prefix, and actually I don't know what the u stands for. Is
it a corruption of a mu symbol? In that case it would be a misplaced
corruption of an SI prefix. In terms of consistency with SI units, dBV
relates to power but makes no mention of a unit of power, the m isn't a
properly applicable prefix even if it weren't misplaced, and the u isn't
either, and it's corrupt to boot.

>> On a positive note, I've noticed that many of those here who appear
>> versed in audio technology are quite disciplined in their usage of
>> the decibel, especially with respect to voltage. In circumstances
>> where a voltage is significant on its own account, I generally see
>> volts or percentages. Where voltage is significant by dint of its
>> relation to power, I generally see dB. Not a habit I've fully
>> adopted, I must admit.
>>
>> It would be nice, I think, if the same kind of distinction were made
>> when talking about frequency. We don't use octaves as much as we
>> ought, I fear.
>
> Like most things it depends on what is of interest. Talk to musicians
> and you'll get an earful about octaves (plus a lot more).


And yet, when we are concerned about audible bandwidth, octaves would
be linear with respect to perception, so there is just as much reason to
use the octave in audio as the decibel. They both linearise an axis that
would otherwise require a log scale to fit with perception, and
therefore with significance.

Ian




#16  Bret L

On Apr 21, 9:43 pm, John Byrns wrote:
> In article ..., Bret L wrote:
>> [snip]
>
> I don't agree, I remember some of those PA "booster" amplifiers were
> designed to accept input power; this was sometimes necessary because
> some of them used transmitting tubes that were driven into grid
> current, i.e. class B2.


Isn't that just what I said?

The booster amps were indeed desirous of input power, and for precisely
that reason, BUT they were uncommon then and are very rare now. It added
very little weight and power to provide them with a high-impedance input
driver section, but by the time that was figured out the era of big tube
audio amplifiers was over.

Refer to the "Audio Anthologies" which I believe you said you
possess, the "Musician's Amplifier Senior" by Sarser and Sprinkle
(Sarser is still alive and I have acquaintences who correspond with
him occasionally) was the prototype for several of the big Altecs such
as the 260B. Most ball parks had a couple of these in that era. You
also saw them used in the occasional electronic church organ install.
#17  Alex


"flipper" wrote in message
...
No it isn't. dBV is specified relative to 1V. I.E. 0dBV is 1V, 10dBV
is 10V, etc. There is no 'ambiguity' nor is it 'misleading'.


This is a mistake, by the way.
+10dBV is approximately 3.1V.
+20dBV is 10V.


dBu is specified relative to .775V (rounded). I.E. 0dBV is .775V
(rounded), 10dBu is 7.746V, etc.


The same mistake.
+20dBu corresponds to 7.75V.


#18  John Byrns

In article ..., Bret L wrote:

> [snip]
>
> Isn't that just what I said?


No, you said "They are not designed to accept input power, just
voltage." You said they were "NOT designed to accept input power",
emphasis of the NOT is mine. I said just the opposite that they "were
designed to accept input power", note the absence of the "not" in what I
said.

> The booster amps were indeed desirous of input power [snip]
>
> Refer to the "Audio Anthologies" which I believe you said you
> possess,


Yes, you are correct, I do possess the "Audio Anthologies", however when
did I admit to possessing them?

the "Musician's Amplifier Senior" by Sarser and Sprinkle
(Sarser is still alive and I have acquaintences who correspond with
him occasionally) was the prototype for several of the big Altecs such
as the 260B. Most ball parks had a couple of these in that era. You
also saw them used in the occasional electronic church organ install.


--
Regards,

John Byrns

Surf my web pages at, http://fmamradios.com/
#19  Ian Iveson

I think you're being a tad contrary.

So: it's not inconsistent that "m" rather than "mW" stands
for "milliwatt", and "m" isn't in this case a prefix, and
"V" is not a unit, and no-one finds this any more confusing
than anything else they are too stupid to know about.
Thanks, Flips. The glorious light of a new dawn.

Everyone knows what an octave sounds like, and if they don't you can
tell them it's like "somewhere" in "Somewhere Over the Rainbow".

A reduction from 20 kHz to 10 kHz is the loss of an octave. Similarly, a
cut at the low end from 20 Hz to 40 Hz would lose an octave. Makes sense
to me. Make an effort.
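Counting octaves is just log2 of the frequency ratio; a minimal Python
sketch:

import math

def octaves(f_low, f_high):
    # number of octaves spanned between two frequencies
    return math.log2(f_high / f_low)

print(octaves(10e3, 20e3))  # 1.0: dropping 10-20 kHz loses one octave
print(octaves(20, 20e3))    # ~9.97: the whole audio band is about ten octaves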

Ian

"flipper" wrote in message
news
On Fri, 23 Apr 2010 01:05:59 +0100, "Ian Iveson"
wrote:

Flipper wrote:

dBu is specified relative to .775V (rounded). I.E.
0dBV
is .775V
(rounded), 10dBu is 7.746V, etc.

The same mistake.
+20dBu corresponds to 7.75V.

The dB has become a dog's dinner of a non unit that
causes
confusion wherever it goes. As a convention, it's
perfectly
clear once you've learned it, but it isn't obvious.

In particular, if you wanted a confusing way of
expressing
the voltage necessary to increase the power by a certain
proportion, given constant resistance, dBV would be a
good
candidate.

For that purpose it doesn't matter what flavor of dB you
use because a
doubling of power is, by definition, 3dB, is 3dB, is
3dB.


Well yes. But the simple ratio of voltages is not the same
as the simple ratio of powers. Half the voltage is -6dB,
half the power is -3dB. Sticking the V on the end of the
dB
doesn't give much of a clue that this will be the case,
especially as there is no explicit indication that the dBV
has anything to do with power.


A symbol is not a self contained tutorial. You have to
learn what they
stand for and that's true of any symbol.

The definition for power decibels is 10log10(ratio) and
for amplitudes
20log10(ratio)


Surely you agree that many ppl find the dB at least a bit
hard to grasp? Why do you want to demonstrate that it is
straightforward?


People are usually confused by the jargon of any technical
field
they're unfamiliar with and it's usually straightforward
to those
who've learned and routinely use it.


For example, when double checking FB on a power amp in
spice I
disconnect FB, measure output power, connect FB, measure
output power,
and verify it's 100 to 1 for 20db or 10 to 1 for 10dB.
Which, btw, is
how I made the stupid mistake. In spice I can measure
power 'directly'
and reflexively know 10dB is 10x power. A stupid mistake
from not
thinking about what I was typing.


I don't quite follow what mistake you mean, I'm afraid.


At the top where I used power ratio for voltage ratio.

One
thing that confused me for some time arises from the fact
that my spice automatically labels the y-axis in terms of
current or voltage, because there's no such thing as a
power
"probe". So if I multiply a current by a voltage, and
graph
the result, it's not obvious that the result is power,
although, almost by fluke, it still gets the dB scale
right
in the sense that the numerical values are correct.
Similarly with resistance, and even stuff like flux
density,
as long as I'm careful to stick with units from the same
family.


I can't speak to the user friendliness, or lack thereof,
in your spice
program.

In mine I can put a probe on the 'body' of a resistor, and
most active
devices, to get a direct, instantaneous, power reading or
a plot of it
vs time.

What 'nominal' or 'reference' power do I start with or
'calibrate' to?
Nothing. It doesn't matter. 10dB is 10dB.

The flavor only matters if you care about an interchange
reference
level like, say, building a repeater that you need to
work
in a
'standard' telecom system that expects 0dB to be 1mW
into
600 Ohms.


Proper units don't need all these ifs and buts. A metre or
a
second can be simply defined in a jiffy.


Try explaining 'why' there is a 'meter', a 'yard', and a
'furlong' and
the history of when they came about for what reason.
You'll find
yourself employing just as many "ifs and buts."

dBm, itself, is just as simple as a meter.

A meter is --- this long --- and 0 dBm is 1 mW into 600
Ohms.


Then having established that form of expression
you could add the totally contrary dBm, for good
measure.

It isn't 'contrary' and I'm afraid you have it
backwards.
dBm (as
simply dB, the "Bel" in honor of Alexander Graham Bell
since it was
telephone transmission the unit was created for) came
first and it's
arguable that dBV was to 'simplify' things like you
opine
about.


I didn't mean to imply a historical sequence, but rather
the
order one might introduce the dB such as to highlight the
confusion.


My object was to reduce confusion. Why do you want to
increase it?

It is contrary. In dBV, the V is a unit. In dBm, the m is
not a unit. How can that possibly be consistent? In plain
dB, there's no subsript at all. The rules of subscripting
are different in each case.


No, it isn't. Decibel is dimensionless and neither the V
or m is a
'unit'. They are one letter abbreviations for a 'word'
that makes it
simple to remember when combined into the 'symbol'. dB -
decibel, V -
Volt, m - milliwatt. dBV - decibels-Volt.

That the word or letter may serve as a 'unit' in another
context is
how English works. There's only 26 letters and a few
appropriate words
for any one thing so they're bound to have multiple uses.

You are trying to find ways to confuse yourself.

When people don't know about something they tend to
think
it's
'arbitrary' or 'stupid' but that's rarely the case.
Like,
who's the
idiot that decided there should be so many '12s' in
things?

Well, I don't know who the 'idiot' was but the reason is
that, back
when most people we're mathematically challenged, 12 is
divisible by
2, 3, and 4. It made life simpler.


You invented the idiot part.


I simply recount what I've heard people 'complain' about.

Dozens are clearly useful, as
are 60s and 360s. 10 is the unfortunate choice, but even
then I'm sure everyone knows how that came about.


That's fine. In your case 10 is 'stupid'. Most people I've
heard rant
about it think '12' is stupid.

The point remains that people like to call things they
don't
understand the reasons for 'stupid' or 'needlessly
complex' or some
other complaint.

I'm not saying the dB is arbitrary or stupid. On the
contrary, it serves a purpose very well. It's the
inconsistency of subsripting that's a dog's dinner, and
that
in combination with the absence of explicit reference to
power (unavoidable I guess) that can make it hard to
follow.



It is you who are inventing the 'contrariness' and
supposed
'inconsistency'.

Transforms are a bit tricky for folk to get their heads
around, and a cluttering of historical debris doesn't make
it easier for the novice.


It would if you didn't insist on creating your own
'meanings'.

I tried to give a brief background on dBm for the same
reason. Waaaay
back when you have people building the first phone
lines,
and we all
know the signal degrades over distance, we obviously
need
some way to
express that and the first 'unit' was the MSC (Mile [of]
Standard
Cable). I.E. How much the signal degrades over a mile of
the stuff we
use. Makes sense if you think about it, at least for a
starter. Now,
as things progress it's not really wise to tie a
'standard' to a
particular type of cable (we use multiple types now) so
in
the 1920's
the "TU" (transmission unit) was defined (as ten times
the
base10
logarithm of the ratio of measured power to reference
power) and, to
reduce confusion, intentionally made it close to the MSC
(=1.056 TU).
The TU then became the decibel.

Now, I don't know why 1mW was the preferred reference
but
I assure you
there was some practical reason and, at any rate, back
when the MSC
was done there were no radars, hi-fi stereos, AM radio,
or
any other
'electronics' as we know it to bother with and no
'confusion' about
what's being worked with. It was a telephone system,
period. That was
the problem being dealt with, you care about impedance
matching, power
transfer, and it works.

As I previously mentioned, when you start adding
electronic repeaters
it makes perfectly good sense to use the dB(m) reference
level because
that's the system you're putting them into. And it makes
perfect sense
that, if one is going to make a general purpose
amplifier,
people used
to making repeaters are likely to use the same
terminology, not to
mention the first ones are likely to be nothing more
than
the same
repeaters used for a different purpose, and we know they
expect 0dBm
as reference, so that's likely to be the 'reference
level'
for
whatever else you stick them in because why redesign the
damn things?
Of course, eventually you do because not all
applications
have the
same requirements.

And, again of course, after time and things have
multiplied a million
fold, so people are separated from the 'original
reason',
there's
likely to be someone who figures things would be
'simpler'
if we used
'1' as a reference, instead of the 'arbitrary' and
'stupid' .775V, so
we get dBV. And when we get to radar reflection we get
another use for
dB, and another letter. And then we have speakers, and
another use for
dB, and so on, because, as it turns out, power is power
and the
fundamental ratio applies but 1mW into 600 Ohms is
simply
not a useful
reference for radar and speakers.

Again, I can't tell you how 'everything' came about but
I
assure you
there were practical reasons. Or, at the very least,
someone thought
there was a good practical reason and if it persisted it
probably was.


I haven't argued that the dB is not useful, or not valid.
I'm trying to identify why ppl find it hard to follow. It
is
hard to follow. Ppl have big arguments about it and come
from all sorts of directions. What's your explanation for
that?


Already answered. They're unfamiliar with it or know just
enough to
think they know when they don't.

You've probably noticed that a lack of knowledge seldom
stops people
from arguing about things, regardless of the topic.

Ohm's Law, for example, doesn't attract any debate at
all. Neither do simple units like seconds or metres. Even
logs don't attract the same level of confusion. Even you
have just made a couple of errors, which you wouldn't with
seconds, metres or logs.


I appreciate the confidence in my otherwise infallibility.


Isn't this the kind of thing that Europe was supposed to
sort out?

A more consistent convention for dealing with its
subscripts...the u, V, m etc....would help maybe.

What inconsistency are you claming?


See above. In terms of self-consistency, V is an SI unit
whereas m is a misplaced SI prefix and actually I don't
know
what the u stands for. Is it a corruption of a mu symbol?
In
that case it would be a misplaced corruption of an SI
prefix. In terms of consistency with SI units, dBV relates
to power but makes no mention of a unit of power, the m
isn't a properly applicable prefix even if it wasn't
misplaced, and the u isn't either, and it's corrupt to
boot.


As I explained above, you are inventing meanings that do
not exist.

Not only that, you go out of your way to invent problems,
like
claiming a prefix that isn't even in the prefix position,
so obviously
not a prefix, has TWO errors with one being it isn't in
the prefix
position which, to most people, would be proof enough it
isn't a
prefix.. yet you decide it is.

I couldn't change it even if I were inclined to try so you
can either
learn the convention that's been around for a hundred
years or not.
Your choice.

On a positive note, I've noticed that many of those here
who
appear versed in audio technology are quite disciplined
in
their usage of the decibel, especially with respect to
voltage. In circumstances where a voltage is significant
on
its own account, I generally see volts or percentages.
Where
voltage is significant by dint of its relation to power,
I
generally see dB. Not a habit I've fully adopted, I must
admit.

It would be nice, I think, if the same kind of
distinction
were made when talking about frequency. We don't use
octaves
as much as we ought, I fear.

Like most things it depends on what is of interest. Talk
to musicians
and you'll get an earful about octaves (plus a lot
more).


And yet, when we are concerned about audible bandwidth,
octaves would be linear with respect to perception, so
there
is just as much reason to use the octave in audio as the
decibel.


'Linear' to what perception? And if I'm building a 'hi-fi'
amplifier
that is supposed to cover all audible frequencies why do I
care how
one 'perceives' 6kHz relative to 1kHz? They're both there
so... what?

Ok, so an amp with a 20-10kHz bandwidth is what? octave?
what? than
one with a 20-20kHz bandwidth? Twice, or thrice, or half,
or what of
what? What additional usefulness do I get by measuring
bandwidth in
octaves?

They both linearise an axis that would otherwise
require a log scale to fit with perception, and therefore
with significance.


I do plot it on a log scale, log10.

Ian



#20  arthrnyork@webtv.net

On Apr 25, 5:45 am, "Ian Iveson" wrote:
> [snip]


This thread is becoming an exercise in futility. A decibel and an
octave can mean quite different things in different cases. In the
consumer-grade audio environment (and even in pro audio), 1 volt signal
levels have become usual, and bridged input is more common due to an
improved frequency response. 0 dBm is a historical reference level, and
we all have to respect it and understand why and how it was established.
Voltage bridging is very approximate in consumer-grade audio, and the
impedances involved may range from about 1 kOhm to 10 k or even higher
for inputs...


#21  Ian Iveson

> This thread is becoming an exercise in futility. [snip]

This is recreational audio tubes. Surely futility is what we
come here for?

At a guess, the three of us know what a dB is, and why it's
useful. Considering how immediately useful it is, it needs
no historical explanation IMO, although it's a bit
interesting.

My point was about intelligibility of expression...about
consistency of language. dBm, in particular, is a mess in
that respect. In addition to the points I've made already,
even the "d" prefix is odd in the context of standard units.

The dB itself is accepted by, but not a part of, the SI
system. In this respect it is one of a few odd units.

The dBm, AFAIK, is outlawed from SI standard units, because
it is considered to be confusing and ambiguous. There is
rarely a need to know what it means...it's part of the
jargon that's used in the business, that's all. Similarly
dBu. Dunno what it means, don't care, and can't be arsed to
look it up.

How would you abbreviate the decibel-metre, incidentally?
dB.m? And how would you write one thousandth of a decibel?
mdB?

The octave is similar in that, if you wanted to express an
absolute frequency in octaves, you would need to attach a
reference frequency. It is also similar in that it is a
ratio that is mathematically unitless, and yet is
inextricably linked to frequency, just like the dB is linked
to power. If I found myself needing to teach the meaning of
the dB, the octave might be a useful comparison, because
everyone knows what an octave sounds like, and everyone
knows it is related to frequency, and they also know that it
sounds essentially the same whatever the absolute frequency
span. I can sing "somewhere" in any key and the octave
sounds like the same span, even though the frequency
difference would not be the same in each case.
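
(In symbols, that span is just a base-2 logarithm. A minimal sketch, with hypothetical example frequencies:

    import math

    def octaves_between(f1_hz, f2_hz):
        # The interval between two frequencies in octaves:
        # one octave per doubling, wherever you start.
        return math.log2(f2_hz / f1_hz)

    print(octaves_between(220, 440))   # 1.0, A3 up to A4
    print(octaves_between(440, 880))   # 1.0, the same span higher up
    print(octaves_between(100, 300))   # ~1.585

Equal ratios, not equal differences, give equal spans.)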

All futile to those who don't want to see the point,
obviously.

They are both useful in audio insofar as they relate more
directly to perception than their "host" units, Watt and Hz.
Significance in audio is very often about perception. If we
can linearise what is significant, it becomes more
intelligible.

Should anyone wish to know more, I suggest they google the
SI system of units.

Ian


  #22   Report Post  
Posted to rec.audio.tubes
[email protected] arthrnyork@webtv.net is offline
external usenet poster
 
Posts: 81
Default amplifier input sensitivity

On Apr 26, 11:50 pm, "Ian Iveson" wrote:
[snip]


Amen.
  #23   Report Post  
Posted to rec.audio.tubes
Ian Iveson Ian Iveson is offline
external usenet poster
 
Posts: 960
Default amplifier input sensitivity

flipper wrote:

[below]


I've said that the strange notation of the dB and its
derivative expressions is part of the reason why it causes
confusion to the novice. Perhaps it's different in the US,
but here the standard school curriculum strictly and
explicitly adheres to the SI system. In this context, the SI
and I consider the dB to be strange but unavoidable, whereas
its derivatives are particularly inconsistent with the
standard, and therefore confusing. That's enough for me.
Let's just agree to differ, eh?

With respect to the octave, I suggested we might perhaps use
it more as a measure of frequency, and I have said most of
why, in that it linearises a log scale in an intelligible
manner. This occurred to me because I wondered why, in the
standard form I have seen of plots of gain v bandwidth, a
decimal log scale is used for frequency, marked in Hz,
whereas the gain axis is linearised using dB. Why linearise
the scale for one axis and not the other? I wondered.

There is a very basic reason to use the octave, it seems to
me, but for the moment let's look at the simple
intelligibility side of things.

A filter might be said to roll off at 3dB per octave. I've seen this use of the octave lots of times. Then I look at the graph and it's shown in Hz rather than octaves, and that's not very convenient, is it? If the graph were calibrated in octaves I could look at it and think, ah yes, I see, a one octave drop in frequency results in a 3dB fall in gain. Us Europeans like this kind of thing.
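
Calibrating that axis is trivial to do in software (a sketch, assuming a hypothetical 1kHz reference for the octave axis; the 3dB-per-octave slope is just the example above):

    import math

    F_REF = 1000.0  # hypothetical reference frequency, e.g. a filter corner

    def octaves_from_ref(f_hz):
        # Position on an octave-calibrated frequency axis
        return math.log2(f_hz / F_REF)

    # Reading the example slope off the new axis:
    for f in (125.0, 250.0, 500.0, 1000.0):
        pos = octaves_from_ref(f)
        print(f"{f:6.0f} Hz = {pos:+4.1f} oct -> {3.0 * pos:+5.1f} dB")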

For more advanced intelligibility, I could say that, in
order for an amp to be stable, the frequency difference
between the dominant pole and the next must be a certain
number of octaves per dB of open-loop gain. That makes a
simple picture out of a problem that lots of people seem to
have advanced difficulty with.
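
As back-of-envelope arithmetic (a sketch with hypothetical numbers, assuming the usual 6dB-per-octave slope of a single dominant pole):

    open_loop_gain_db = 60.0     # hypothetical excess gain to roll off
    slope_db_per_octave = 6.0    # a single dominant pole

    # Octaves the next pole must sit above the dominant one, if the
    # gain is to fall to unity at 6dB/octave before the slope steepens:
    print(open_loop_gain_db / slope_db_per_octave)   # 10.0 octaves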

Now, so far you might say that the octave has no advantage
over the decade, which would also make the significance of
frequency-doubling explicit.

But consider that the construction of our ears, and that of
musical instruments, and therefore our expectations of what
is musical and what is not, is all built round the same
rules of how filters work everywhere, in electronics and the
rest of the physical world. The harmonic structure of music
and speech conform to the same rules that apply to the
transient responses of our hi-fi systems.

It might be useful, for example, to show harmonic distortion
on a spectrogram with the frequency axis marked in octaves,
using the input frequency as the reference. That would be
convenient because there would be no need for negative
octaves. It would be particularly revealing if the octaves
were subdivided into tones, especially coz I know you like
dozens and the dozen and the eight combine to make music! Or
not, if it's the 7th harmonic, for example, that falls close
to an unfortunate tone IIRC.
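
On such an axis the nth harmonic sits log2(n) octaves above the fundamental (a sketch; the tone subdivision assumes 12-tone equal temperament):

    import math

    for n in range(2, 9):
        octs = math.log2(n)           # octaves above the fundamental
        semis = 12 * octs             # position in equal-tempered semitones
        off = semis - round(semis)    # distance from the nearest tone
        print(f"H{n}: {octs:5.3f} oct = {semis:5.2f} semitones ({off:+.2f})")

    # The 7th harmonic lands about 0.31 semitones flat of the nearest
    # tone, which is the 'unfortunate' case mentioned above.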

It could be even better if, instead of using a 1kHz test
tone, some standard note were used instead. Then the tones
on the frequency axis could be marked in notes, and it would
be even easier to imagine what the distortion would sound
like.

A graph showing bandwidth might be clearer if it were
calibrated in octaves and notes above and below middle C, or
A, or whatever. The span of instruments, ears, amps,
speakers and music could all be accommodated on the same
scale, making comparison a doddle.

Worth a thought, it seemed to me. But have it your own way,
I don't care. We may well be on different planets.

Ian

"flipper" wrote in message
...
On Tue, 27 Apr 2010 04:50:46 +0100, "Ian Iveson" wrote:

[snip]

At a guess, the three of us know what a dB is, and why it's useful. Considering how immediately useful it is, it needs no historical explanation IMO, although it's a bit interesting.


The historical explanation was specifically for how dBm and dBu came about.


I think it is also useful, for those who imagine the decibel is some arbitrary invention, that it's based on empirical studies of human perception, with the minimum observed detectable volume change being, on average, 0.1 Bel.


People tend to like 'units' that correspond to the 'common size' of whatever they're measuring, which is why the meter is popular despite the fact we could use 6.684587122671E-12 au. And, conversely, why astronomers like au instead of meters.


I.E. 1 decibel (dB) is 1 'loudness perception unit'.

My point was about intelligibility of expression...about consistency of language. dBm, in particular, is a mess in that respect. In addition to the points I've made already, even the "d" prefix is odd in the context of standard units.


Never heard of deci? How about centi?

http://www.asknumbers.com/Centimeter...onversion.aspx

Centimeters (cm) To Decimeters (dm) Conversion

The dB itself is accepted by, but not a part of, the SI system. In this respect it is one of a few odd units.

The dBm, AFAIK, is outlawed from SI standard units, because it is considered to be confusing and ambiguous.


The standards committee told you that, did they?

SI unit rules and style conventions have a built-in paradox with respect to logarithmic 'ratio units'. They require knowing the reference, but SI objects to both 'creating'/'describing' units with suffixes and textual 'explanation' of units. There's no 'SI standard' way of saying 'what it is'.


That's understandable, given the SI view that all things derive from the 7 basic units, and gets to the more philosophical problem that the decibel (and neper) does not represent a physical property or process, like the ampere or the derived unit ohm do, so you have the conundrum that it doesn't 'exist', as indicated by being dimensionless. It's not one of the 7 base units, nor can it be definitively derived from them (because it can ratio 'anything').


Of course, it has dimension when you know the reference, but see SI problem number 1.


SI conundrums aside, the dB is clearly useful, and the only thing 'ambiguous' about dBm is your steadfast refusal to accept the definition of it, not to mention it's rather obtuse to complain it doesn't follow current SI convention when there isn't one. dBmW is just as 'non SI' as dBm, but if it makes you happy then write dBmW. Some people do, but the vast majority do not, so if you want to know what the heck they're talking about you need to know what dBm is, whether you 'approve' of it or not.

There is rarely a need to know what it means...it's part of the jargon that's used in the business, that's all. Similarly dBu. Dunno what it means, don't care, and can't be arsed to look it up.


No need, as long as one doesn't care about working with telecom, consumer, or professional audio equipment, all of which reference to one or the other of dBm, dBV or dBu (not to mention the scores of other dB[something] used in all sorts of technical fields).

http://www.behringer.com/EN/download...P0573_S_EN.pdf

It might be nice to know what they mean when spec'ing, for example, max input as +22 dBu @ 0 dB gain.

Standard telecom fiber optic transmitters output 0 to +10 dBm.

Might not matter if you're a cable plugger, but the thread is "amplifier input sensitivity", so just how do you suggest one determine it without knowing what the dern expected levels are?
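
Converting such spec figures back to volts is straightforward (a sketch; the +22 dBu value is the example above, and +4 dBu is the usual pro nominal level):

    def dbu_to_vrms(level_dbu):
        # dBu is a voltage level referenced to 0.775 Vrms
        return 0.775 * 10 ** (level_dbu / 20)

    print(dbu_to_vrms(22))   # ~9.75 Vrms maximum input
    print(dbu_to_vrms(4))    # ~1.23 Vrms
    print(dbu_to_vrms(0))    # 0.775 Vrms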

How would you abbreviate the decibel-metre, incidentally?
dB.m?


The first question would be why you're even thinking of it, but I imagine it would be just as confusing as the equally mysterious metre-Watt (m.W).

And how would you write one thousandth of a decibel?
mdB?


I've seen it done.

http://cires.colorado.edu/jimenez-gr.../NIDAQ_Ref.pdf

Attenuation (in mdB) = -[20 log10 (Vo/Vi)]*1000
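
That formula, made runnable (a sketch; the Vo/Vi names follow the quoted equation):

    import math

    def attenuation_mdb(v_out, v_in):
        # -[20 log10 (Vo/Vi)] * 1000, i.e. loss in millidecibels
        return -20 * math.log10(v_out / v_in) * 1000

    print(attenuation_mdb(0.99, 1.0))   # ~87.3 mdB for a 1% voltage drop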

The octave is similar in that, if you wanted to express an absolute frequency in octaves, you would need to attach a reference frequency. [snip]

All futile to those who don't want to see the point,
obviously.


You never presented any 'point' of using octave as some sort of 'teaching tool'. You argued I should plot things in octaves and when I asked 'what for' gave nothing.


They are both useful in audio insofar as they relate more directly to perception than their "host" units, Watt and Hz. Significance in audio is very often about perception. If we can linearise what is significant, it becomes more intelligible.


Octave is, as I already said, certainly useful to 'music'. What use it is to 'audio' you have yet to explain.


Should anyone wish to know more, I suggest they google the SI system of units.


There isn't an SI unit at all for measuring 'perception units', be it loudness or your octaves.


Let me ask, besides satisfying your urge to bitch, what use does deliberately trying to confuse things serve?


Ian



  #24   Report Post  
Posted to rec.audio.tubes
[email protected] arthrnyork@webtv.net is offline
external usenet poster
 
Posts: 81
Default amplifier input sensitivity

On Apr 28, 9:37 pm, "Ian Iveson" wrote:
[snip]

Worth a thought, it seemed to me. But have it your own way, I don't care. We may well be on different planets.


Obviously, Erich von Daniken had something in mind in his first best-seller book.
  #25   Report Post  
Posted to rec.audio.tubes
Phil Allison[_3_] Phil Allison[_3_] is offline
external usenet poster
 
Posts: 500
Default amplifier input sensitivity





** LEARN TO TRIM - imbecile !!




  #26   Report Post  
Posted to rec.audio.tubes
Ian Bell[_2_] Ian Bell[_2_] is offline
external usenet poster
 
Posts: 861
Default amplifier input sensitivity

On 15/06/10 23:46, flipper wrote:
On Thu, 29 Apr 2010 02:37:37 +0100, "Ian Iveson" wrote:

flipper wrote:

[below]


I've said that the strange notation of the dB and its
derivative expressions is part of the reason why it causes
confusion to the novice.


Any notation is 'confusing to the novice', till they learn it.

Perhaps it's different in the US,
but here the standard school curriculum strictly and
explicitly adheres to the SI system. In this context, the SI
and I consider the dB to be strange but unavoidable, whereas
its derivatives are particularly inconsistent with the
standard, and therefore confusing. That's enough for me.


You sound like a sales brochure for SI but SI has no mechanism
whatsoever for dealing with referenced logarithmic ratios. There is no
'labeling' that *could* be 'consistent' with SI, as it stands, so it's
a red herring to complain the clearly useful dB is 'inconsistent' with
something that has no means to deal with it, or anything like it, or
any substitute.


Not to mention that the dB is the ratio of two identically dimensioned quantities and is therefore dimensionless, so a system of standardising dimensions is meaningless in this context.


Cheers

Ian
  #27   Report Post  
Posted to rec.audio.tubes
[email protected] arthrnyork@webtv.net is offline
external usenet poster
 
Posts: 81
Default amplifier input sensitivity

On Jun 16, 7:42 am, Ian Bell wrote:
[snip]

Not to mention that the dB is the ratio of two identically dimensioned quantities and is therefore dimensionless, so a system of standardising dimensions is meaningless in this context.


Amen...