Actually, the length of the wire doesn't matter when determining the
proper size fuse. It's not the total resistance of the wire that's the
issue, but rather the resistance per unit length (or, even more
accurately, the heat dissipated per unit of surface area). This is
because ultimately we're dealing with heat dissipation as the factor
governing what a wire can handle.
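To put rough numbers on that, here's a quick sketch. The resistance-per-foot figure below is hypothetical, not from any wire gauge table, but it shows why length drops out: each foot of wire runs equally hot regardless of how long the run is.

```python
def heat_per_foot(current_a, ohms_per_foot):
    """Power dissipated in each foot of wire: P = I^2 * R."""
    return current_a ** 2 * ohms_per_foot

# Same current, same gauge: the per-foot heating (what the wire must
# survive, and what the fuse protects against) is identical for every run.
r_per_ft = 0.0002  # ohms/ft, a made-up heavy-gauge figure
for length_ft in (3, 10, 20):
    per_ft = heat_per_foot(60, r_per_ft)
    total = per_ft * length_ft
    print(f"{length_ft:2d} ft run: {per_ft:.2f} W/ft, {total:.2f} W total")
```

A longer run dissipates more watts in total, but that heat is spread over proportionally more wire, so the temperature each foot reaches is the same.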
Wait a sec MZ. The length of a given wire WOULD affect voltage, right?
Longer runs cause larger losses of voltage.
Exactly right.
So, if you had a very long run and your voltage dropped considerably,
wouldn't you need more amperage to maintain a given wattage (P = IV)?
So, the length of the wire WOULD affect amperage. Is my reasoning
flawed?
This would be true only if today's amplifiers were fully regulated,
i.e., perfectly capable of maintaining their power output despite
fluctuations in the supply voltage. But most amplifiers today are only
semi-regulated, and what you find is that the power output decreases as
the supply voltage decreases. As a result, the current draw is actually
lower with higher-resistance wire.
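Here's a toy comparison of the two cases. The numbers are hypothetical (an imaginary amp rated 500 W at a 14.4 V supply), and the "unregulated" curve uses the common simplification that output power falls with the square of supply voltage; a real semi-regulated amp sits somewhere in between.

```python
RATED_P, RATED_V = 500.0, 14.4  # hypothetical amp rating

def draw_regulated(v_supply):
    # Fully regulated: output power held constant, so current
    # must RISE as the supply voltage sags (I = P / V).
    return RATED_P / v_supply

def draw_unregulated(v_supply):
    # Unregulated: output power scales as (V / V0)^2, so current
    # draw actually FALLS along with the supply voltage.
    p_out = RATED_P * (v_supply / RATED_V) ** 2
    return p_out / v_supply

for v in (14.4, 13.0, 12.0):
    print(f"{v:4.1f} V supply: regulated {draw_regulated(v):5.1f} A, "
          f"unregulated {draw_unregulated(v):5.1f} A")
```

So with a long, lossy run, a fully regulated amp would indeed pull more current, but an unregulated (or semi-regulated) one pulls less; it simply makes less power. (This sketch ignores amplifier efficiency, which would scale both curves but not change the comparison.)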
But all this doesn't matter, really. Fusing depends only on the properties
of the wire - how much current the amp draws is irrelevant because fusing is
done to protect the wire. The internal fuse protects the amplifier.