Mike Rivers
 


In a previous article, someone writes:

I agree that such a situation is hypothetical. However, if the only
difference between the two mics is their distance from the source
(say, a flute) and there are no reflections (i.e. this is an anechoic
chamber), then the two signals would be nearly identical at the two
mics (if the two mics are a foot or two apart, say), except for time
displacement (about 1 ms). In other words, it's not a specific phase
angle at a specific frequency, but all frequencies, in the same phase
relationships, arriving at the two mics at different times.


I know you know what you're trying to say <g> but if there's a time
difference between signals, there's a phase relationship (not phase
DIFFERENCE, as people are prone to say) that isn't 0 degrees at all
frequencies. With a constant time delay, the phase shift will increase
from zero to several thousand degrees (depending on how far apart the
mics are and how high up in frequency you wish to measure it).
However, the time difference itself will be constant, obviously.
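
To put rough numbers on that, here's a back-of-the-envelope sketch in
Python (the 343 m/s speed of sound, the one-foot spacing, and the spot
frequencies are just example figures, not anything from the original
post):

SPEED_OF_SOUND = 343.0              # m/s, roughly room temperature

def delay_seconds(spacing_m):
    # Extra travel time to the farther mic.
    return spacing_m / SPEED_OF_SOUND

def phase_shift_deg(delay_s, freq_hz):
    # Phase shift produced by a constant delay at one frequency.
    return 360.0 * freq_hz * delay_s

spacing = 0.30                      # about one foot, in meters
d = delay_seconds(spacing)          # works out to roughly 0.9 ms
for f in (100, 1000, 10000):
    print(f"{f:>6} Hz: {phase_shift_deg(d, f):8.1f} degrees")

The delay is the same 0.9 ms in every line, but the phase shift runs
from about 31 degrees at 100 Hz to over 3000 degrees at 10 kHz, which
is where the "several thousand degrees" comes from.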

When we "correct" an "out of phase" microphone by reversing the
polarity, we add 180 degrees of phase shift to whatever it happens to
be at whatever frequency we choose to look at. Sometimes that sounds
better, sometimes it sounds worse, sometimes it's a tossup. When we
move a mic a few inches relative to another mic (or delay one signal
relative to the other electronically), we change all the phase shift
numbers. By finding the right amount of delay, we can make them all
zero, which usually is a good thing, or at least the most
theoretically correct thing. But at times, there's some aesthetic
benefit to letting a phase difference reduce the amplitude of certain
frequencies.
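
For what it's worth, here's a sketch of where those cancellations land
when a signal is summed with a delayed copy of itself (the 0.9 ms
delay is just the one-foot figure from above, not a recommendation):
nulls fall where the delay amounts to an odd number of half-cycles,
and a polarity flip moves them to the even multiples.

def null_frequencies(delay_s, polarity_flipped=False, max_hz=20000):
    # Cancellation needs a 180-degree (odd half-cycle) total offset.
    # A polarity flip contributes 180 degrees by itself, so the delay
    # then has to supply an even number of half-cycles instead.
    nulls = []
    n = 0
    while True:
        k = (2 * n) if polarity_flipped else (2 * n + 1)
        f = k / (2.0 * delay_s)
        if f > max_hz:
            break
        nulls.append(round(f, 1))
        n += 1
    return nulls

print(null_frequencies(0.0009, max_hz=3000))        # [555.6, 1666.7, 2777.8]
print(null_frequencies(0.0009, True, max_hz=3000))  # [0.0, 1111.1, 2222.2]

Note that the polarity flip puts a null at 0 Hz, so it trades a
scooped midrange for a thinned-out low end; that's one reason the flip
sometimes sounds better and sometimes worse.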



--
I'm really Mike Rivers )
However, until the spam goes away or Hell freezes over,
lots of IP addresses are blocked from this system. If
you e-mail me and it bounces, use your secret decoder ring
and reach me he double-m-eleven-double-zero at yahoo