#1
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
What's the best way to compensate for phase shift (due to latency) when bussing things out in digital?
#2
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
James Price wrote:
> What's the best way to compensate for phase shift (due to latency) when
> bussing things out in digital?

Use software that compensates for it. Just about all of the modern DAW systems deal with this automatically; the total latency of the system is the latency of the slowest plug-in plus the latency of the DAW application. The days of having to line everything up or do batch processing to avoid delays are gone with the days of Session 8.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#3
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On 9/2/2017 12:32 PM, James Price wrote:
> What's the best way to compensate for phase shift (due to latency) when
> bussing things out in digital?

What program (or programs) are you using? Nearly all modern (or newest versions of legacy) DAW programs can read, from the driver, the number of samples by which a particular plug-in delays the signal, and automatically compensate for it. Most modern DAW programs can also do that when going out to an analog processor and back in to a track or an effect return, but you may need to measure and manually enter the round-trip delay. Your software probably offers instructions for doing that. But automatic delay compensation isn't always accurate, so you may have to do a little trimming. Your ears will tell you whether or not it's correct.
--
For a good time, call http://mikeriversaudio.wordpress.com
#4
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On Saturday, September 2, 2017 at 12:09:31 PM UTC-5, Mike Rivers wrote:
> What program (or programs) are you using? Nearly all modern (or newest
> versions of legacy) DAW programs can read, from the driver, the number
> of samples by which a particular plug-in delays the signal, and
> automatically compensate for it. [...]

I'm specifically referring to sending / returning from outboard gear. Do most DAWs handle latency automatically when sending to and returning from external hardware, or is it generally a case of having to measure and manually enter the round-trip delay?
#5
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On 9/2/2017 1:56 PM, James Price wrote:
> I'm specifically referring to sending / returning from outboard gear.
> Do most DAWs handle latency automatically when sending to and returning
> from external hardware, or is it generally a case of having to measure
> and manually enter the round-trip delay?

It depends on the program. That's why I asked what you were using. But the simple answer is: "RTFM"
--
For a good time, call http://mikeriversaudio.wordpress.com
#6
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
James Price wrote:
> I'm specifically referring to sending / returning from outboard gear.
> Do most DAWs handle latency automatically when sending to and returning
> from external hardware, or is it generally a case of having to measure
> and manually enter the round-trip delay?

Okay, that's a different matter altogether. Some systems have an automatic function that will measure the latency in an effects loop. Some do not, and you just have to measure it yourself and line the pips up on the screen.

The good news is that for most analogue processing, the latency is close enough to zero that you don't have to worry about it. You only have to worry about the latency of the A/D and D/A converters. Now, a lot of systems with integrated software and hardware, like the higher levels of ProTools, the Merging Technologies systems, and RADAR, will compensate for this automatically, because the software is designed to work with those converters and the people who designed the software knew precisely what the converter latency was. But for systems that are not so carefully integrated, you're on your own.

Personally, I just use the DAW as a big tape machine: I run the outputs of the DAW into a console and I mix on the console. If I want analogue processing, I put it in the console insert. If I want digital processing, I do it on the DAW. Doing this eliminates all of your latency issues, since all of the channels go through the same converters and so all have the same latency. I understand this technique is not for everyone, however.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
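[Measuring the loop yourself, as described above, usually amounts to sending a known test signal (a click or pip) out through the D/A, recording what comes back through the A/D, and finding the offset. A brute-force pure-Python sketch of the idea; the function name and the sample data are hypothetical, not taken from any DAW:]

```python
import math  # not strictly needed here; kept for consistency with later sketches

def measure_loop_delay(sent, returned):
    """Estimate round-trip delay in samples by sliding the sent test
    click along the returned recording and keeping the lag with the
    highest correlation (brute force, fine for short test signals)."""
    best_lag, best_score = 0, float("-inf")
    n = len(sent)
    for lag in range(len(returned) - n + 1):
        score = sum(s * returned[lag + i] for i, s in enumerate(sent))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A click sent out, coming back 37 samples later at a lower level:
click = [1.0, 0.5, -0.3, 0.1]
loopback = [0.0] * 37 + [0.8 * s for s in click] + [0.0] * 20
print(measure_loop_delay(click, loopback))  # 37
```

[The measured lag is what you would type into the DAW's manual round-trip compensation field, converted to samples at the session rate.]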
#7
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On Saturday, September 2, 2017 at 2:26:17 PM UTC-5, Scott Dorsey wrote:
> Okay, that's a different matter altogether. Some systems have an
> automatic function that will measure the latency in an effects loop.
> Some do not, and you just have to measure it yourself and line the pips
> up on the screen. [...]

Correct me if I'm wrong, but isn't phase frequency-dependent, and not necessarily a one-dimensional constant that can be fixed by delaying a signal? Thus, is it possible to have phase issues when running multiple tracks to multiple compressors within a DAW and/or externally (to outboard gear), despite delay compensation?
#9
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On 9/3/2017 11:50 AM, James Price wrote:
> Correct me if I'm wrong, but isn't phase frequency-dependent, and not
> necessarily a one-dimensional constant that can be fixed by delaying a
> signal?

You're correct here.

> Thus, is it possible to have phase issues when running multiple tracks
> to multiple compressors within a DAW and/or externally (to outboard
> gear), despite delay compensation?

Yes. Any time you have a time difference between two waveforms of the same frequency and amplitude _that are summed together_, the sum will be affected by the time difference. At the proper time difference for a given frequency, the two waveforms will be 180 degrees out of phase, and the sum will cancel to zero. At another time difference, the two waveforms will sum to double amplitude. At other time delays, there will be various degrees of boost or cut.

The trick is that the original waveform and its processed version need to be summed. If you send a track out to a hardware compressor and then bring the compressor back into the mix _and only use the processed track_, then there will be no phase issues, only a time delay which, chances are, will be negligible, but which can be completely corrected if you know the delay time through the A/D and D/A converters (plus the associated computer time necessary to talk to them) and, if any, the delay through the processor.

If you're mixing the outboard-processed and original signals, however, such as with "parallel compression," then, unless you compensate for the time difference between them, the sum will contain frequencies that are cancelled and other frequencies that are emphasized. This is how the jargon has corrupted the original meaning of "phase shift"; more correctly, we speak of "phase problems." What you need to be aware of is unplanned summing of the original and processed tracks.

For example, if you run a snare drum through a compressor, the processed snare track will be mixed with the leakage of the snare in the other drum tracks (it's everywhere), and there you have a potential "phase problem" that you didn't expect. But this is often small enough to be ignored, unless the amplitude of the snare in some other track is pretty close to that of the real snare track.
--
For a good time, call http://mikeriversaudio.wordpress.com
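[The cancellation and doubling described above follow from the trigonometry of summing a tone with a delayed copy of itself: sin(2πft) + sin(2πf(t − d)) has peak amplitude 2·|cos(πfd)|. A small illustrative Python check, nothing more:]

```python
import math

def summed_amplitude(freq_hz, delay_s):
    """Peak amplitude of a unit-amplitude tone summed with a copy of
    itself delayed by delay_s seconds: 2 * |cos(pi * f * d)|."""
    return 2.0 * abs(math.cos(math.pi * freq_hz * delay_s))

delay = 0.0005  # 0.5 ms, roughly 22 samples at 44.1 kHz
for f in (500, 1000, 2000, 3000):
    # 1000 Hz and 3000 Hz cancel; 2000 Hz doubles
    print(f"{f} Hz: summed amplitude = {summed_amplitude(f, delay):.3f}")
```

[A value near 0 is full cancellation (180 degrees out), a value near 2 is full reinforcement; everything in between is the "various degrees of boost or cut" of a comb filter.]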
#10
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
James Price wrote:
> Correct me if I'm wrong, but isn't phase frequency-dependent, and not
> necessarily a one-dimensional constant that can be fixed by delaying a
> signal?

In this case we are speaking specifically of time delay, which is constant and independent of frequency. There are plenty of other sorts of phase shift, but most of them you will not encounter as problems in the digital world.

> Thus, is it possible to have phase issues when running multiple tracks
> to multiple compressors within a DAW and/or externally (to outboard
> gear), despite delay compensation?

What is an "issue"? If you put signal through an analogue equalizer, it will change the phase in a specific way that isn't constant with frequency. That's not an issue; it's just part of what a minimum-phase equalizer does, and it's part of why we use equalizers. There's no need to compensate for it; it's just there.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#11
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On Sunday, September 3, 2017 at 11:24:21 AM UTC-5, Mike Rivers wrote:
> Yes. Any time you have a time difference between two waveforms of the
> same frequency and amplitude _that are summed together_, the sum will
> be affected by the time difference. [...] What you need to be aware of
> is unplanned summing of the original and processed tracks.

As soon as you sum two signals where one was processed with EQ, the phase relationship changes, doesn't it? I'm not talking about a pure sine wave but rather a complex signal. For instance, if you add a 0.5 ms delay to a sound, won't 1k, 3k, 5k, and 7k be 180 degrees out of phase whereas 2k, 4k, 6k, and 8k will still be in phase? Doesn't using a compressor for color add overtones? What happens to the phase between the altered and unaltered material?
#12
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On Sunday, September 3, 2017 at 3:19:42 PM UTC-5, Scott Dorsey wrote:
> What is an "issue"? If you put signal through an analogue equalizer, it
> will change the phase in a specific way that isn't constant with
> frequency. That's not an issue; it's just part of what a minimum-phase
> equalizer does, and it's part of why we use equalizers. There's no need
> to compensate for it; it's just there.

From what I understand, linear-phase EQs do keep phase in check, but have other undesirable issues. As for latency, I've heard that a lot of plugins report none, but there's still a change. Ever checked it with a test tone?
#13
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On 9/3/2017 9:55 PM, James Price wrote:
> As soon as you sum two signals where one was processed with EQ, the
> phase relationship changes, doesn't it? I'm not talking about a pure
> sine wave but rather a complex signal. For instance, if you add a 0.5 ms
> delay to a sound, won't 1k, 3k, 5k, and 7k be 180 degrees out of phase
> whereas 2k, 4k, 6k, and 8k will still be in phase? [...]

What you seem to be describing is comb filtering. Look it up. The phase effects of EQ are quite different, more related to the Kramers–Kronig relations.
--
==
Later...
Ron Capik
#14
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On 9/3/2017 9:55 PM, James Price wrote:
> As soon as you sum two signals where one was processed with EQ, the
> phase relationship changes, doesn't it? I'm not talking about a pure
> sine wave but rather a complex signal.

That depends on the signals and the type of equalization.

> For instance, if you add a 0.5 ms delay to a sound, won't 1k, 3k, 5k,
> and 7k be 180 degrees out of phase whereas 2k, 4k, 6k, and 8k will
> still be in phase?

If you finish the sentence: those frequencies in the delayed version of the signal will be 180 degrees out of phase with those frequencies in the unprocessed signal. A single signal isn't "out of phase" unless you state what it's out of phase with. Oh, and any other frequency will be some amount out of phase with its undelayed counterpart. How you use those two versions of a signal determines whether you have an audio problem, or fix a problem, or create a desirable effect.

> Doesn't using a compressor for color add overtones?

Yes. That's what "color" is.

> What happens to the phase between the altered and unaltered material?

It depends on how long it takes for the signal to pass from the input to the output of the compressor. With an analog compressor, which has essentially no throughput delay, there will be no cancellation when the compressed and uncompressed signals are mixed. This is why "parallel compression" became a popular technique before we had DAWs and plug-ins. On the other hand, if the compressor introduces a delay, then, when you sum its output with the uncompressed signal, you'll get some phase cancellation (typically called "comb filtering").

> From what I understand, linear-phase EQs do keep phase in check, but
> have other undesirable issues.

A linear-phase filter doesn't really exist in the analog world (there are some squirrelly exceptions, so don't get all "what about..." with me). It's a product of digital filtering, which is based on delays. Linear-phase equalizers can introduce several hundred samples of delay.

> As for latency, I've heard that a lot of plugins report none, but
> there's still a change. Ever checked it with a test tone?

All plug-ins that affect the signal cause some delay. Some introduce a negligible amount of delay and don't affect musical timing, or, when combined with an unprocessed version of the same signal, only introduce significant comb filtering at high frequencies or, for a small enough delay, outside of the audio range. Some plug-ins report a theoretical amount of latency, through the audio driver, based on the number of calculations they're doing. It's up to the DAW program to interpret that information and do something with it. Some do a better job than others. And of course, if there's no latency reported, there's no compensation. Nothing in digital audio "just works" all the time, on all systems. Whenever you introduce some processing in the signal path, you need to listen and decide whether you've made the sound better or worse.
#15
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
James Price wrote:
> From what I understand, linear-phase EQs do keep phase in check, but
> have other undesirable issues.

It's possible to do that in the digital domain, not in the analogue domain. The phase remains constant with frequency... but the "undesirable issue" they have is that they don't sound like an equalizer should, because some of the point of the analogue equalizer _is_ the group delay. They're still useful for shelving filters and the like, but they are really a special-purpose thing.

> As for latency, I've heard that a lot of plugins report none, but
> there's still a change. Ever checked it with a test tone?

I haven't seen that with ProTools, and I have checked it with a click. But as I said, I don't mix in the box, so I don't really have much of this to worry about.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#16
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On 9/3/2017 9:55 PM, James Price wrote:
> As soon as you sum two signals where one was processed with EQ, the
> phase relationship changes, doesn't it? I'm not talking about a pure
> sine wave but rather a complex signal.

That depends on the signals and the type of equalization.

> For instance, if you add a 0.5 ms delay to a sound, won't 1k, 3k, 5k,
> and 7k be 180 degrees out of phase whereas 2k, 4k, 6k, and 8k will
> still be in phase?

If you finish the sentence: those frequencies in the delayed version of the signal will be 180 degrees out of phase with those frequencies in the unprocessed signal. A single signal isn't "out of phase" unless you state what it's out of phase with. Oh, and any other frequency will be some amount out of phase with its undelayed counterpart. How you use those two versions of a signal determines whether you have an audio problem, or fix a problem, or create a desirable effect.

> Doesn't using a compressor for color add overtones?

Yes. That's what "color" is. But a compressor adds harmonic distortion, introducing overtones that weren't in the original signal.

> What happens to the phase between the altered and unaltered material?

It depends on how long it takes for the signal to pass from the input to the output of the compressor. With an analog compressor, which has essentially no throughput delay, there will be no cancellation when the compressed and uncompressed signals are mixed. This is why "parallel compression" became a popular technique before we had DAWs and plug-ins. On the other hand, if the compressor introduces a delay, then, when you sum its output with the uncompressed signal, you'll get some phase cancellation (typically called "comb filtering").

> From what I understand, linear-phase EQs do keep phase in check, but
> have other undesirable issues.

A linear-phase filter doesn't really exist in the analog world (there are some squirrelly exceptions, so don't get all "what about..." with me). It's a product of digital filtering, which is based on delays. Linear-phase equalizers can introduce several hundred samples of delay.

> As for latency, I've heard that a lot of plugins report none, but
> there's still a change. Ever checked it with a test tone?

All plug-ins that affect the signal cause some delay. Some introduce a negligible amount of delay and don't affect musical timing, or, when combined with an unprocessed version of the same signal, only introduce significant comb filtering at high frequencies or, for a small enough delay, outside of the audio range. Some plug-ins report a theoretical amount of latency, through the audio driver, based on the number of calculations they're doing. It's up to the DAW program to interpret that information and do something with it. Some do a better job than others. And of course, if there's no latency reported, there's no compensation. Nothing in digital audio "just works" all the time, on all systems. Whenever you introduce some processing in the signal path, you need to listen and decide whether you've made the sound better or worse.
--
For a good time, call http://mikeriversaudio.wordpress.com
#17
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On Monday, September 4, 2017 at 6:04:19 AM UTC-5, Mike Rivers wrote:
> It depends on how long it takes for the signal to pass from the input
> to the output of the compressor. With an analog compressor, which has
> essentially no throughput delay, there will be no cancellation when the
> compressed and uncompressed signals are mixed. This is why "parallel
> compression" became a popular technique before we had DAWs and
> plug-ins. [...]

Well, for example, load a mono file in Logic and copy it to a second track. Invert the polarity of the second track, add a compressor to it, and use extreme threshold and attack settings. Enable auto gain. The tracks won't null, and it's not related to latency. Not really an issue when you're using an analog console and outboard gear. That's all I'm saying.
#18
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On 5/09/2017 3:09 AM, James Price wrote:
> Well, for example, load a mono file in Logic and copy it to a second
> track. Invert the polarity of the second track, add a compressor to it,
> and use extreme threshold and attack settings. Enable auto gain. The
> tracks won't null, and it's not related to latency. Not really an issue
> when you're using an analog console and outboard gear. That's all I'm
> saying.

Alter one track in any way at all - of course it won't null!

geoff
#19
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
James Price wrote:
> Well, for example, load a mono file in Logic and copy it to a second
> track. Invert the polarity of the second track, add a compressor to it,
> and use extreme threshold and attack settings. Enable auto gain. The
> tracks won't null, and it's not related to latency.

Of course they won't null; they are different tracks. The end result is the difference between a compressed track and an uncompressed track. If it nulled out, what would be the point of using compression in the first place?

What you should get is a track with exaggerated dynamics... that is on the quiet passages you will have removed more of the original signal than you will have on the softer passages. What you have done has been to build an expander.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#20
Posted to rec.audio.pro
Compensating for phase shift when bussing things out in digital?
On Monday, September 4, 2017 at 4:11:16 PM UTC-5, Scott Dorsey wrote:
> Of course they won't null; they are different tracks. The end result is
> the difference between a compressed track and an uncompressed track. If
> it nulled out, what would be the point of using compression in the
> first place?

I guess I expected it to null at -60 dB and below, and sum at -10 dB. I also expected that, with adjusted gain, the result would be somewhere below -20 dB. If you don't use compressors to compress but to add harmonics, aren't we back to changing phase?
#21
On 5/09/2017 6:16 AM, geoff wrote:
> On 5/09/2017 3:09 AM, James Price wrote:
>> Well, for example, load a mono file in Logic and copy it to a second track. Invert the polarity of the second track, add a compressor to it and use extreme threshold and attack settings. Enable auto gain. The tracks won't null....
>
> Alter one track in any way at all - of course it won't null !

Yeah I just thought, well duh!

Trevor.
#22
On 5/09/2017 7:11 AM, Scott Dorsey wrote:
> James Price wrote:
>> Well, for example, load a mono file in Logic and copy it to a second track. Invert the polarity of the second track, add a compressor to it and use extreme threshold and attack settings. Enable auto gain. The tracks won't null and it's not related to latency. Not really an issue when you're using an analog console and outboard gear. That's all I'm saying.
>
> Of course they won't null, they are different tracks. The end result is the difference between a compressed track and an uncompressed track. If it nulled out, what would be the point of using compression in the first place?
>
> What you should get is a track with exaggerated dynamics... that is on the quiet passages you will have removed more of the original signal than you will have on the softer passages.

Perhaps you should try that sentence again. :-)
#23
OP
In general, it is a very bad idea to route THE SAME signal over two paths and then re-combine them. You can get comb filtering. I would work hard to avoid such a signal flow. Parallel compression with an analog compressor is an example of violating this rule. I would not even try it with a digital compressor that has any latency.

In general it is not an issue to combine signals from __different__ sources over different paths.

m
#24
James Price wrote:
> On Monday, September 4, 2017 at 4:11:16 PM UTC-5, Scott Dorsey wrote:
>> Of course they won't null, they are different tracks. The end result is the difference between a compressed track and an uncompressed track. If it nulled out, what would be the point of using compression in the first place?
>
> I guess I expected it to null at -60dB below and sum at -10dB. I also expected that, with adjusted gain, the result would be somewhere below -20dB.

If you try it with a 1KC test tone, there is one level at which you should be able to make it null, but it may not stay stable for very long depending on the compressor. The same is true of doing it with an analogue compressor, an analogue signal source, and an analogue console. All the same behaviour.

> If you don't use compressors to compress but to add harmonics, aren't we back to changing phase?

No, that's adding additional frequencies to the original signal; it has nothing to do with changing phase relationships between the ones that are already there.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#25
On Tuesday, September 5, 2017 at 9:39:42 AM UTC-5, Scott Dorsey wrote:
> If you try it with a 1KC test tone, there is one level at which you should be able to make it null, but it may not stay stable for very long depending on the compressor. The same is true of doing it with an analogue compressor, an analogue signal source, and an analogue console. All the same behaviour.
>
>> If you don't use compressors to compress but to add harmonics, aren't we back to changing phase?
>
> No, that's adding additional frequencies to the original signal; it has nothing to do with changing phase relationships between the ones that are already there.

I know in theory it should behave the same way in analog but it simply doesn't, though I'd love to know why.

As soon as you add frequencies by adding overtones or summing tones, isn't the way they interact by adding difference and combination tones, all of which have their own wavelength? Some will be harmonic, some inharmonic, but as I understand it, it most definitely affects phase.

Am I wrong in assuming that phase isn't just the sinusoidal wave you see in your DAW (which is why phase rotators let you adjust more than just the top and bottom of the wave or slip it)? The ones created below your starting pitch will change the wavelength of your complex tone. My understanding is that lower pitch equals longer wavelength, thus changing phase.

For example, take a 110Hz test tone. Now take another test tone and set it to 123.75Hz, and add one more at 165Hz. This gives you a beatless version of a typical Sus2 chord in tune based on a 110Hz root. Now look at a frequency analyzer (sum the 3 test tones). Aside from the expected 110Hz, 123.75Hz and 162.50Hz, you will see new nodes at 52.5Hz, 71.25Hz, 96.25Hz, etc. My interpretation is that the ones below your starting point will change the wavelength of your complex tone (longer), thus changing phase. Now run it through anything that's Class A and crank it so it produces partials on top, harmonic or inharmonic.

This is all before we add, say, a guitar string's inharmonicity and the instrument's resonance properties into the mix. This is merely the pure sounds summed. Set it to 110Hz, 123.47Hz and 164.81Hz if you want to try with equal temperament values.

The point is, when you produce combination tones, and sum two tones in the same space, whether it's done acoustically or electronically makes no difference. You add longer waveforms, correct? Even if the interval is beatless it will cancel half the partials (i.e. comb filtering). This is all stuff that, if you want to geek out, you can try with just about any DAW that has an EQ with analyzer and a test tone generator. To make it messier, record a few strings being plucked on a guitar to separate tracks. It'll just add the strings' and guitar's inherent harmonics.
#27
On 9/5/2017 2:32 PM, James Price wrote:
> I know in theory it should behave the same way in analog but it simply doesn't, though I'd love to know why.

"In theory, theory and practice are identical. In practice, they're different." - Me
--
For a good time, call http://mikeriversaudio.wordpress.com
#28
James Price wrote:
> I know in theory it should behave the same way in analog but it simply doesn't, though I'd love to know why.

Try it. I can make it behave the same way. If it's not... my first guess would be latency issues.

> As soon as you add frequencies by adding overtones or summing tones, isn't the way they interact by adding difference and combination tones, all of which have their own wavelength?

Not necessarily. If you put them into some nonlinear thing, THEN you'll get beat notes, though. But that won't affect the phase of the original components.

> Some will be harmonic, some inharmonic, but as I understand it, it most definitely affects phase.

Phase shift is a time delay. It can be a time delay that is constant with frequency or not constant with frequency, but it is time delay.

> Am I wrong in assuming that phase isn't just the sinusoidal wave you see in your DAW (which is why phase rotators let you adjust more than just the top and bottom of the wave or slip it)?

Most of what you see in your DAW isn't a sinusoidal wave at all, but merely the wave envelope. If you were able to zoom in far enough you would be able to see the actual waveform... and if you did that, and it was a pure tone, then the phase would be the degree to which the waveform was shifted right or left across the screen.

Phase rotators are gadgets that deliberately screw up the phase relationship between the original components of the signal. The end result of this is that the waveform is changed and the waveform envelope becomes more symmetric. But those are _symptoms_ of the non-constant phase shift.

> The ones created below your starting pitch will change the wavelength of your complex tone. My understanding is that lower pitch equals longer wavelength, thus changing phase. For example, take a 110Hz test tone. Now take another test tone and set it to 123.75Hz, and add one more at 165Hz. This gives you a beatless version of a typical Sus2 chord in tune based on a 110Hz root. Now look at a frequency analyzer (sum the 3 test tones). Aside from the expected 110Hz, 123.75Hz and 162.50Hz, you will see new nodes at 52.5Hz, 71.25Hz, 96.25Hz, etc. My interpretation is that the ones below your starting point will change the wavelength of your complex tone (longer), thus changing phase.

I have absolutely no idea what you're talking about, but it doesn't have anything to do with phase shift. Certainly the creation of distortion products has nothing to do with phase shift.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
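"Phase shift is a time delay" and "the waveform shifted right or left across the screen" are the same statement, and it can be verified numerically. A sketch with assumed values (1 kHz tone, 0.25 ms delay - neither from the thread): the phase difference a delay tau imposes at frequency f is 360 * f * tau degrees.

```python
import numpy as np

fs = 48000
f = 1000.0                   # assumed test frequency
tau = 0.25e-3                # assumed delay: 0.25 ms
t = np.arange(fs) / fs       # one second, so FFT bins land on whole Hz

x = np.sin(2 * np.pi * f * t)
x_delayed = np.sin(2 * np.pi * f * (t - tau))    # same tone, shifted in time

# A pure time delay is a phase shift of 360 * f * tau degrees at frequency f.
expected_deg = 360.0 * f * tau                   # 90 degrees here

# Measure it from the FFT bin at f (bin index == frequency at 1 Hz spacing).
phase_x = np.angle(np.fft.rfft(x)[int(f)])
phase_d = np.angle(np.fft.rfft(x_delayed)[int(f)])
measured_deg = np.degrees(phase_x - phase_d) % 360.0
```

The measured bin phase difference matches the analytic value: the delayed copy is simply the same sinusoid slid along the time axis.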
#29
Okay, how about this: listen to a phase rotator. It makes the waveform look different... it clearly injures the phase relationship between the components of that signal. But listen to it, and it has really very little audible effect, if it's a good one. No tonal changes, no dynamic changes. This should be the first hint that nonlinearity and phase shift are not related.

Now... a lot of people talk about "phasiness" when they are meaning "comb filtering." This is because one major source of comb filtering is when a delayed signal is added to the original undelayed signal, and the end result is cancellation that varies with frequency. But this is a _frequency response problem_ that is a product of _both_ the signal summation _and_ the delay. It's not really a "phase thing."
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
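The delayed-copy summation described here has a closed-form response, H(f) = 1 + e^(-j*2*pi*f*tau). A small sketch with an assumed 1 ms delay shows the characteristic comb: nulls at 500 Hz, 1.5 kHz, 2.5 kHz... and +6 dB peaks between them.

```python
import numpy as np

tau = 0.001                  # assumed 1 ms delay between the two paths

# Summing a signal with a tau-delayed copy of itself has the response
#   H(f) = 1 + exp(-j * 2 * pi * f * tau)
def comb_mag(f_hz):
    return np.abs(1.0 + np.exp(-2j * np.pi * f_hz * tau))

# Nulls: the delayed copy arrives half a cycle late, i.e. f = (2k+1)/(2*tau)
null_freqs = (2 * np.arange(4) + 1) / (2 * tau)    # 500, 1500, 2500, 3500 Hz
null_depth = comb_mag(null_freqs)                  # ~0: full cancellation

# Peaks: the copies arrive in phase at f = k/tau, and the sum doubles (+6 dB)
peak_gain_db = 20 * np.log10(comb_mag(1000.0))
```

This is the "frequency response problem" in the post: the cancellation depends on both the summation and the delay, and varies periodically with frequency.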
#30
On Tuesday, September 5, 2017 at 5:48:23 PM UTC-5, Scott Dorsey wrote:
> Phase shift is a time delay. It can be a time delay that is constant with frequency or not constant with frequency, but it is time delay.
>
> [...]
>
> I have absolutely no idea what you're talking about, but it doesn't have anything to do with phase shift. Certainly the creation of distortion products has nothing to do with phase shift.

However, frequency isn't constant, is it? Unless it's a sine wave, my understanding is that it's made of components. And I don't think the last bit has anything to do with distortion but rather what happens when multiple frequencies and their overtones interact.

What do you think happens when you play multiple notes together? Even simpler, what happens to the wavelength of a signal when you go from one note to the next?
#31
On Tuesday, September 5, 2017 at 6:33:27 PM UTC-5, Scott Dorsey wrote:
> Okay, how about this: listen to a phase rotator. It makes the waveform look different... it clearly injures the phase relationship between the components of that signal. But listen to it, and it has really very little audible effect, if it's a good one. No tonal changes, no dynamic changes. This should be the first hint that nonlinearity and phase shift are not related.
>
> [...] It's not really a "phase thing."

We're now talking about two things. One of them is keeping stuff in phase in general. The other: you're asserting that frequency remains constant. Yes, you can get something perfectly in phase, but it will change as soon as the frequency content changes. Do we agree on that?

As for injuring the phase relationship, isn't that what it's intended for? When I'm summing multiple mics, there's no issue. Add the DI from an amp and it comes into play.

We may be getting on the same page here. I never said delay wasn't an issue. I'm saying I don't think it's the only one.
#32
James Price wrote:
> However, frequency isn't constant, is it? Unless it's a sine wave, my understanding is that it's made of components.

Right, every complex signal can be thought of as components with different frequency and phase summed together. The phase isn't really very audible, though. You have to change it a lot before it's noticeable, unlike either frequency or amplitude.

> And I don't think the last bit has anything to do with distortion but rather what happens when multiple frequencies and their overtones interact.

Beat notes _only_ happen when there is something nonlinear going on. When you hear it while tuning a guitar, the nonlinearity is in your ear. But you don't get any new frequencies added if the system is completely linear. That effect is unrelated to phase shift.

> What do you think happens when you play multiple notes together?

They are summed together. You had sin(3t+theta) and sin(5t+theta2) and now you have sin(3t+theta)+sin(5t+theta2). No beats created, not until you put it into a nonlinear mixing function.

> Even simpler, what happens to the wavelength of a signal when you go from one note to the next?

It is likely to change, although of course middle A and A-above-middle-C on a piano both will have 880 Hz components. But this also has nothing to do with phase shift.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
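The claim that linear summation adds no new frequencies while a nonlinearity does can be demonstrated with an FFT. A sketch with assumed tones at 300 Hz and 500 Hz, and a square-law term as a generic stand-in for "a nonlinear mixing function" (none of these values are from the thread):

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
mix = np.sin(2 * np.pi * 300 * t) + np.sin(2 * np.pi * 500 * t)

def tones(x):
    """Frequencies (1 Hz bins) holding significant energy."""
    spec = np.abs(np.fft.rfft(x)) / len(x)
    return set(np.flatnonzero(spec > 0.01).tolist())

# Linear sum: exactly the two original components, no beats anywhere.
linear_tones = tones(mix)

# A square-law nonlinearity creates sum and difference products:
# DC, 200 Hz (500-300), 600, 800 (500+300) and 1000 Hz appear.
nonlinear_tones = tones(mix + 0.5 * mix ** 2)
```

The linear spectrum holds only {300, 500}; only after the nonlinear term do difference and combination tones show up, which is exactly the distinction being drawn.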
#33
James Price wrote:
> We're now talking about two things, one of them is keeping stuff in phase in general. The other, you're asserting that frequency remains constant. Yes, you can get something perfectly in phase but it will change as soon as the frequency content changes. Do we agree on that?

No. You can add distortion products, but the original stuff is still there and the phase relationship between them is not changed just by adding more components. There is no magic going on when you add distortion products. Try it. Look on a scope.

> As for injuring the phase relationship, isn't that what it's intended for? When I'm summing multiple mics, there's no issue. Add the DI from an amp and it comes into play.

Oh, there can be issues with summing multiple mikes too if there is leakage into them..... and it's for the same reason that it's an issue with the DI... with the DI you have a microphone feed and a direct feed and they are shifted in time by a couple milliseconds. When you sum them together, you get comb filtering. But it is the comb filtering that is audible, not the delay.

> We may be getting on the same page here. I never said delay wasn't an issue. I'm saying I don't think it's the only one.

ALL PHASE SHIFT IS DELAY. Some of it is delay that is constant with frequency. Some of it is delay that is not constant with frequency. But it is ALL DELAY.

Delay isn't an issue. Group delay isn't an issue. What is an issue is when you sum signals that are shifted in time together and get comb filtering. It's the _comb filtering_ that is the issue, not the phase shift.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#34
On Tuesday, September 5, 2017 at 7:35:34 PM UTC-5, Scott Dorsey wrote:
> Beat notes _only_ happen when there is something nonlinear going on. When you hear it while tuning a guitar, the nonlinearity is in your ear. But you don't get any new frequencies added if the system is completely linear. That effect is unrelated to phase shift.
>
> [...]

You're assuming only harmonic overtones are present, which isn't the case, in my opinion. Also, the non-linearities of the ear are difference and combination tones generated by the ear; however, I don't think that makes them psychoacoustic, as they can be seen / measured. Differential tones are very much audible. Indeed, combination tones do get obstructed by higher harmonic overtones.
#35
On Tuesday, September 5, 2017 at 7:42:08 PM UTC-5, Scott Dorsey wrote:
> No. You can add distortion products, but the original stuff is still there and the phase relationship between them is not changed just by adding more components. There is no magic going on when you add distortion products. Try it. Look on a scope.

I've looked at it many times. You add overtones and they create combination tones. Add odd and even overtones and see.

> ALL PHASE SHIFT IS DELAY. Some of it is delay that is constant with frequency. Some of it is delay that is not constant with frequency. But it is ALL DELAY.
>
> Delay isn't an issue. Group delay isn't an issue. What is an issue is when you sum signals that are shifted in time together and get comb filtering. It's the _comb filtering_ that is the issue, not the phase shift.

I can agree that all phase shift is delay dependent on frequency and not constant.

That said, comb filtering also applies to the reflection from the floor back into the mic (instant comb filtering), though I think it keeps stuff cleaner than multi mics.
#36
On 6/09/2017 12:42 PM, Scott Dorsey wrote:
> Delay isn't an issue. Group delay isn't an issue. What is an issue is when you sum signals that are shifted in time together and get comb filtering. It's the _comb filtering_ that is the issue, not the phase shift.

Nice that it is trivial to realign them to sample exactness !

geoff
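One common way that sample-exact realignment is done: measure the offset by cross-correlating the two copies, then slide the late one back. A sketch with an assumed 96-sample offset (2 ms at 48 kHz - illustrative numbers, not from the thread):

```python
import numpy as np

fs = 48000
rng = np.random.default_rng(0)
direct = rng.standard_normal(fs // 10)      # 100 ms of wideband "signal"

shift = 96                                  # assumed 2 ms latency, in samples
delayed = np.concatenate([np.zeros(shift), direct])[: len(direct)]

# Estimate the offset from the cross-correlation peak, then slide it back.
corr = np.correlate(delayed, direct, mode="full")
offset = int(np.argmax(corr)) - (len(direct) - 1)
realigned = np.roll(delayed, -offset)       # sample-exact again
```

After the roll, the two copies match sample-for-sample (apart from the tail that was lost to the delay), so summing them no longer produces a comb.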
#37
On Tue, 5 Sep 2017 18:39:00 -0400, Mike Rivers wrote:

>> I know in theory it should behave the same way in analog but it simply doesn't, though I'd love to know why.
>
> "In theory, theory and practice are identical. In practice, they're different." - Me

"The more theory you learn, and the better you learn it, the more you find it is identical to practice." - Me

d
#38
James Price wrote:
> That said, comb filtering also applies to the reflection from the floor back into the mic (instant comb filtering), though I think it keeps stuff cleaner than multi mics.

Yes, that's comb filtering caused by phase shift. The signal is delayed by the additional distance of the reflection. It takes about a millisecond for sound to travel a foot in free air. Move the mike five feet, you have a uniform delay of 5ms at all frequencies. Same thing when you sum two microphones at different places in the room and leakage between them gets comb filtering. That is comb filtering caused by phase shift.

I don't know why you are getting so far afield talking about crazy unrelated stuff like beat notes.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
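The millisecond-per-foot rule of thumb turns a reflection's extra path length directly into a comb-filter null. A sketch assuming ~1130 ft/s for sound and a hypothetical floor bounce that travels 2 ft further than the direct path (both numbers are illustrative):

```python
C_FT_PER_S = 1130.0          # speed of sound: roughly 1 ft per millisecond

def reflection_comb(extra_path_ft):
    """Delay of a reflection, and the first comb null when it sums
    with the direct sound at roughly equal level."""
    tau = extra_path_ft / C_FT_PER_S      # extra travel time, seconds
    first_null_hz = 1.0 / (2.0 * tau)     # delayed copy is 180 deg out here
    return tau * 1000.0, first_null_hz

delay_ms, null_hz = reflection_comb(2.0)  # ~1.77 ms -> first null ~282 Hz
```

Raising the mic (lengthening the bounce path) slides the nulls down in frequency; this is the same delay-and-sum mechanism as the DI-plus-mic case, just with an acoustic delay.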
#39
Geoff wrote:
> On 6/09/2017 12:42 PM, Scott Dorsey wrote:
>> Delay isn't an issue. Group delay isn't an issue. What is an issue is when you sum signals that are shifted in time together and get comb filtering. It's the _comb filtering_ that is the issue, not the phase shift.
>
> Nice that it is trivial to realign them to sample exactness !

You can do that if you want! You don't have to do it if you don't want! It's much nicer than the days when the orchestral spots had to be exactly the correct distance from the main pair, regardless of the geometry of the room and orchestra, because you had only one delay time with sel-sync.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#40
On Tuesday, September 5, 2017 at 9:29:29 PM UTC-4, Geoff wrote:
> On 6/09/2017 12:42 PM, Scott Dorsey wrote:
>> Delay isn't an issue. Group delay isn't an issue. What is an issue is when you sum signals that are shifted in time together and get comb filtering. It's the _comb filtering_ that is the issue, not the phase shift.
>
> Nice that it is trivial to realign them to sample exactness !

The semantics can be confusing: a time delay that is constant over frequency is the same as a phase shift that is NOT constant with frequency. For example, a constant 1 ms delay is 360 deg at 1 kHz, 180 deg at 500 Hz, 90 deg at 250 Hz, etc. That is why combining the same signal over two paths that have a 1 ms time difference will create comb filtering with nulls spaced every 1 kHz.

m
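The arithmetic in that summary checks out; a short sketch of the same numbers (the 1 ms figure is just the example from the post):

```python
TAU = 0.001   # constant 1 ms delay between the two paths

def phase_deg(f_hz, tau_s=TAU):
    """Phase shift a constant time delay imposes at one frequency."""
    return 360.0 * f_hz * tau_s

# Constant delay, non-constant phase: 360 / 180 / 90 deg at 1k / 500 / 250 Hz.
shifts = [phase_deg(f) for f in (1000.0, 500.0, 250.0)]

# Comb nulls sit where the delay is an odd number of half-cycles,
# so they start at 1/(2*TAU) = 500 Hz and repeat every 1/TAU = 1 kHz.
nulls = [(2 * k + 1) / (2 * TAU) for k in range(4)]
spacing = nulls[1] - nulls[0]
```

So a delay that is perfectly "flat" in time still produces a phase shift that grows linearly with frequency, which is why the resulting comb has evenly spaced nulls.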