#1
Someone says "anything over -10db in digital video is distortion" and y
#2
Mike Rivers wrote:
> In article writes:
>> I have a VU indicator which gives me an audible tone at 0 VU, and I set things up with my wife's help, using her eyes, so that a 1 kHz tone generates -20 dBFS once we've gone over to digital. I get satisfactory recordings this way.
> The problem that's becoming more and more common is that there's no convenient way to adjust the input sensitivity of a digital recorder, so you can't calibrate your system.

Well, when in Rome you do as the Romans do. I suspect that the most common means of setting the gain of systems with digital recorders is to adjust something in the analog signal chain that precedes it. For me, that's always the mic preamp. My recommended procedure is to record a loud passage during rehearsal, then set gains as required to ensure that your recording has sufficient (at least 10 dB) headroom, based on visual inspection of the individual track recordings as displayed in full-screen mode, perhaps with some time expansion. A similar means is available when you record off the inserts or direct outs of a console. If your gain staging through the console is reasonable, you just naturally end up with the right levels going into the digital recorder as you set the individual channel trims by ear. Indeed, if the individual channel recordings fail the first test above, it's probable that your gain staging through the console is off and needs adjusting.
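The "at least 10 dB below FS" check described above is simple arithmetic on peak sample values. Here is a minimal hypothetical sketch (the function names are mine, not from any DAW API), assuming samples are normalized floats where ±1.0 is full scale:

```python
import math

def peak_dbfs(samples):
    """Level of the loudest sample, in dB relative to full scale (0 dBFS)."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak)

def headroom_ok(samples, min_headroom_db=10.0):
    """True if the loudest peak sits at least min_headroom_db below FS."""
    return -peak_dbfs(samples) >= min_headroom_db
```

For example, a peak at 0.25 of full scale is about -12 dBFS, which passes the 10 dB test; a peak at 0.9 (about -0.9 dBFS) fails it.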
#3
#4
"Mike Rivers" wrote in message news:znr1119976835k@trad
> In article writes:
>> I suspect that the most common means of setting the gain of systems with digital recorders is to adjust something in the analog signal chain that precedes it.
> I'm sure you're right about that.
>> For me, that's always the mic preamp.
> But good gain management tells us that this isn't always the best way to do it. (And of course we always want to do our best, don't we?) You want to get all the gain you need in as early a stage as you can, and then keep unity gain past that.

My approach does just that.

> So you adjust your preamp gain so that it's plenty high but safely away from clipping, and at that point your peaks are at +20 dBu. If your A/D converter is calibrated so that it clips at +14 dBu ("calibrated" to -14), then your nice clean and quiet preamp will cause your converter to clip.

No, that's not how I do it, nor is it how I advise people to do it. I tell people to adjust the gain (in this case the mic preamp gain) so that they have the desired amount of headroom in the digital domain - by looking at the actual display of an actual recording they make during rehearsal. IOW, if the user manual says that the direct outs or inserts are +4, then I set the input sensitivity on the computer digital audio interface to +4, and then set the trims or mic preamp gains so that the display in the DAW software shows that the loudest part of the loudest music is still recording at least 10 dB below FS. If I work on mixing the tracks and see that the headroom for some channel is less than 10 dB, I nudge the related trim down as required before I record the next session.

> The right place to reduce the level is at the output of the preamp, not the input, or at the input of the A/D converter.

I think that's what I meant by: "For me, that's always the mic preamp."

> Conversely, if your converter has lower sensitivity, you might be tempted to increase the gain of the preamp until it's going into clipping, in an attempt to "record a hot signal."

We see that a lot around here. Hence my constant harping about maintaining about 10 dB of headroom over actual observed levels, as seen in the digital domain.

>> Indeed, if the individual channel recordings fail the first test above, it's probable that your gain staging through the console is off and needs adjusting.
> Exactly - and that's what you can't always adjust in the right place.

If your direct outs or insert points are running at the same nominal level as your digital recorder's input sensitivity is set for, then adjusting the mic preamp to make the digital recorder happy automatically ensures that the rest of the console will be happy, too.

> Most of the time, the only control you have over the direct output of a console (or channel insert send) is the mic preamp trim control.

Agreed.

> You can usually make the record level meters look right, but you might be compromising the signal-to-noise ratio or you might be driving your mic preamp into clipping.

rant on

Where did I say *anything* about meters? I hate meters. I never take them seriously when I am recording or mixing. I ordered my new 02R96 without a meter bridge, and I hope to *never* have any meters attached to it. I expect to make minimal use of the 02R96's built-in metering. The output LEDs on my Mackie SR32 are among its least-used features. BTW, when I do look at them they look *right*, but that's a natural consequence of good hygiene everyplace else.

rant off

> If you're smart, in the latter case, you'll recognize that it's clipping and back it off (and complain that your preamp isn't "hot enough"). But most of the time what will probably happen is that the clipped signal will be recorded and the A/D converter will get blamed.

That's one reason why I tell people to set levels based on the individual channel display(s) at full magnification in the DAW software. Then there's no question about how much headroom there is between peaks and FS, and there's no question about how meter response relates to the music you are recording.
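The +20 dBu versus +14 dBu mismatch discussed above is just arithmetic on the dBu scale, where 0 dBu is defined as 0.775 V RMS. A small illustrative sketch, with function names of my own invention:

```python
import math

DBU_REF_VOLTS = 0.775  # 0 dBu is defined as 0.775 V RMS

def dbu_to_volts(level_dbu):
    """Convert a level in dBu to RMS volts."""
    return DBU_REF_VOLTS * 10 ** (level_dbu / 20)

def converter_headroom_db(peak_dbu, clip_point_dbu):
    """dB of margin between an analog peak and the converter's clip point.
    Negative means the converter clips."""
    return clip_point_dbu - peak_dbu
```

With the numbers from the post, `converter_headroom_db(20, 14)` comes out at -6 dB: the preamp's clean +20 dBu peaks land 6 dB past the point where the converter clips.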
#6
"Mike Rivers" wrote in message news:znr1120045997k@trad
> In article writes:
>> IOW, if the user manual says that the direct outs or inserts are +4, then I set the input sensitivity on the computer digital audio interface to +4 . . . . .
> One of my all-too-often points is that many computer digital audio interfaces have no way to set the input sensitivity unless you do it externally.

Well, *many* is one of those vague words that seems to mean something... ;-) Almost all of the pro interfaces I've used (over 30) have had at least -10 and +4 as options. One recent big nasty surprise was the fact that the AP 24192 lacked the selectable input sensitivity feature that graces virtually all of the rest of the M-Audio line. But it does a pretty credible +4, and that is *the* pro standard, right?

> "External" could be an output level control on the mic preamp, or, if you have no other choice, the input gain of the preamp. But that only works if you have to turn the level to the converter down, not up, to achieve the desired amount of headroom.

The way almost every audio interface stacks up is that the stated sensitivity is way under FS. IOW, a -10 input will generally put FS someplace around 2 volts, and a +4 input runs from 2.5 to about 8 volts for FS.

> It can all be workable, and I know that you have the understanding to make it work. But most of the time when people are faced with this problem, they don't have time to learn what's happening, they work on instinct (or just turn knobs until the meters read right, not listening to what's being recorded), and then ask on r.a.p. after the fact what was wrong with their mic preamp.

That's one reason why I keep telling people to match up the specified numbers on the preamp and the interface, and then *look* at what they are recording.
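Those interface figures can be turned into implied headroom numbers. A hypothetical helper, taking -10 dBV as roughly 0.316 V RMS and +4 dBu as roughly 1.228 V RMS:

```python
import math

def implied_headroom_db(fs_volts, nominal_volts):
    """dB between an interface's full-scale voltage and its nominal level."""
    return 20 * math.log10(fs_volts / nominal_volts)

MINUS_10_DBV_VOLTS = 0.316  # consumer -10 dBV nominal, in RMS volts
PLUS_4_DBU_VOLTS = 1.228    # pro +4 dBu nominal, in RMS volts
```

A -10 input with FS at 2 V gives about 16 dB of headroom over nominal; a +4 input with FS anywhere from 2.5 V to 8 V gives roughly 6 to 16 dB, which matches the spread described above.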
>> If your direct outs or insert points are running at the same nominal level as your digital recorder's input sensitivity is set for, then adjusting the mic preamp to make the digital recorder happy automatically ensures that the rest of the console will be happy, too.
> But that "if" isn't universally true.

Nothing is universally true, but what do you call a console that gets bent out of shape if you take its specs at face value? I call it unprofessional junk. It's not like the console market has a sole source...

> If there were an interface standard to which the industry adhered, we'd have an easier time with this, but marketing pressures more often than not (at least on gear that might be somewhat lacking) cause this to be moved around for the sake of the best advertisable numbers.

I don't see a lot of marketing grease in fudging spec'd output levels.

>> Where did I say *anything* about meters? I hate meters.
> Meters are good, but you have to know what they're telling you.

Plan B: forget about what the meters say and trust the most relevant and accurate empirical results. In the race between ears and meters, I'll take ears every place they work.

> I wouldn't want to be without a meter bridge on my console because it's a quick look at what's working and what needs some attention.

If you haven't noticed, I recommend a paradigm for recording that really doesn't require a lot of attention during tracking. Someplace around 16 tracks, one meter per track starts becoming more of a light show than a relevant tool. I'll take the ability to listen to the track with headphones (a la Mackie's grotesquely misnamed solo buttons) over a meter, any day. That's one reason why consoles have headphone jacks - so you can listen! ;-)

>> That's one reason why I tell people to set levels based on the individual channel display(s) at full magnification in the DAW software.
> I can't think of a program that allows you to do this in real time, though.

It's very hard to do metering right in real time.
So why make it a critical success factor?

> It takes a test recording (or several), and faith that the setting you've established during your test will represent what happens in an actual take.

Admittedly, you have to know something about the work habits of the talent and the equipment. Getting along with and having a feel for talent is one of those big advantages of using flesh-and-blood technical staff as opposed to relying on machines for that. Besides, level setting need only be very approximate during tracking - that's one of the things that headroom is for.

> Having 20 dB of headroom is usually safe, but on a graphic display, that just looks like a skinny wiggly line.

Hence my repeated advice that people only seriously judge waves based on a full-screen view per track. Besides, 20 dB of headroom is pretty excessive in most cases. Did someone say 10 dB headroom? ;-)

> A waveform display is really only useful when you're close to the limit.

Agreed, so there's no reason to make the waveform display during recording a critical success factor, either.

> Keeping a watchful eye on the meters allows you to make adjustments in real time if necessary.

Real-time adjustments make mixing more confusing and more work later on. Bad form for tracking, except in dire emergencies.

> But then this is a skill that you have to develop, along with others.

I see real-time adjustments as a skill I only practice when I'm doing live sound. I think I made my last real-time adjustment of recording levels during tracking about 3 months ago... That ruined the whole day for me! ;-)
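The "skinny wiggly line" point is easy to quantify: a peak N dB below full scale reaches only 10^(-N/20) of the display height. A quick sketch (the function name is mine):

```python
def display_fraction(headroom_db):
    """Fraction of a full-scale waveform display that a peak reaches,
    given its headroom below FS in dB."""
    return 10 ** (-headroom_db / 20)
```

With 20 dB of headroom, the peaks reach only 10% of the display height; at the 10 dB figure discussed above, they reach about 32%.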
#7
Arny Krueger wrote:
> That's one reason why I tell people to set levels based on the individual channel display(s) at full magnification in the DAW software. Then, there's no question about how much headroom there is between peaks and FS, and there's no question about how meter response relates to the music you are recording.

It's really just a different kind of meter, a fast one with history. I agree with you, by the way.

Bob
--
"Things should be described as simply as possible, but no simpler." - A. Einstein