#1
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
For example - the Batman series Blu-Ray release which I gather was given the royal treatment - the original films were scanned and then digitally massaged. While the release looks amazing the mono audio is decent but not as good as the mono hi-fi that had existed for some time as heard on numerous LP's. Presumably they used the best version available for the release.
Another example is the Green Hornet theme, where there's no comparison between the theme song heard on the show vs. a re-recording of it done for the album "The Horn Meets The Hornet" - which is the recording that was used many years later in the movie "Kill Bill", not the TV series recording, which is clearly a different arrangement.

Of course the intended playback devices were small, very limited-range speakers on TVs of the era, but I'm wondering about what happened to it between the musicians in the studio and broadcast. Was the gear used in the recording sessions any different than that used for album releases, or is it a function of how it was subsequently treated/stored?
#2
Posted to rec.audio.pro
On 14/07/2020 18:36, James wrote:
> For example - the Batman series Blu-Ray release which I gather was given the royal treatment - the original films were scanned and then digitally massaged. While the release looks amazing the mono audio is decent but not as good as the mono hi-fi that had existed for some time as heard on numerous LP's. Presumably they used the best version available for the release.

A common way to distribute TV shows to stations in that period was to use 16mm film with an optical sound track. If the original sound recordings made on set are not available, there is no easy way to transfer voice and effects independently of the music, and reconstructing dialogue sync would be a nightmare anyway. While good enough for the broadcasting chain at the time, modern equipment shows up the problems.

With care, well shot 16mm film can be scanned to produce better than normal HD images, and if they used 35mm film, with care, 4K could easily be achieved for image resolution. Optical sound, though, has a frequency range not much better than a telephone, and has, by modern standards, a truly awful distortion spectrum.

For UK releases, it doesn't help that American TV film for broadcast was shot at 24 fps, not the 25 fps used in the UK, so the pitch of the sound is normally 4% higher than it should be, and the audio artifacts produced by pitch shifting are easily noticeable.

As a result, it is far easier to get good quality images on re-releases from 16mm film than even barely acceptable sound. If they exist, better sound can be got from magnetic sound tracks, but they cost more to copy, so were very rarely used for TV show distribution until later on. The LP producers would have had access to the original tape recordings of the music sessions.

--
Tciao for Now!
John.
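John's "4% higher" figure is easy to verify. This is just arithmetic on the frame-rate ratio, assuming a straight 24-to-25 fps speedup with no pitch correction applied:

```python
import math

def pitch_shift_cents(source_fps: float, playback_fps: float) -> float:
    """Pitch shift, in cents (100 cents = 1 semitone), when film shot at
    source_fps is played back at playback_fps with no audio correction."""
    return 1200.0 * math.log2(playback_fps / source_fps)

# US film at 24 fps sped up to the UK rate of 25 fps:
speed_change_pct = (25.0 / 24.0 - 1.0) * 100.0
print(f"speed change: {speed_change_pct:.2f}%")                   # about 4.17%
print(f"pitch shift:  {pitch_shift_cents(24.0, 25.0):.1f} cents") # about 70.7
```

So the 24-to-25 fps transfer runs roughly 4.2% fast and sits about 71 cents sharp, most of a semitone, which is why it is so audible to anyone with pitch memory.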
#3
Posted to rec.audio.pro
James wrote:
> For example - the Batman series Blu-Ray release which I gather was given the royal treatment - the original films were scanned and then digitally massaged. While the release looks amazing the mono audio is decent but not as good as the mono hi-fi that had existed for some time as heard on numerous LP's. Presumably they used the best version available for the release.

What you heard was the original cues recorded on 1/4" tape, dubbed to magfilm (probably 16mm magfilm, which runs a little slower than 7.5ips tape), mixed with the dialogue and effects to a second magfilm generation. If you were lucky. If you weren't lucky, you heard a third generation mixed from second generation stems. If you were really unlucky, you heard an optical sound track cut from the second or third generation mag track. In a lot of cases, that is the only thing left, the final optical release. 35mm optical sounds kind of crappy but 16mm optical is awful.

Still, nobody cared, because the end customer was listening to it on a 3" TV speaker. In many cases, it was standard to just aggressively high-pass and low-pass everything at every step in the mixing process in order to make it come across better on a 3" speaker.

> Was the gear used in the recording sessions any different than that used for album releases or is it a function of how it was subsequently treated/stored?

The album release was often a totally different arrangement, recorded with a different orchestra, with different recording procedures. In some cases (like the Poseidon Adventure soundtrack album) even the vocal tracks were sung by totally different performers with different styles. But in general the whole production process was not hi-fi because it was not necessary for it to be.

--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
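Scott's point about band-limiting at every mixing step compounds quickly. Here is a toy numeric sketch, not a measurement: the 300 Hz / 3 kHz corner frequencies are illustrative guesses at a telephone-like optical-track response, and real dubbing chains used different (often steeper) filters:

```python
import math

def band_response_db(f_hz: float, hp_hz: float = 300.0,
                     lp_hz: float = 3000.0, stages: int = 1) -> float:
    """Magnitude response in dB at f_hz after `stages` cascaded
    first-order band-limiting passes (high-pass corner at hp_hz,
    low-pass corner at lp_hz). Corner values are illustrative guesses,
    not measured film-chain figures."""
    hp = (f_hz / hp_hz) / math.sqrt(1.0 + (f_hz / hp_hz) ** 2)  # 1st-order high-pass
    lp = 1.0 / math.sqrt(1.0 + (f_hz / lp_hz) ** 2)             # 1st-order low-pass
    return stages * 20.0 * math.log10(hp * lp)

# Each mixing generation applies the band-limiting again:
for f in (100.0, 1000.0, 5000.0, 10000.0):
    print(f"{f:7.0f} Hz: 1 pass {band_response_db(f):6.1f} dB, "
          f"3 passes {band_response_db(f, stages=3):6.1f} dB")
```

Even this gentle first-order model loses about 10 dB at 100 Hz and 10 kHz on a single pass, and three generations triple that loss in dB, which is why a third-generation optical track sounds so much duller than the session tape.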
#5
Posted to rec.audio.pro
TV audio was amplitude-modulation-based until the switch to stereo broadcasts (U.S., early '80s). Europe and Asia had stereo OTA TV several years before us - while America as usual was figuring out which war it could get itself into...
#6
Posted to rec.audio.pro
Theckmah, the retarded dumb-****, gibbered in message ...

> TV audio was amplitude-modulation-based until the switch to stereo broadcasts (U.S., early '80s).

As always, the village idiot has no idea what he's on about. NTSC audio was FM, not AM.
#7
Posted to rec.audio.pro
On 28/07/2020 12:25, None wrote:
> As always, the village idiot has no idea what he's on about. NTSC audio was FM, not AM.

In PAL land, I have a memory that the audio was FM on a subcarrier and the video was inverse-modulated AM, so interference tended to show as black spots. Before PAL, we used 405 lines, where more signal gave a brighter spot, so you noticed when a car drove past or someone turned a light off and the switch sparked.

--
Tciao for Now!
John.
#8
Posted to rec.audio.pro
On Tue, 28 Jul 2020 13:21:04 +0100, John Williamson wrote:

> In PAL land, I have a memory that the audio was FM on a subcarrier and the video was inverse-modulated AM, so interference tended to show as black spots. Before PAL, we used 405 lines, where more signal gave a brighter spot, so you noticed when a car drove past or someone turned a light off and the switch sparked.

The worst thing with white-up was that the signal amplitude was proportional to brightness, so it was impossible to find the black level. Sync would start to roll on bright scenes, and the brightness would breathe in and out as the scene's peak brightness changed. Once we changed to black-up, every frame had the same amplitude, black level was always the same so you could use a simple diode clamp, and a simple level sampler would pull the sync off. Happy days.

d
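Don's diode clamp has a direct software analogue: sample the back porch of each line, which is known to sit at black in a black-up system, and subtract the measured DC offset. A toy sketch; the sample positions and the 0.3 black level are made up for the demo and are not real video timings:

```python
def clamp_black_level(line_samples, porch_start, porch_end, black_ref=0.3):
    """Restore the DC level of one video line by forcing the mean of the
    back-porch samples (known to sit at black in a black-up system) to
    black_ref. Sample indices and the 0.3 level are illustrative only."""
    porch = line_samples[porch_start:porch_end]
    offset = sum(porch) / len(porch) - black_ref
    return [s - offset for s in line_samples]

# A toy line: sync tip at 0.0, back porch at black (0.3), then picture
# content, all drifted upward by a 0.15 DC error:
line = [0.0] * 5 + [0.3] * 5 + [0.6, 0.8, 1.0, 0.7]
drifted = [s + 0.15 for s in line]
restored = clamp_black_level(drifted, porch_start=5, porch_end=10)
```

Because every black-up line carries the same reference amplitude, this per-line correction works no matter how bright the picture content is, which is exactly the property the white-up system lacked.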
#9
Posted to rec.audio.pro
Don Pearce wrote:
> The worst thing with white-up was that the signal amplitude was proportional to brightness, so it was impossible to find the black level. Sync would start to roll on bright scenes and the brightness would breathe in and out as the scene's peak brightness changed.

For audio people, the first thing is that if the peak brightness was too high, the thing would overmodulate and the splatter would cause buzzing in the audio. This was most common with character generators set to make bright white titles... the buzzing continued as long as the titles were on the screen, and of course the audio people were blamed. Video is evil.

--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#10
Posted to rec.audio.pro
On Tuesday, July 28, 2020 at 7:26:04 AM UTC-4, None wrote:
> Theckmah, the retarded dumb-****, gibbered in message ...
>> TV audio was amplitude-modulation-based until the switch to stereo broadcasts (U.S., early '80s).
> As always, the village idiot has no idea what he's on about. NTSC audio was FM, not AM.

Thank you.
#11
Posted to rec.audio.pro
Ty Ford:
By AM I meant not the AM band but the format. I read such, in print, over thirty years ago, before the internet took off.
#13
Posted to rec.audio.pro
Peter Irwin wrote:
Sorry, RCA offered Armstrong only one million - which was a ridiculous lowball, but it is better to be alive and moderately rich than dead and seriously rich.

> wrote:
>> Ty Ford: By AM I meant not the AM Band but the format. I read such, in print, over thirty years ago before the internet took off.
> Still wrong. NTSC TV used FM from 1941. This meant that Major Armstrong was supposed to get a percentage of the television manufacturing business until his patents ran out in 1950. RCA did not want to pay him on this basis, and this resulted in one of the largest and messiest lawsuits ever. (That being said, he should have just taken the piddly $10 million from RCA instead of pursuing the lawsuit that led to his suicide, even if his widow won.)
> Peter.
#14
Posted to rec.audio.pro
Peter Irwin:
Then what the hell did I read back in 1980-something? MTS stereo was an improvement in broadcast TV sound, not just spatially, but in the way it was carried on the broadcast channel. I could have sworn the documentation I was reading mentioned AM modulation prior to the MTS transition. And I'm not referring back to the pre-war days, when program audio was simulcast over a local AM radio station, either.
#16
Posted to rec.audio.pro
geoff:
"Morning"? What does time of day have to do with anything?
#17
Posted to rec.audio.pro
wrote:
> Peter Irwin: Then what the hell did I read back in 1980-something?

I don't know. You can tell the pix carrier is AM and the sound carrier is FM by the way they degrade as the signal strength drops. You'll have a good sound carrier as the picture gets worse and worse, and then all of a sudden the sound drops out of full FM quieting. The effect is sudden, unlike the effect on the picture. There were some folks who were arguing in favor of FM encoding of the video signal too, but that was too much bandwidth for consumer grade detectors in 1939. So instead we got vestigial sideband AM.

> MTS stereo was an improvement in broadcast TV sound, not just spatially, but in the way it was carried on the broadcast channel. I could have sworn the documentation I was reading mentioned AM modulation prior to the MTS transition.

Nobody actually used MTS in the US, sadly. Some stations broadcast it, but hardly anyone was set up to receive it. It still didn't fix the central issue, which is that video people don't care about sound.

--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#18
Posted to rec.audio.pro
Scott Dorsey wrote:
> Nobody actually used MTS in the US, sadly.

Then how was I getting stereo sound OTA broadcast from '84-85 - after which my family started subscribing to cable?
#19
Posted to rec.audio.pro
Scott Dorsey wrote:
> For audio people, the first thing is that if the peak brightness was too high, the thing would overmodulate and the splatter would cause buzzing in the audio. This was most common with character generators set to make bright white titles... the buzzing continued as long as the titles were on the screen and of course the audio people were blamed. Video is evil. --scott

There's a comedy show, "Tim & Eric: Awesome Show Great Job!", and they used those old character generators (some of the show was intentionally built around a public access channel aesthetic). The sound wasn't done with the old equipment, but they'd still have the noise.

--
Les Cargill
#20
Posted to rec.audio.pro
wrote:
> Scott Dorsey wrote:
>> Nobody actually used MTS in the US, sadly.
> Then how was I getting stereo sound OTA broadcast from '84-85 - after which my family started subscribing to cable?

MTS sets existed... but not many people bought them. And there were not many broadcasts distributed in stereo.

--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
#21
Posted to rec.audio.pro
Scott Dorsey:
> MTS sets existed... but not many people bought them.

I heard Miami Vice in stereo the first time it was broadcast as such.
#22
Posted to rec.audio.pro
Theckmah, the 'tarded dumb-****, wrote in message ...

> By AM I meant not the AM Band but the format.

Still wrong, li'l buddy, whatever the **** you think you meant.

> I read such, in print, over thirty years ago before the internet took off.

You can't even remember what you read yesterday, dumb ****.