#1
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
For example - the Batman series Blu-ray release, which I gather was given the royal treatment: the original films were scanned and then digitally massaged. While the release looks amazing, the mono audio is decent but not as good as the mono hi-fi that had existed for some time, as heard on numerous LPs. Presumably they used the best version available for the release.
Another example is the Green Hornet theme, where there's no comparison between the theme song heard on the show and the re-recording of it done for the album "The Horn Meets The Hornet" - which is the recording that was used many years later in the movie "Kill Bill", not the TV series recording, which is clearly a different arrangement. Of course the intended playback devices were the small, very limited-range speakers in TVs of the era, but I'm wondering what happened to the music between the musicians in the studio and broadcast. Was the gear used in the recording sessions any different from that used for album releases, or is it a function of how it was subsequently treated/stored?
#2
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
On 14/07/2020 18:36, James wrote:
> For example - the Batman series Blu-ray release, which I gather was given the royal treatment: the original films were scanned and then digitally massaged. While the release looks amazing, the mono audio is decent but not as good as the mono hi-fi that had existed for some time, as heard on numerous LPs. Presumably they used the best version available for the release.

A common way to distribute TV shows to stations in that period was 16mm film with an optical sound track. If the original sound recordings made on set are not available, there is no easy way to transfer voice and effects independently of the music, and reconstructing dialogue sync would be a nightmare anyway. While good enough for the broadcasting chain at the time, modern equipment shows up the problems.

With care, well-shot 16mm film can be scanned to produce better than normal HD images, and if they used 35mm film, with care, 4K could easily be achieved for image resolution. Optical sound, though, has a frequency range not much better than a telephone and, by modern standards, a truly awful distortion spectrum.

For UK releases, it doesn't help that American TV film for broadcast was shot at 24 fps, not the 25 fps used in the UK, so the film was simply run fast and the pitch of the sound is normally 4% higher than it should be; the audio artifacts produced by pitch shifting are easily noticeable, so it was rarely corrected.

As a result, it is far easier to get good quality images on re-releases from 16mm film than even barely acceptable sound. If they exist, better sound can be got from magnetic sound tracks, but they cost more to copy, so they were very rarely used for TV show distribution until later on. The LP producers would have had access to the original tape recordings of the music sessions.

--
Tciao for Now!
John.
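To put numbers on the 24 fps to 25 fps speed-up John describes, here is a quick back-of-envelope sketch in Python; the two frame rates are the only given values, everything else is derived from them:

import math

fps_film = 24.0   # rate American TV film was shot at
fps_pal = 25.0    # UK/PAL rate the film was run at for broadcast

speedup = fps_pal / fps_film                                     # playback speed ratio
print(f"speed-up: {(speedup - 1) * 100:.2f}%")                   # ~4.17% faster
print(f"pitch shift: {12 * math.log2(speedup):.2f} semitones")   # ~0.71 of a semitone sharp
print(f"a 50-minute episode runs {50 / speedup:.1f} minutes")    # ~48 minutes

That lines up with the "normally 4% higher" figure above, and with the choice to simply run the film fast rather than apply a pitch correction whose artifacts would be more noticeable.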
#3
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
James wrote:
> For example - the Batman series Blu-ray release, which I gather was given the royal treatment: the original films were scanned and then digitally massaged. While the release looks amazing, the mono audio is decent but not as good as the mono hi-fi that had existed for some time, as heard on numerous LPs. Presumably they used the best version available for the release.

What you heard was the original cues recorded on 1/4" tape, dubbed to magfilm (probably 16mm magfilm, which runs a little slower than 7.5 ips tape), mixed with the dialogue and effects to a second magfilm generation. If you were lucky. If you weren't lucky, you heard a third generation mixed from second-generation stems. If you were really unlucky, you heard an optical sound track cut from the second- or third-generation mag track. In a lot of cases that is the only thing left, the final optical release.

35mm optical sounds kind of crappy, but 16mm optical is awful. Still, nobody cared, because the end customer was listening to it on a 3" TV speaker. In many cases it was standard to just aggressively high-pass and low-pass everything at every step in the mixing process in order to make it come across better on a 3" speaker.

> Was the gear used in the recording sessions any different from that used for album releases, or is it a function of how it was subsequently treated/stored?

The album release was often a totally different arrangement, recorded with a different orchestra, with different recording procedures. In some cases (like the Poseidon Adventure soundtrack album) even the vocal tracks were sung by totally different performers with different styles. But in general the whole production process was not hi-fi, because it was not necessary for it to be.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
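An aside on the magfilm speed Scott mentions: 16mm film carries 40 frames per foot, so at 24 frames/s it moves at 0.6 ft/s, i.e. 7.2 in/s - the "little slower than 7.5 ips" above. And here is an illustrative sketch of the "aggressively high-pass and low-pass everything" treatment; the 100 Hz and 5 kHz corner frequencies are assumptions chosen to mimic a small-TV-speaker/optical-track bandwidth, not figures from the thread:

import numpy as np
from scipy.signal import butter, sosfilt

def tv_chain_bandlimit(audio, fs, lo_hz=100.0, hi_hz=5000.0, order=4):
    # Band-limit a mono signal roughly the way a 1960s TV mix chain would have.
    # lo_hz and hi_hz are illustrative guesses, not documented studio practice.
    sos = butter(order, [lo_hz, hi_hz], btype="bandpass", fs=fs, output="sos")
    return sosfilt(sos, audio)

# Example: run one second of white noise through the band-limiting.
fs = 48_000
noise = np.random.default_rng(0).standard_normal(fs)
narrowed = tv_chain_bandlimit(noise, fs)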
#4
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
#5
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
TV audio was amplitude-modulation-based until the aforementioned switch to stereo broadcasts (U.S. early '80s). Europe and Asia had stereo OTA TV several years before us - while America as usual was figuring out which war it could get itself into...
#6
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
Theckmah, the retarded dumb-****, gibbered in message ...

> TV audio was amplitude-modulation-based until the aforementioned switch to stereo broadcasts (U.S. early '80s).

As always, the village idiot has no idea what he's on about. NTSC audio was FM, not AM.
#7
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
On 28/07/2020 12:25, None wrote:
> As always, the village idiot has no idea what he's on about. NTSC audio was FM, not AM.

In PAL land, I have a memory that the audio was FM on a subcarrier and the video was inverse-modulated AM, so interference tended to show as black spots. Before PAL, we used 405 lines, where more signal gave a brighter spot, so you noticed when a car drove past or someone turned a light off and the switch sparked.
--
Tciao for Now!
John.
#8
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
On Tue, 28 Jul 2020 13:21:04 +0100, John Williamson wrote:

> In PAL land, I have a memory that the audio was FM on a subcarrier and the video was inverse-modulated AM, so interference tended to show as black spots. Before PAL, we used 405 lines, where more signal gave a brighter spot, so you noticed when a car drove past or someone turned a light off and the switch sparked.

The worst thing with white-up was that the signal amplitude was proportional to brightness, so it was impossible to find the black level. Sync would start to roll on bright scenes, and the brightness would breathe in and out as the scene's peak brightness changed. Once we changed to black-up, every frame had the same amplitude and the black level was always the same, so you could use a simple diode clamp, and a simple level sampler would pull the sync off. Happy days.

d
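A toy illustration of Don's white-up versus black-up point: under positive modulation the peak carrier level tracks the brightest thing in the picture, so there is no fixed reference to clamp to, whereas under negative modulation the sync tip is always the peak. The level numbers below are made up for illustration and are not the 405-line or PAL specification values:

import numpy as np

def scan_line(video, modulate, sync_level, sync_samples=10):
    # One scan line: a sync pulse followed by the modulated picture content.
    return np.concatenate((np.full(sync_samples, sync_level), modulate(video)))

# Positive ("white-up") modulation: more carrier = brighter, sync at the bottom.
pos = lambda v: 0.30 + 0.70 * v      # black = 0.30, peak white = 1.00
# Negative ("black-up") modulation: more carrier = darker, sync at the top.
neg = lambda v: 0.73 - 0.63 * v      # black = 0.73, peak white = 0.10

for name, peak_white in [("dim scene", 0.3), ("bright scene", 1.0)]:
    video = np.linspace(0.0, peak_white, 100)                  # brightness along the line
    pos_peak = scan_line(video, pos, sync_level=0.00).max()    # varies with scene content
    neg_peak = scan_line(video, neg, sync_level=1.00).max()    # always the sync tip
    print(f"{name}: white-up peak = {pos_peak:.2f}, black-up peak = {neg_peak:.2f}")

The constant black-up peak is what lets a simple diode clamp sit on the sync tips and recover a stable black level.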
#9
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
On Tuesday, July 28, 2020 at 7:26:04 AM UTC-4, None wrote:
> Theckmah, the retarded dumb-****, gibbered in message ...
>> TV audio was amplitude-modulation-based until the aforementioned switch to stereo broadcasts (U.S. early '80s).
> As always, the village idiot has no idea what he's on about. NTSC audio was FM, not AM.

Thank you.
#10
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?
Ty Ford:
By AM I meant not the AM band but the format. I read as much, in print, over thirty years ago, before the internet took off.
#11
Posted to rec.audio.pro
Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?