Posted to rec.audio.pro
Robert Orban
Subject: Was lowered fidelity of 60's tv show music a function of how it was recorded or something else in the production pipeline?

In article ,
says...
For example - the Batman series Blu-Ray release which I gather was given the
royal treatment - the original films were scanned and then digitally massaged.
While the release looks amazing the mono audio is decent but not as good as the
mono hi-fi that had existed for some time as heard on numerous LP's. Presumably
they used the best version available for the release.

Another example is the Green Hornet theme where there's no comparison between
the theme song heard on the show vs a re-recording of it done for the album "The
Horn Meets The Hornet" - which is the recording that was used many years later
in the movie "Kill Bill", not the tv series recording which is clearly a
different arrangement.

Of course the intended playback devices were small, very limited-range speakers
on tv's of the era, but I'm wondering about what happened to it between the
musicians in the studio and broadcast.


One reason for the mixing style of the '60s was that at the time, the AT&T
network connections from the networks' production centers to their
affiliates limited audio bandwidth to 5 kHz, so network affiliates except
in the originating cities (New York and Los Angeles) suffered from very
low-fi audio.
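
If you want to hear roughly what that 5 kHz ceiling did to program audio, here
is a quick simulation I'd suggest (my own sketch, not a model of the actual
AT&T line equipment): a windowed-sinc low-pass at 5 kHz applied to a signal
containing a 1 kHz tone and a 10 kHz tone. The filter length and window choice
are arbitrary illustration values.

```python
import numpy as np

def lowpass_5khz(signal, sample_rate=44100, cutoff=5000.0, taps=513):
    """Windowed-sinc FIR low-pass simulating a ~5 kHz bandwidth limit."""
    n = np.arange(taps) - (taps - 1) / 2
    fc = cutoff / sample_rate            # normalized cutoff frequency
    h = 2 * fc * np.sinc(2 * fc * n)     # ideal low-pass impulse response
    h *= np.hamming(taps)                # window to reduce ripple
    h /= h.sum()                         # unity gain at DC
    return np.convolve(signal, h, mode="same")

# Test signal: equal-amplitude tones at 1 kHz (passes) and 10 kHz (blocked)
sr = 44100
t = np.arange(sr) / sr
x = np.sin(2 * np.pi * 1000 * t) + np.sin(2 * np.pi * 10000 * t)
y = lowpass_5khz(x, sr)

# Compare spectral peaks: the 10 kHz tone should be strongly attenuated
spec = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), 1 / sr)
ratio = (spec[np.argmin(np.abs(freqs - 10000))] /
         spec[np.argmin(np.abs(freqs - 1000))])
```

After filtering, the 10 kHz component is essentially gone while the 1 kHz tone
survives - which is why anything above roughly 5 kHz (cymbal shimmer, string
overtones, sibilance) never made it past the network feed to the affiliates.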

If I recall correctly, this changed in the '80s with the advent of BTSC
stereo audio.

--Bob Orban