Posted to rec.audio.high-end
From: ST
Subject: How pure is the signal when it reaches our ears?

On Wednesday, 22 January 2014 11:12:32 UTC+8, KH wrote:
...
Well, the point is that once leaving the amplifier, all signals are analog. Whatever front end you use, your speakers are oblivious, and the physics of sound transmission from speaker to ear - for any given room and ear - are identical irrespective of whether the signal started as analog or digital.

Agreed. But with vinyl the recorded sound is subjected to further distortion, and that is where the difference lies. The accepted practice when recording music is to place the microphones close to the source so that distortion (reverberation) is largely avoided. Yet the sound together with that distortion is what we actually hear once it leaves the source. Here the term "distortion" is used to refer to anything that alters the original signal.
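To make "reverberation as distortion" concrete, here is a minimal sketch of how a dry, close-miked signal picks up a room's signature: convolve it with a room impulse response. The decaying-noise impulse response, the 0.3 s decay time, and the 440 Hz test burst are all illustrative assumptions, not a measured room.

import numpy as np

# A dry (close-miked) signal acquires the room's "distortion" by
# convolution with the room's impulse response. The IR here is
# synthetic decaying noise standing in for a measured response.
rng = np.random.default_rng(0)
fs = 48_000                                    # sample rate in Hz

t = np.arange(fs // 4) / fs                    # 0.25 s of time
dry = np.sin(2 * np.pi * 440 * t) * np.hanning(t.size)   # short tone burst

decay = np.exp(-np.arange(fs // 2) / (0.3 * fs))          # ~0.3 s decay
impulse_response = rng.standard_normal(fs // 2) * decay

wet = np.convolve(dry, impulse_response)       # burst plus room tail
print(f"dry: {dry.size} samples, wet: {wet.size} samples (tail added by room)")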

Therefore, the sound played back through the loudspeakers should sound identical to the source, because it goes through the same impediments while travelling from the speakers to the ears. (Please ignore other limitations for now; I know recorded sound can never sound like live sound.)

That brings us to another question: what was captured by the microphones? We assume that all the sound emitted from the source can be captured by the microphone. In reality, the microphones capture only a tiny fraction of the sound, and so do our ears. The differences are the distance and the inherent defects of our hearing mechanism, followed by the brain, which filters what it hears, for example by ignoring early reverberation. The ear hears a fraction of a bigger, "distorted" sound, whereas the microphones hear a fraction of clean, undistorted sound at close range. Finally, recording engineers add back the right amount of "distortion" (reverberation) to make the sound correct to our ears.
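How tiny a fraction? A back-of-the-envelope estimate, assuming an ideal point source radiating uniformly into free space (no room at all): the intercepted share of radiated power is just the diaphragm area divided by the area of the spherical wavefront at that distance. The 25 mm capsule and the two distances are illustrative numbers, not figures from the thread.

import math

# Fraction of a source's radiated acoustic power that a microphone
# diaphragm intercepts, assuming uniform spherical radiation.
def captured_fraction(diaphragm_diameter_m: float, distance_m: float) -> float:
    diaphragm_area = math.pi * (diaphragm_diameter_m / 2) ** 2
    sphere_area = 4 * math.pi * distance_m ** 2   # inverse-square spreading
    return diaphragm_area / sphere_area

for dist in (0.3, 3.0):   # close-miked vs. typical listening distance
    frac = captured_fraction(0.025, dist)         # 25 mm capsule
    print(f"{dist} m: {frac:.2e} of the radiated power")
# 0.3 m: ~4.3e-04 (0.04 %); 3.0 m: ~4.3e-06 (0.0004 %)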

Now back to playback. In either digital or analogue, what happens is that the tiny energy captured, probably less than 0.001% (maybe a couple more zeros?) of the original sound, is amplified to represent the actual sound. In the case of vinyl, the chain adds further distortion due to the mechanical process; digital escapes that distortion. Now, which of these two versions arrives at our ears with the right amount of "distortion" as heard by us at the live event? And what kind of measurement should we take into consideration when judging sound quality as we perceive it? Here I am referring to what the ears hear and what matters to us, not measurements made with an oscilloscope or other instruments, which are not what we hear in real life.
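For a sense of scale, restoring a signal whose captured power is some tiny fraction of the original is a simple gain calculation in decibels. The fractions below are the post's own guesses (0.001% and "a couple more zeros"), not measured values.

import math

# Power gain (in dB) needed to bring a captured fraction back to unity.
def gain_db(power_fraction: float) -> float:
    return 10 * math.log10(1 / power_fraction)

for frac in (1e-5, 1e-7):   # 0.001 % and two more zeros
    print(f"fraction {frac:.0e}: about {gain_db(frac):.0f} dB of power gain")
# 1e-05 -> 50 dB, 1e-07 -> 70 dB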

So whatever "wigglyness" happens between the speaker and your ear happens to all such signals irrespective of origin.

Agreed. Since "wigglyness" is part of the equation and important to our subjective assessment of sound, why are we eliminating it from the vinyl vs. digital discussion by offering proof such as the video mentioned in the first post? The common argument used to be that analogue (vinyl) is too distorted to be accepted as the preferred sound, when in reality "distorted" sound is what we hear. Shouldn't the question be how much, and what type of, "distortion" is needed for more realistic playback of recorded sound? Is vinyl somehow getting it right?

P.S. I do not play vinyl.