Posted to rec.audio.pro
Subject: Bit Rate versus Sample Rate

On 16/05/2019 11:09 PM, mcp6453 wrote:
> I've been asked to explain bit rate versus sample rate to a
> non-technical audience. Can anyone think of a good *analogy* to help
> explain the difference?


Sample rate is a fixed figure: it must be at least double the highest
frequency you want to capture (the Nyquist criterion).
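
A quick worked sketch in Python, assuming a 20 kHz upper limit for
audible content (the usual figure, not something from the post above):

    # Nyquist: the sample rate must be at least twice the highest
    # frequency to be captured.
    highest_frequency_hz = 20_000
    minimum_sample_rate_hz = 2 * highest_frequency_hz
    print(f"Minimum sample rate: {minimum_sample_rate_hz} Hz")  # 40000 Hz
    # CD audio uses 44100 Hz, comfortably above this minimum.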

Bit rate in linear systems (LPCM) is the sample rate multiplied by the
bit depth (and by the number of channels, for the stream as a whole).
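
As a sketch of that arithmetic, using CD-audio figures (44.1 kHz,
16-bit, stereo) purely as an example:

    sample_rate_hz = 44_100   # samples per second, per channel
    bit_depth = 16            # bits per sample
    channels = 2              # stereo

    # LPCM bit rate = sample rate x bit depth x channel count
    bit_rate_bps = sample_rate_hz * bit_depth * channels
    print(f"LPCM bit rate: {bit_rate_bps} bit/s")          # 1411200 bit/s
    print(f"             = {bit_rate_bps / 1000} kbit/s")  # 1411.2 kbit/s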

In non-linear systems the bit rate is simply the rate at which digital
data is delivered. It has no direct relationship to the LPCM figure,
other than that lower bit rates equate to lower fidelity. The exception
is where lossless data compression has been applied: although the bit
rate is lower than that of the equivalent-spec LPCM, the decoded output
should be bit-identical.
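
A small sketch of that comparison; the lossy and lossless figures below
are illustrative assumptions, not measurements of any particular file:

    lpcm_bit_rate_kbps = 1411.2     # 44.1 kHz / 16-bit / stereo reference
    lossy_bit_rate_kbps = 320.0     # e.g. a high-rate lossy stream
    lossless_bit_rate_kbps = 850.0  # a typical lossless-compressed figure

    for name, rate in [("lossy", lossy_bit_rate_kbps),
                       ("lossless", lossless_bit_rate_kbps)]:
        ratio = rate / lpcm_bit_rate_kbps
        print(f"{name}: {rate} kbit/s ({ratio:.0%} of the LPCM rate)")

    # The lossless stream decodes to output bit-identical to the LPCM
    # original; the lossy stream does not, and its bit rate is only a
    # rough guide to fidelity.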

geoff