Posted to rec.audio.opinion,rec.audio.tech
Radium
Poll: WMA vs. MP3

wrote:
Well, that explains a lot behind your question.

Dropping CD-quality audio to an effective bit rate of
20 kbit/s is VERY likely going to sound perfectly
LOUSY no matter what encoder you use. That's a
compression ratio of about 70:1.


Okay.
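And for what it's worth, that 70:1 figure checks out. A quick sanity check in Python, assuming 16-bit stereo at 44.1 kHz as the "CD quality" baseline:

# CD audio: 44,100 samples/s x 16 bits x 2 channels
cd_bits_per_sec = 44_100 * 16 * 2        # 1,411,200 bit/s
target_bits_per_sec = 20_000             # the 20 kbit/s stream
print(cd_bits_per_sec / target_bits_per_sec)   # ~70.6, i.e. roughly 70:1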

So your question "WMA vs MP3" really boils down
to which miserable piece of **** audio you are willing
to tolerate.

The answer is neither; they both suck.


Well, if my bandwidth is suffering, I don't mind 20 kbps of WMA. Just
make sure it is monaural with at least a 44.1 kHz sample rate. Oh,
and make sure it is 44.1 kHz and monaural before it is compressed
as well; I don't want aliasing or any of the artifacts associated with
sample-rate conversion. IOW, the sample rate of the digital audio should be
the same [and at least 44.1 kHz] before compression and after
compression, as in the sketch below.
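Something like this minimal sketch, using Python to call ffmpeg (the input file name, the wmav2 encoder choice, and the exact flags are just my assumptions for illustration, not anything WMA itself requires):

import subprocess

# Force the source to 44.1 kHz mono *before* the lossy step, then encode
# WMA at 20 kbit/s with no further sample-rate change afterward.
subprocess.run([
    "ffmpeg",
    "-i", "source.wav",   # hypothetical 44.1 kHz source file
    "-ar", "44100",       # keep the sample rate at 44.1 kHz
    "-ac", "1",           # monaural
    "-c:a", "wmav2",      # ffmpeg's WMA v2 encoder
    "-b:a", "20k",        # the low-bandwidth 20 kbps case
    "out.wma",
], check=True)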

As for MP3 and other non-WMA compression schemes: f--k those s--ts. If I am
going to listen to digital audio compressed with something
other than WMA, then it needs to be at least 44.1 kHz, monaural, and
no less than 320 kbps. Even in 128 kbps MP3s I notice the ear-foaming
degradation.
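If I had to make one anyway, it would be the same pipeline as the WMA sketch above, just swapping in an MP3 encoder at the 320 kbps floor (again, libmp3lame and the file names are my own assumptions here):

import subprocess

# Same constraints for MP3: 44.1 kHz mono end to end, no less than 320 kbit/s.
subprocess.run([
    "ffmpeg",
    "-i", "source.wav",       # hypothetical 44.1 kHz mono source
    "-ar", "44100",           # no sample-rate change through the chain
    "-ac", "1",               # monaural
    "-c:a", "libmp3lame",     # ffmpeg's LAME-based MP3 encoder
    "-b:a", "320k",           # 320 kbps minimum
    "out.mp3",
], check=True)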

I really don't get how anyone can stand the noise that occurs in MP3
and other non-WMA compression formats.