I'm beginning to think...
On Wednesday, June 26, 2013 7:00:02 AM UTC-7, Arny Krueger wrote:
"Audio_Empire" wrote in message
...
The organizer of this session had
assembled (with my help) a bunch of compressed files. He ripped some CDs
at different data rates: 32, 64, 128, 192, 384 kbps (MP3) and FLAC. I
supplied some internet radio at 128 and 192 kbps (one of them was a live
concert).
Nobody (except me and one other guy) could really hear any statistically
significant difference. The vast majority of the 15 "high-enders" there
were wrong more than 50% of the time! Most said that they really heard
no difference in anything above 64kbps! They couldn't hear the obvious
compression artifacts in the music at 128 kbps, which surprised me.
Based on past performance, I doubt that a time-synced, level-matched DBT was
involved. Let's get that cleared up first.
There's nothing to clear up, AFAICS. Had the results been positive, i.e. everyone heard the artifacts, then I would say that a carefully level-matched and time-synced DBT was important to the outcome. But they weren't asked to hear differences between cables or amps or even DACs; they were asked to hear artifacts in compressed audio. Even though the levels were only matched to within about a dB using a Radio Shack hand-held digital sound level meter (you know the one), the important thing is that almost no one could hear the artifacts. I could hear them, one other well-known Bay Area audiophile could hear them, and the rest could not!
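For what it's worth, "matched to within about a dB" can also be checked numerically instead of with a hand-held meter. A minimal sketch (the function names and the 1 dB tolerance are my own, purely illustrative choices) comparing the RMS levels of two clips:

```python
import math

def rms_db(samples):
    """RMS level of a block of samples, expressed in decibels."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

def level_matched(a, b, tolerance_db=1.0):
    """True if the two clips' RMS levels differ by no more than tolerance_db."""
    return abs(rms_db(a) - rms_db(b)) <= tolerance_db

# Hypothetical example: a 440 Hz tone, and the same tone attenuated by 0.5 dB.
ref = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(44100)]
gain = 10 ** (-0.5 / 20)  # -0.5 dB as a linear factor
test = [gain * s for s in ref]
print(level_matched(ref, test))  # within 1 dB, so True
```

The point of the sketch is only that a half-dB offset passes a 1 dB tolerance while a grossly mismatched clip would not; real level matching for a DBT would of course work on the actual decoded audio files.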