#1
Posted to rec.audio.high-end
"Audio Empire" wrote in message
> On the playback end, it was D/A converters that were not able to do a full 16 bits linearly (early Philips players (Magnavox) didn't even try. They used 14-bit D/A converters, and the little Magnavox FD-1000 sounded MUCH better than the Japanese 16-bit units of the day).

The above account ignores the fact that oversampling was used to obtain 16-bit performance from 14-bit parts. For all practical purposes, the converters were 16 bit. The claim that there was a significant and large audible difference has been investigated with DBTs and found to be yet another audiophile myth.

> They also had really crude multi-pole anti-aliasing filters and produced, what would be considered today, unacceptable levels of quantization error.

As a rule there are no anti-aliasing filters in playback devices. Aliasing is only possible in ADCs and resamplers. Quantization error and aliasing are orthogonal effects and exist independently. Something that addresses one generally has no effect on the other. The fix for aliasing is better filters, and the fix for quantization error is not filtering but rather randomizing schemes such as dither. Therefore the statement that "crude multi-pole anti-aliasing filters ... produced, what would be considered today, unacceptable levels of quantization error" is a technical impossibility. Thus the above claim must also be dismissed as an audiophile myth on the grounds that it is a confused misuse of technical terminology.

> The first generations of Sony CD players were just terrible, and even with good, modern CDs they sound simply wretched. I have an acquaintance who still uses a Sony CDP-101 (the first publicly available CD player, IIRC) and thinks it's just fine. Of course, he's 84 and deaf as a post. Anyone would have to be to put up with that wretchedness!

I still have an operational CDP-101 and so does a friend. They are both well-maintained and sound good. I once had a CDP-101 that had problems with its servo chips, and it did indeed sound bad - it didn't track most CDs. In the late 1980s Stereo Review used several teams of audiophiles to investigate the sound quality of CDP-101s via DBTs and found only tiny, barely audible differences, and then only with very specific kinds of program material or artificial test signals.
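A minimal numpy sketch of the dither point above: re-quantizing a quiet tone to 14 bits without dither leaves error that is correlated with the signal (harmonic distortion), while TPDF dither turns it into benign, signal-independent noise. The 1 kHz tone, the -60 dBFS level, and the bit depth are illustrative assumptions, not figures from the thread.

```python
import numpy as np

fs = 44100
t = np.arange(fs) / fs
x = 10 ** (-60 / 20) * np.sin(2 * np.pi * 1000 * t)   # 1 kHz tone at -60 dBFS

q = 2.0 / 2 ** 14                                      # 14-bit step (full scale = +/-1)
rng = np.random.default_rng(0)
tpdf = rng.uniform(-q / 2, q / 2, x.size) + rng.uniform(-q / 2, q / 2, x.size)

plain = np.round(x / q) * q            # undithered re-quantization
dith = np.round((x + tpdf) / q) * q    # TPDF-dithered re-quantization

for name, y in (("undithered", plain), ("TPDF dither", dith)):
    err = y - x
    spec = np.abs(np.fft.rfft(err * np.hanning(err.size)))
    harmonics = sum(spec[k * 1000] for k in range(2, 10))   # bins are 1 Hz apart
    print(f"{name:12s} error RMS {np.sqrt(np.mean(err ** 2)):.2e}, "
          f"energy at tone harmonics {harmonics:.2e}")
```

The undithered case shows its error piled up at multiples of the tone frequency; the dithered case trades a slightly higher error RMS for error that no longer tracks the signal.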
#2
Posted to rec.audio.high-end
On Mon, 9 Aug 2010 17:05:12 -0700, Arny Krueger wrote (in article ):

> "Audio Empire" wrote in message

>> On the playback end, it was D/A converters that were not able to do a full 16 bits linearly (early Philips players (Magnavox) didn't even try. They used 14-bit D/A converters, and the little Magnavox FD-1000 sounded MUCH better than the Japanese 16-bit units of the day).

> The above account ignores the fact that oversampling was used to obtain 16-bit performance from 14-bit parts. For all practical purposes, the converters were 16 bit.

No, the D/A converters were 14-bit. They used 14-bit converters because Philips believed (and rightly so) that the then-current 16-bit DACs weren't very linear. The fact that they used 4X oversampling to achieve 16-bit resolution is irrelevant to my statement.

> The claim that there was a significant and large audible difference has been investigated with DBTs and found to be yet another audiophile myth.

Sorry. I had both the Sony CDP-101 and the Philips-Magnavox FD-1000, and I beg to differ. The Sony sounded awful (still does) and the little Maggie was much more listenable (and still is). I ended up giving the Sony to a friend - he didn't like it either.

>> They also had really crude multi-pole anti-aliasing filters and produced, what would be considered today, unacceptable levels of quantization error.

> As a rule there are no anti-aliasing filters in playback devices. Aliasing is only possible in ADCs and resamplers.

Nyquist requires that the upper frequency response limit of the reconstructed waveform (the Nyquist frequency) be half of the sampling rate, and that the signal at the sampling rate must not have sufficient amplitude to be quantifiable. This means that the reconstruction filter must be very steep to avoid there being significant signal at 44.1 kHz. Meaning that above the Nyquist frequency (in this case 22.05 kHz) the cutoff needs to be as absolute as possible, leading to designs of filters with as many as six poles (before the advent of cheap digital filtering, that is). Some players (like the aforementioned Philips) used oversampling to lessen the burden of the reconstruction filter (which I've always heard generally called an anti-aliasing filter, although you are right, technically: anti-aliasing is used to bandwidth-limit an analog signal BEFORE quantization in order to satisfy the Nyquist theorem) by allowing said filter to be less steep.

> Therefore the statement that "crude multi-pole anti-aliasing filters ... produced, what would be considered today, unacceptable levels of quantization error" is a technical impossibility. Thus the above claim must also be dismissed as an audiophile myth on the grounds that it is a confused misuse of technical terminology.

I'm afraid the confusion is on your end, my friend. My statement - "They also had really crude multi-pole anti-aliasing filters AND produced, what would be considered today, unacceptable levels of quantization error" - is actually two statements linked by "and". If I had meant to say what you characterize above, I would have said: "They also had really crude multi-pole anti-aliasing filters WHICH produced, what would be considered today, unacceptable levels of quantization error." But I clearly didn't say (or mean) that.

>> The first generations of Sony CD players were just terrible, and even with good, modern CDs they sound simply wretched. I have an acquaintance who still uses a Sony CDP-101 (the first publicly available CD player, IIRC) and thinks it's just fine. Of course, he's 84 and deaf as a post. Anyone would have to be to put up with that wretchedness!

> I still have an operational CDP-101 and so does a friend. They are both well-maintained and sound good.

Er, it's hard to account for a reaction like that...
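A small scipy sketch of the trade-off described above: how steep the analog reconstruction filter must be when the first spectral image sits just above 24.1 kHz (straight 44.1 kHz playback) versus near 156 kHz (4x oversampling, Philips-style). The 0.5 dB ripple and 90 dB stopband targets are arbitrary illustrative figures, and an elliptic prototype stands in for the LC ladders actually used.

```python
import numpy as np
from scipy.signal import ellipord

def analog_order(passband_hz, stopband_hz, ripple_db=0.5, atten_db=90):
    """Minimum elliptic low-pass order for the given analog spec."""
    wp, ws = 2 * np.pi * passband_hz, 2 * np.pi * stopband_hz
    order, _ = ellipord(wp, ws, ripple_db, atten_db, analog=True)
    return order

# Straight 44.1 kHz playback: the first image starts near 44.1 - 20 = 24.1 kHz,
# so the analog reconstruction filter gets only a ~4 kHz transition band.
print("no oversampling :", analog_order(20_000, 24_100), "poles")

# 4x oversampling: the digital filter pushes the first image out near
# 176.4 - 20 = 156.4 kHz, leaving the analog filter almost three octaves to roll off.
print("4x oversampling :", analog_order(20_000, 156_400), "poles")
```

The second case needs only a handful of poles, which is exactly the "less steep" analog filter the oversampling players were after.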
#3
Posted to rec.audio.high-end
"Audio Empire" wrote in message
> On Mon, 9 Aug 2010 17:05:12 -0700, Arny Krueger wrote (in article ):

>> "Audio Empire" wrote in message

>>> On the playback end, it was D/A converters that were not able to do a full 16 bits linearly (early Philips players (Magnavox) didn't even try. They used 14-bit D/A converters, and the little Magnavox FD-1000 sounded MUCH better than the Japanese 16-bit units of the day).

>> The above account ignores the fact that oversampling was used to obtain 16-bit performance from 14-bit parts. For all practical purposes, the converters were 16 bit.

> No, the D/A converters were 14-bit.

They were in an oversampling configuration. This is well known. The objective of the oversampling was a trade-off of speed, which was in abundance, for linearity, which was costly.

> They used 14-bit converters because Philips believed (and rightly so) that the then-current 16-bit DACs weren't very linear.

In 1972 (ten years earlier) I worked with 16-bit, 200 kHz DACs that had 1/2-bit linearity and monotonicity. The only problem with 16-bit DACs was their price before the CD player market ramped up production.

> The fact that they used 4X oversampling to achieve 16-bit resolution is irrelevant to my statement.

Your statement was false because of the false claims it included, including "...the little Magnavox FD-1000 sounded MUCH better than the Japanese 16-bit units of the day". In fact, they both were sonically transparent, or very nearly so, to the extent that they absolutely blew away the analog equipment of the day, given proper source material to play, which was readily available from the onset.

>> The claim that there was a significant and large audible difference has been investigated with DBTs and found to be yet another audiophile myth.

> Sorry. I had both the Sony CDP-101 and the Philips-Magnavox FD-1000, and I beg to differ. The Sony sounded awful (still does) and the little Maggie was much more listenable (and still is). I ended up giving the Sony to a friend - he didn't like it either.

I don't believe that we have ever been treated to your technical measurements or the results of proper statistically-analyzed, time-synched, level-matched comparisons of them. The extant well-controlled listening tests involving them tell a different story - both units were eminently listenable given that they were in good working order.

>>> They also had really crude multi-pole anti-aliasing filters and produced, what would be considered today, unacceptable levels of quantization error.

>> As a rule there are no anti-aliasing filters in playback devices. Aliasing is only possible in ADCs and resamplers.

> Nyquist requires that the upper frequency response limit of the reconstructed waveform (the Nyquist frequency) be half of the sampling rate, and that the signal at the sampling rate must not have sufficient amplitude to be quantifiable. This means that the reconstruction filter must be very steep to avoid there being significant signal at 44.1 kHz.

Now you've had a chance to review the relevant technical material and change your story. The filters are now properly identified as "reconstruction filters". Yet you present this all like it's a correction to my statement, which was correct all along.

> Meaning that above the Nyquist frequency (in this case 22.05 kHz) the cutoff needs to be as absolute as possible, leading to designs of filters with as many as six poles (before the advent of cheap digital filtering, that is).

If you think that the original CD players had 6-pole filters, then you are again not telling it like it was. If memory serves, there were about 15 inductors and 15 capacitors per channel in the reconstruction filters of the CDP-101. This was pretty typical. Any second-year engineering student knows that filters like these have about 30 poles (in pairs).
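A rough numpy sketch of the speed-for-linearity trade described above: running a 14-bit quantizer at 4x the sample rate with first-order error feedback pushes most of the quantization noise above the audio band, so the in-band noise approaches 16-bit territory. The -20 dBFS 1 kHz tone and the simple first-order shaper are illustrative assumptions; Philips' actual digital filter and shaper differed in detail.

```python
import numpy as np

fs, osr, bits = 44_100, 4, 14
q = 2.0 / 2 ** bits                      # 14-bit step size (full scale = +/-1)

n = fs * osr
t = np.arange(n) / (fs * osr)
x = 0.1 * np.sin(2 * np.pi * 1000 * t)   # 1 kHz tone at -20 dBFS, at the 4x rate

# First-order error-feedback noise shaping: quantize to 14 bits at 4x speed,
# feeding each sample's quantization error back into the next sample.
y = np.empty(n)
e = 0.0
for i in range(n):
    v = x[i] + e
    y[i] = np.round(v / q) * q
    e = v - y[i]

def inband_noise_db(sig, ref, rate):
    """Total error power between 20 Hz and 20 kHz, tone bin excluded."""
    spec = np.fft.rfft(sig - ref)
    f = np.fft.rfftfreq(len(sig), 1 / rate)
    band = (f > 20) & (f < 20_000) & (np.abs(f - 1000) > 30)
    return 10 * np.log10(2 * np.sum(np.abs(spec[band]) ** 2) / len(sig) ** 2)

plain = np.round(x[::osr] / q) * q       # ordinary 14-bit quantization at 44.1 kHz
print("plain 14-bit        :", round(inband_noise_db(plain, x[::osr], fs), 1), "dB re full scale")
print("14-bit, 4x + shaping:", round(inband_noise_db(y, x, fs * osr), 1), "dB re full scale")
```

The shaped, oversampled case lands roughly 14 dB (a bit over two bits) below the plain 14-bit case in the audio band, which is the sense in which the 14-bit parts delivered "16-bit" performance.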
#4
Posted to rec.audio.high-end
On Tue, 10 Aug 2010 18:01:57 -0700, Arny Krueger wrote (in article ):

> "Audio Empire" wrote in message

>> On Mon, 9 Aug 2010 17:05:12 -0700, Arny Krueger wrote (in article ):

>>> "Audio Empire" wrote in message

>>>> On the playback end, it was D/A converters that were not able to do a full 16 bits linearly (early Philips players (Magnavox) didn't even try. They used 14-bit D/A converters, and the little Magnavox FD-1000 sounded MUCH better than the Japanese 16-bit units of the day).

>>> The above account ignores the fact that oversampling was used to obtain 16-bit performance from 14-bit parts. For all practical purposes, the converters were 16 bit.

>> No, the D/A converters were 14-bit.

> They were in an oversampling configuration. This is well known. The objective of the oversampling was a trade-off of speed, which was in abundance, for linearity, which was costly.

>> They used 14-bit converters because Philips believed (and rightly so) that the then-current 16-bit DACs weren't very linear.

> In 1972 (ten years earlier) I worked with 16-bit, 200 kHz DACs that had 1/2-bit linearity and monotonicity. The only problem with 16-bit DACs was their price before the CD player market ramped up production.

Yes, so the ones used by many CD manufacturers weren't very linear, and the ones that were linear cost more than mass-market manufacturers wanted to spend. In the early days, numerous things were tried to get around this problem: lower-bit D/As, oversampling, single-bit D/As that used the same bit for everything (ensuring the steps were absolutely the same, and therefore linear), etc. Eventually, the D/As got better (laser trimming, etc.) and the sound of CD players improved. Today, they're pretty close to "perfect".

>> The fact that they used 4X oversampling to achieve 16-bit resolution is irrelevant to my statement.

> Your statement was false because of the false claims it included, including "...the little Magnavox FD-1000 sounded MUCH better than the Japanese 16-bit units of the day". In fact, they both were sonically transparent, or very nearly so, to the extent that they absolutely blew away the analog equipment of the day, given proper source material to play, which was readily available from the onset.

My experience tells me otherwise. Sorry about that.

>>> The claim that there was a significant and large audible difference has been investigated with DBTs and found to be yet another audiophile myth.

>> Sorry. I had both the Sony CDP-101 and the Philips-Magnavox FD-1000, and I beg to differ. The Sony sounded awful (still does) and the little Maggie was much more listenable (and still is). I ended up giving the Sony to a friend - he didn't like it either.

> I don't believe that we have ever been treated to your technical measurements or the results of proper statistically-analyzed, time-synched, level-matched comparisons of them. The extant well-controlled listening tests involving them tell a different story - both units were eminently listenable given that they were in good working order.

Nor have we been treated to your test results and technical measurements or the results of proper statistically-analyzed, time-synched, level-matched comparisons of them, either.

>>>> They also had really crude multi-pole anti-aliasing filters and produced, what would be considered today, unacceptable levels of quantization error.

>>> As a rule there are no anti-aliasing filters in playback devices. Aliasing is only possible in ADCs and resamplers.

>> Nyquist requires that the upper frequency response limit of the reconstructed waveform (the Nyquist frequency) be half of the sampling rate, and that the signal at the sampling rate must not have sufficient amplitude to be quantifiable. This means that the reconstruction filter must be very steep to avoid there being significant signal at 44.1 kHz.

> Now you've had a chance to review the relevant technical material and change your story. The filters are now properly identified as "reconstruction filters". Yet you present this all like it's a correction to my statement, which was correct all along.

I was just using the standard parlance, as I explained above (and you "conveniently" snipped the part where I SAID THAT YOU WERE RIGHT, but that these reconstruction filters are commonly called anti-aliasing filters, even though that term is not strictly correct. I'm not just addressing you in this thread, you know? And if you're going to debate with me, I'd appreciate it if you would try to be a little more honest in your snippage, OK?).

>> Meaning that above the Nyquist frequency (in this case 22.05 kHz) the cutoff needs to be as absolute as possible, leading to designs of filters with as many as six poles (before the advent of cheap digital filtering, that is).

> If you think that the original CD players had 6-pole filters, then you are again not telling it like it was. If memory serves, there were about 15 inductors and 15 capacitors per channel in the reconstruction filters of the CDP-101. This was pretty typical. Any second-year engineering student knows that filters like these have about 30 poles (in pairs).

Even worse. I had forgotten and memory "didn't serve". It's been a long time, so what?
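A toy numpy sketch of the single-bit idea mentioned above: a first-order delta-sigma modulator whose output has only two levels, so there are no mismatched step sizes to be non-linear, and resolution comes from low-pass filtering a very fast 1-bit stream. The 64x rate, the test tone, and the moving-average "reconstruction filter" are all simplifications for illustration, not any particular player's design.

```python
import numpy as np

def delta_sigma_1bit(x):
    """First-order delta-sigma modulator: output is only ever -1 or +1."""
    out = np.empty_like(x)
    acc, prev = 0.0, 0.0
    for i, s in enumerate(x):
        acc += s - prev                 # integrate the error vs. the last output
        prev = 1.0 if acc >= 0 else -1.0
        out[i] = prev
    return out

fs_os = 64 * 44_100                     # heavily oversampled 1-bit stream (64x here)
t = np.arange(fs_os // 10) / fs_os      # 100 ms of signal
x = 0.5 * np.sin(2 * np.pi * 1000 * t)  # 1 kHz tone at half scale

bits = delta_sigma_1bit(x)
# A moving average stands in for the analog reconstruction low-pass.
recon = np.convolve(bits, np.ones(64) / 64, mode="same")
err = recon[1000:-1000] - x[1000:-1000]
print("RMS error after averaging:", f"{np.sqrt(np.mean(err ** 2)):.4f}")
```

Every output step of such a converter is identical by construction, which is why the single-bit approach sidestepped the level-matching problem that plagued early multibit DACs.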
#5
Posted to rec.audio.high-end
"Audio Empire" wrote in message
>>>> The claim that there was a significant and large audible difference has been investigated with DBTs and found to be yet another audiophile myth.

>>> Sorry. I had both the Sony CDP-101 and the Philips-Magnavox FD-1000, and I beg to differ. The Sony sounded awful (still does) and the little Maggie was much more listenable (and still is). I ended up giving the Sony to a friend - he didn't like it either.

>> I don't believe that we have ever been treated to your technical measurements or the results of proper statistically-analyzed, time-synched, level-matched comparisons of them. The extant well-controlled listening tests involving them tell a different story - both units were eminently listenable given that they were in good working order.

> Nor have we been treated to your test results and technical measurements or the results of proper statistically-analyzed, time-synched, level-matched comparisons of them, either.

How quickly some forget evidence that does not fit with their prejudices.

Masters, Ian G. and Clark, D. L., "Do All CD Players Sound the Same?", Stereo Review, pp. 50-57 (January 1986)

I don't know how many times I've posted this reference, just in RAHE.
#6
Posted to rec.audio.high-end
In article , "Arny Krueger" wrote:

> In 1972 (ten years earlier) I worked with 16-bit, 200 kHz DACs that had 1/2-bit linearity and monotonicity.

I'm curious about this experience. In what way were you using this gear?
#7
Posted to rec.audio.high-end
"Jenn" wrote in message
> In article , "Arny Krueger" wrote:

>> In 1972 (ten years earlier) I worked with 16-bit, 200 kHz DACs that had 1/2-bit linearity and monotonicity.

> I'm curious about this experience. In what way were you using this gear?

It was part of an EAI 680 hybrid computer system. The other major component was an IBM 1130 computer. We used it to solve differential equations and do simulations, some of which related directly to audio. We also used it as a digital record/playback system for recordings of music. It sounded pretty good!