  #1   Jezza
Is CD burning at slower speeds better?

I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings - is there any truth to this?
  #8   Mark A
Is CD burning at slower speeds better?

"Jezza" wrote in message
om...
I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings- is there any truth to this?


I have seen tests of various burners and CDs. They usually record with
fewer errors at a speed below the maximum but above the minimum
for a particular recorder. I think these tests are on the web somewhere, but
I don't recall where.


  #11   Bubba
Is CD burning at slower speeds better?


"Jezza" wrote in message
om...
I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings- is there any truth to this?


If the question is whether a slower burn speed increases the odds
of a disc playing in more CD players, the answer is yes. Many older CD
players may not be able to play CDs burned at
4x or higher. Media type and
burner type are also factors.

Bubba


  #14   Jeff Wiseman
CD burning at slower speeds (long)



Jezza wrote:

I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings - is there any truth to this?



Well, not necessarily more reliable, but occasionally, more accurate.

Let me try to explain :-)

For burning CDs of any sort there is an issue of errors when
writing to the disc. If I understand correctly, this is normally not a
big issue at all, since the writer shouldn't be rated at a speed
at which it cannot write without errors. Error correction is applied
on playback. In general, data accuracy can be
maintained fairly well, even on audio discs.

However, the real issue that relates to the OP's question is
a second one, specific to audio CDs, and it applies even if there
are no read errors during playback. Playing back an audio
CD happens in real time. A clock signal is recovered from the
bits coming off the CD and is used to convert the digital
signal back to an analog one. All conversion is done relative to
this clock signal. If the pits burned into the CD aren't spaced
perfectly, this clock signal "jumps around" a little. In the
digital world this is called "jitter". Depending on how the
digital-to-analog conversion is done in the sound equipment, this
jitter can cause the output analog signal to be reproduced with
slight time-domain distortions. This effect is extremely
small and tends to be noticeable only on highly resolving audio systems.
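
To make the mechanism concrete, here is a small Python sketch (my own
illustration, not from the thread; the 2 ns RMS jitter figure is an
assumption picked only to make the effect visible) of a 1 kHz tone whose
correct sample values are converted at slightly irregular instants:

    import numpy as np

    fs = 44100.0    # CD sample rate (Hz)
    f = 1000.0      # test tone (Hz)
    n = np.arange(2048)
    samples = np.sin(2 * np.pi * f * n / fs)  # the exact data on the disc

    rng = np.random.default_rng(0)
    jitter_rms = 2e-9                          # assumed 2 ns RMS clock error
    t_jittered = n / fs + rng.normal(0.0, jitter_rms, n.size)

    # Converting the right value at the wrong instant is equivalent to
    # converting a slightly wrong value at the right instant:
    effective = np.sin(2 * np.pi * f * t_jittered)
    err = effective - samples
    rms = np.sqrt(np.mean(err ** 2))
    print(f"RMS error from jitter: {rms:.2e} ({20 * np.log10(rms):.0f} dBFS)")

The data is bit-perfect throughout; only the conversion timing is disturbed,
which is the distinction being drawn here between "reliable" and "accurate".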

An analogy might be a record player that varies from moment to
moment in speed. The mechanical information on the record is all
there but it's not being converted to analog audio well. The
"time" element of converting the mechanical data into analog
music is being corrupted. There is a type of "time domain
distortion" being introduced in the conversion process.

Obviously, if you are pushing the writer to its maximum speed,
there might be subtle variations in the spacing of the pits. The
data on the disc is exact (i.e., "reliable"), but the inconsistent
spacing might not produce as "accurate" a sound when played back
in an audio system. Using a lower write speed simply lets the
writer operate well within its capability and produce a more
even (accurate) spacing of the pits. Again, you can get the
correct data off the disc; the problem is in converting it to
real-time analog music.

An interesting side note here is that some audiophiles have discovered
that when they made a duplicate of a store-purchased CD, the
duplicate seemed to sound "better" for some reason. The
explanation seems to be that commercial CDs are "pressed" rather
than "burned". The pits on a pressed CD may not have
the clean edges and spacing that a CD burned at a lower speed
has, resulting in potentially higher jitter than the duplicate
might generate. Since the actual data values on the original are
copied exactly onto the duplicate, it's 100% reliable and no data
is lost; however, the duplicate CD may have less jitter when
played back in real time.

Note again that this effect is very subtle and in general is not
going to be detected on many systems, even if the writer's
accuracy differs greatly between its fastest and slowest speeds.
On highly resolving systems, jitter's effects can be
noticed; however, the difference in jitter between a CD burned at 2x
and one burned at 12x on a 12x burner (all other things being equal)
might still be quite subtle, and as a result it is the subject of much
discussion in the high-end audio forums :-)

Bottom line? My opinion is that the effects are subtle enough
that it is more a matter of how much time I have to wait for the
thing to burn :-) If you have a fairly high-quality sound system,
knock the burn rate down a notch or so if you have the time, but
if the CDs are for playing in the car or on a walkman, blast away as
fast as you can.

All this is only IMHO of course!

- Jeff
  #17   Murray Peterson
CD burning at slower speeds (long)

Jeff Wiseman wrote:


Well, not necessarily more reliable, but occasionally, more accurate.

Let me try to explain :-)


Only if your explanation is accurate -- this one isn't even close. If the
data is retrievable without errors, then the burner was running at an
acceptable speed.

[snip]
A clock signal is recovered from the bits coming off the CD


This is false -- there are no clock signals contained in the CD data. The
clock signal is generated by the CD player's circuitry. The CD itself
contains encoded data and correction information only, and even that
information is intentionally scattered across the disk.

If the pits burned into the CD aren't spaced perfectly,


A CD-ROM doesn't contain any "pits" -- the burning process only changes the
optical properties of the media.

this clock signal "jumps around" a little. In the
digital world this is called "jitter". Depending on how the
digital-to-analog conversion is done in the sound equipment, this
jitter can cause the output analog signal to be reproduced with
slight time-domain distortions. This effect is extremely
small and tends to be noticeable only on highly resolving audio systems.
[snip]


CDs don't work this way at all. The data on a CD isn't even in the correct
order when read back by the player -- blocks of data are intentionally
scattered and interleaved with error-correction data, and the data itself
is encoded into a different form.

It really isn't possible for "pit spacing" to have any effect on the D/A
jitter. The *only* way to read data from a CD requires that the data be
read into memory buffers, de-interleaved, and decoded back into the
original 16 bit samples. Only then is the data in a form to be fed into
the D/A converter, and only at that point does clock timing (and jitter)
come into play.

The "pit spacing" is completely irrelevant -- the data represented by those
pits is a mathematical encoding, which must be decoded and error corrected
before being fed to the D/A converter. The timing of the raw data is
completely separate from the timing of the final data being fed to the D/A.

Here is a web page that explains some of the details:
http://www.ee.washington.edu/consele...udio2/95x7.htm

In particular, read part II, which describes how the data is encoded and
interleaved before being written to the disk.
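
To see why disc order is not sample order, here is a toy Python sketch of a
plain block interleaver (a simplification I chose for illustration; real CDs
use the more elaborate cross-interleaved Reed-Solomon coding described in
the linked article):

    ROWS, COLS = 4, 6

    def interleave(data):
        # write row by row, read out column by column
        return [data[r * COLS + c] for c in range(COLS) for r in range(ROWS)]

    def deinterleave(data):
        return [data[c * ROWS + r] for r in range(ROWS) for c in range(COLS)]

    samples = list(range(ROWS * COLS))
    on_disc = interleave(samples)
    print(on_disc)  # neighbouring samples end up far apart on the disc

    # a "scratch" wiping out four consecutive symbols on the disc...
    damaged = ['X' if 8 <= i < 12 else v for i, v in enumerate(on_disc)]
    # ...turns into isolated, correctable single-symbol errors after
    # de-interleaving:
    print(deinterleave(damaged))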


An analogy might be a record player that varies from moment to
moment in speed. The mechanical information on the record is all
there but it's not being converted to analog audio well. The
"time" element of converting the mechanical data into analog
music is being corrupted. There is a type of "time domain
distortion" being introduced in the conversion process.


This analogy is completely useless -- the only way for a CD to be read
involves changing the speed of the read head on a continuous basis. CD
players contain read buffers, and the disk will speed up or slow down in
order to keep those buffers full. From there, the data is fed into the
decoding circuitry, which decodes the raw EFM data, removes the
interleaving, performs error checking and correction, and finally feeds the
resulting 16-bit samples into buffers that feed the D/A converter.
Any jitter arises from poor clock circuitry taking the data from
those final buffers and feeding it into the D/A converter, and it has nothing
to do with how the data is being read from the disk.
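
That decoupling can be sketched in a few lines of Python (a deliberately
crude model with made-up constants, just to show the idea): the spindle
servo only has to keep a FIFO near half full, while the D/A side drains it
at a fixed, crystal-clocked rate, so wobble in the disc speed never reaches
the converter clock:

    from collections import deque

    buffer = deque()
    TARGET, CAPACITY = 50, 100   # half-full set point
    read_rate = 1.0              # symbols per tick coming off the disc

    for tick in range(1000):
        # disc side: a crude proportional servo nudges the spindle speed
        # to hold the buffer at its set point
        read_rate = max(0.0, read_rate + 0.01 * (TARGET - len(buffer)))
        for _ in range(int(read_rate)):
            if len(buffer) < CAPACITY:
                buffer.append(tick)

        # D/A side: exactly one sample per tick, paced by the fixed clock
        if buffer:
            buffer.popleft()

    print(f"buffer level after 1000 ticks: {len(buffer)} (target {TARGET})")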

An interesting side here is that some audiophiles have discovered
that when they made a duplicate of a store purchased CD, the
duplicate seemed to sound "better" for some reason.


Probably for the same reason that they hear improvements when supporting
their speaker cables on blocks of exotic woods (no, I am not joking here --
some of them actually believe this).
  #20   s.stef
Is CD burning at slower speeds better?


"Jezza" ha scritto nel messaggio
om...
I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings- is there any truth to this?


In my tests, CDs burned at 8x or slower play on every CD player.

Discs burned at 16x or faster don't.

bye
stef




  #23   jriegle
CD burning at slower speeds (long)


"Jeff Wiseman" wrote in message
...


Jezza wrote:

I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings - is there any truth to this?



Well, not necessarily more reliable, but occasionally, more accurate.

Let me try to explain :-)

For burning CDs of any sort there is an issue of errors when
writing to the disc. If I understand correctly, this is normally not a
big issue at all, since the writer shouldn't be rated at a speed
at which it cannot write without errors. Error correction is applied
on playback. In general, data accuracy can be
maintained fairly well, even on audio discs.

However, the real issue that relates to the OP's question is
a second one, specific to audio CDs, and it applies even if there
are no read errors during playback. Playing back an audio
CD happens in real time. A clock signal is recovered from the
bits coming off the CD and is used to convert the digital
signal back to an analog one. All conversion is done relative to
this clock signal. If the pits burned into the CD aren't spaced
perfectly, this clock signal "jumps around" a little. In the
digital world this is called "jitter". Depending on how the
digital-to-analog conversion is done in the sound equipment, this
jitter can cause the output analog signal to be reproduced with
slight time-domain distortions. This effect is extremely
small and tends to be noticeable only on highly resolving audio systems.


It is not real time. The data is read into a small buffer where it is
processed by the CPU. The spindle motor is constantly changing speed, so
true real-time playback would be a problem. The CPU must have storage space
to fix data errors and allow time to recover from the speed adjustments and
mistracks caused by minor bumps and such.
John


  #26   Jeff Wiseman
CD burning at slower speeds (long)



Murray Peterson wrote:

Jeff Wiseman wrote:

Well, not necessarily more reliable, but occasionally, more accurate.

Let me try to explain :-)


Only if your explanation is accurate -- this one isn't even close. If the
data is retrievable without errors, then the burner was running at an
acceptable speed.



As I mentioned further down in my note, the issue has nothing to
do with correctly reading the data. My entire issue had to do
with converting it back to an analog signal.


A clock signal is recovered from the bits coming off the CD


This is false -- there are no clock signals contained in the CD data. The
clock signal is generated by the CD player's circuitry. The CD itself
contains encoded data and correction information only, and even that
information is intentionally scattered across the disk.



I might be really missing the mark here, but this statement of
yours seems to indicate that you don't have a clue what clock
recovery and jitter issues involve. Yes, the clock signal is
generated by the CD player's circuitry, but it is NOT free
running; otherwise you would be constantly overrunning or
underrunning your data buffers. "Clock recovery" is the
standard term for this process. In reality, the clock is
"derived" from the flow of data bits being read. It is
semi-locked, in a way, with the data bits coming off the CD.
That's how it knows how fast to run. If the average rate of music
data samples coming off the disc is a bit slow or fast, the
clock must adjust to compensate. The clock is constantly
adjusting by design, based on the timing of the leading edges of
the pits. If the edge spacing varies a lot, the clock also adjusts a lot
(jitter).
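
The loop described here can be sketched in Python (my own construction of a
first-order tracking loop, with an invented 5% spacing wobble and loop gain;
a real player uses a hardware PLL): the recovered clock period is repeatedly
nudged toward the measured edge-to-edge spacing, so irregular spacing on the
disc leaks into the recovered clock:

    import random

    random.seed(1)
    nominal = 1.0                 # ideal edge-to-edge interval
    edges, t = [], 0.0
    for _ in range(500):
        t += nominal + random.gauss(0.0, 0.05)   # assumed 5% spacing wobble
        edges.append(t)

    period, gain = nominal, 0.1   # recovered period; how hard we track edges
    recovered, prev = [], 0.0
    for e in edges:
        measured, prev = e - prev, e
        period += gain * (measured - period)     # frequency correction
        recovered.append(period)

    print(f"recovered clock period wanders over "
          f"{max(recovered) - min(recovered):.3f} time units")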

I may be mistaken in some of my understanding of these things,
and I invite correction. However, it does appear that you may
need to read up some on the concepts of clock recovery, jitter,
and their effects on real-time D/A conversion.


If the pits burned into the CD aren't spaced perfectly,


A CD-ROM doesn't contain any "pits" -- the burning process only changes the
optical properties of the media.



I stand corrected. On a CD-ROM I believe that the laser burns a
"hole" in the reflective area instead of the pit that exists in a
pressed CD. For all intents and purposes, the function of the pit
and/or hole is the same for a discussion of jitter, and I carelessly
used the terms interchangeably.


this clock signal "jumps around" a little. In the
digital world this is called "jitter". Depending on how the
digital-to-analog conversion is done in the sound equipment, this
jitter can cause the output analog signal to be reproduced with
slight time-domain distortions. This effect is extremely
small and tends to be noticeable only on highly resolving audio systems.
[snip]


CDs don't work this way at all. The data on a CD isn't even in the correct
order when read back by the player -- blocks of data are intentionally
scattered and interleaved with error-correction data, and the data itself
is encoded into a different form.



I disagree; CDs DO work this way (with some variation in design).
It's not about where the data is on the disc and how it comes off,
it's about the rate at which it comes off and how it affects the clock.


It really isn't possible for "pit spacing" to have any effect on the D/A
jitter. The *only* way to read data from a CD requires that the data be
read into memory buffers, de-interleaved, and decoded back into the
original 16 bit samples. Only then is the data in a form to be fed into
the D/A converter, and only at that point does clock timing (and jitter)
come into play.



My understanding was always that the clock is derived directly
from the pit spacing. If jitter doesn't come from that, where does it
come from? Also, the clock timing is used to drive ALL of the
buffering, de-interleaving, etc. functions that were just
mentioned, not just the D/A conversion.



The "pit spacing" is completely irrelevant -- the data represented by those
pits is a mathematical encoding, which must be decoded and error corrected
before being fed to the D/A converter. The timing of the raw data is
completely separate from the timing of the final data being fed to the D/A.



If so, then how do you prevent buffer over/underruns in all of the
functions that you just mentioned above?


Here is a web page that explains some of the details:
http://www.ee.washington.edu/consele...udio2/95x7.htm

In particular, read part II, which describes how the data is encoded and
interleaved before being written to the disk.



I don't see how any of that article changes anything. Jitter is
introduced as a real-time product of taking the data off the disc
and converting it. In fact, jitter can be seen in the eye
pattern. According to the article, that comes from the pit
frequency of different bit streams, right?

If I'm totally in left field here, fine, I need to know. But give
me an article on jitter introduction in the CD player's circuitry
and clock recovery, not one that just talks about data
distribution and redundancy encoding. That doesn't tell me
anything about the issue here in question.


An analogy might be a record player that varies from moment to
moment in speed. The mechanical information on the record is all
there but it's not being converted to analog audio well. The
"time" element of converting the mechanical data into analog
music is being corrupted. There is a type of "time domain
distortion" being introduced in the conversion process.


This analogy is completely useless -- the only way for a CD to be read
involves changing the speed of the read head on a continuous basis. CD
players contain read buffers, and the disk will speed up or slow down in
order to keep those buffers full. From there, the data is fed into the
decoding circuitry, which decodes the raw EFM data, removes the
interleaving, performs error checking and correction, and finally feeds the
resulting 16 bit samples into buffers for feeding to the D/A converter.
Any jitter arises from poor clock circuitry that is taking the data from
those final buffers and feeding it into the D/A converter, and has nothing
to do with how the data is being read from the disk.



Maybe not as useless as you think. The transport drive is all
controlled off the clock, and in some systems the transport
mechanism itself originates the clock.

So I'm still not sure where you are saying that you think jitter
originates. From the above statement it sounds like you believe
it originates somewhere when the reconstructed data is passed to
the D/A. It's the same clock that drives all of the transport and
decoding functions. Jitter is introduced wherever the clock is
derived from. If the clock is free-wheeling, then it's the clock
crystal/oscillator itself. But that's not possible, because it has
to adjust for read-speed variations (at the bit level). So where does
the clock synching/adjusting occur?


An interesting side here is that some audiophiles have discovered
that when they made a duplicate of a store purchased CD, the
duplicate seemed to sound "better" for some reason.


Probably for the same reason that they hear improvements when supporting
their speaker cables on blocks of exotic woods (no, I am not joking here --
some of them actually believe this).



Very highly resolving audio equipment can reveal a lot, much
of which is hard to describe, so like anything else there
is a lot of snake oil there. However, as a critic myself
originally, I've spent a lot of hours listening to some very well
matched systems, and it's NOT all bogus. Cables can make a
SUBSTANTIAL difference on many systems, but they make no
noticeable difference at all on many more mediocre systems.
Be careful of judging purely on assumed logic without actually
sitting down and exploring it first in an unbiased fashion. I
have no idea why cables might make such a major difference in
some cases and no noticeable difference in others. I have my
theories and others have theirs. All I know is that it does in
fact occur, which is fascinating, and the effects are desirable
to those who experience them.

I also know that a lot of people pass up the opportunity to have
unique experiences because they've already convinced themselves
that some things are impossible.

- Jeff
  #27   Report Post  
Jeff Wiseman
 
Posts: n/a
Default CD burning at slower speeds (long)



Murray Peterson wrote:

Jeff Wiseman wrote in
:

Well, not necessarily more reliable, but occasionally, more accurate.

Let me try to explain :-)


Only if your explanation is accurate -- this one isn't even close. If the
data is retrievable without errors, then the burner was running at an
acceptable speed.



As I mentioned further down in my note, the issue has nothing to
do with correctly reading the data. My entire issue had to do
with converting it back to an analog signal.


A clock signal is recovered from the bits comeing off of the CD


This is false -- there are no clock signals contained in the CD data. The
clock signal is generated by the CD player's circuitry. The CD itself
contains encoded data and correction information only, and even that
information is intentionally scattered across the disk.



I might be really missing the mark here but this statement of
yours seems to indicate that you don't have a clue what clock
recovery and jitter issues involve. Yes, the clock signal is
generated by the CD player's circuitry, but it is NOT free
running, otherwise you would be constantly overrunning or
underrunning your data buffers. The term "clock recovery" is the
standard term for this process. In reality, the clock is
"derived" from the flow of data bits being read. It is
semi-locked in a way with the databits coming off of the CD.
That's how it knows how fast to run. If the average rate of music
data samples coming off of the disk is a bit slow or fast, the
clock must adjust to compensate. The clock is constantly
adjusting by design based on the timeing of the leading edge of
pits. If the edge spacing varys a lot, the clock also adjusts lot
(jitter).

I may be mistaking in some of my understanding of these things
and I invite correction. However, it does appear that you may
need to read up some on the concepts of clock recovery, jitter,
and their effects on real time D/A conversion.


If the pits burned into the CD aren't spaced prefectly,


A CD-ROM doesn't contain any "pits" -- the burning process only changes the
optical properties of the media.



I stand corrected. On a CD-ROM I believe that the laser burns a
"hole" in the reflective area instead of the pit that exists in a
pressed CD. For all intents and purposes, the function of the pit
and/or hole is the same for discussion on jitter and I carelessly
used them interchangebly.


this clock signal "jumps around" a little. In the
digital world this is called "jitter". Depending on how the
digital to analog conversion is done in the sound equipment, this
jitter can cause the output analog signal to reproduce with
slight time domain distortions in it. This effect is extremly
small and tends to only be noticable on highly resolving audio systems.
[snip]


CDs don't work this way at all. The data on a CD isn't even in the correct
order when read back by the player -- blocks of data are intentionally
scattered and interleaved with error correction data, and the data itslef
is encoded to a different form.



I disagree, CDs DO work this way (with some variances in design).
It's not about where the data is on disk and how it comes off,
it's about the rate that it comes off and how it affects the clock.


It really isn't possible for "pit spacing" to have any effect on the D/A
jitter. The *only* way to read data from a CD requires that the data be
read into memory buffers, de-interleaved, and decoded back into the
original 16 bit samples. Only then is the data in a form to be fed into
the D/A converter, and only at that point does clock timing (and jitter)
come into play.



My understanding was always that the clock is derived directly
from pit spacing. If jitter doesn't come from that, where does it
come from? Also, clock timeing is used to drive ALL of the
buffering, de-interleaving, etc. functions that were just
mentioned, not just the D/A conversion.



The "pit spacing" is completely irrelevant -- the data represented by those
pits is a mathematical encoding, which must be decoded and error corrected
before being fed to the D/A converter. The timing of the raw data is
completely separate from the timing of the final data being fed to the D/A.



If so, then how do you prevent buffer over/underuns in all of the
functions that you just mentioned above?


Here is a web page that explains some of the details:
http://www.ee.washington.edu/consele...udio2/95x7.htm

In particular, read part II, which describes how the data is encoded and
interleaved before being written to the disk.



I don't see how any of that article changes anything. Jitter is
introduced as a real time product of taking the data off the disk
and converting it. In fact, jitter can be seen in the eye
pattern. According to the article, that comes from the pit
frequency of different bit streams, right?

If I'm totally in left field here, fine I need to know. But give
me a article on jitter introduction in the CD play'er circuitry
and clock recovery. Not one that just talks about data
distribution and redundancy encoding. That doesn't tell me
anything about the issue here in question.


An analogy might be a record player that varies from moment to
moment in speed. The mechanical information on the record is all
there but it's not being converted to analog audio well. The
"time" element of converting the mechanical data into analog
music is being corrupted. There is a type of "time domain
distortion" being introduced in the conversion process.


This analogy is completely useless -- the only way for a CD to be read
involves changing the speed of the read head on a continuous basis. CD
players contain read buffers, and the disk will speed up or slow down in
order to keep those buffers full. From there, the data is fed into the
decoding circuitry, which decodes the raw EFM data, removes the
interleaving, performs error checking and correction, and finally feeds the
resulting 16 bit samples into buffers for feeding to the D/A converter.
Any jitter arises from poor clock circuitry that is taking the data from
those final buffers and feeding it into the D/A converter, and has nothing
to do with how the data is being read from the disk.



Maybe not as useless as you think. The transport drive is all
controlled off the clock and in some systems, the transport
mechanism itself originates the clock.

So I'm still not sure where you are saying that you think jitter
originates. From the above statement it sounds like you believe
it originates somewhere when the reconstructed data is passed to
the D/A. It's the same clock that drives all of the transport and
decoding functions. Jitter is introduced at wherever the clock is
derived from. If the clock is free-wheeling, Then it's the clock
crystal/oscillator itself. But that's not possible because it has
to adjust for read speed variations (at a bit level). So where is
the clock synching/adjusting occur?


An interesting side here is that some audiophiles have discovered
that when they made a duplicate of a store purchased CD, the
duplicate seemed to sound "better" for some reason.


Probably for the same reason that they hear improvements when supporting
their speaker cables on blocks of exotic woods (no, I am not joking here --
some of them actually believe this).



Very highly resolving audio equipment can reveal a lot, and much of
what it reveals is hard to describe, so like anything else there is a
lot of snake oil in this area. However, having started out as a
critic myself, I've spent a lot of hours listening to some very well
matched systems, and it's NOT all bogus. Cables can make a
SUBSTANTIAL difference on some systems, yet no noticeable difference
at all on many more mediocre ones. Be careful of judging purely on
assumed logic without actually sitting down and exploring it first in
an unbiased fashion. I have no idea why cables might make such a
major difference in some cases and no noticeable difference in
others. I have my theories and others have theirs. All I know is that
the fact that it does occur is fascinating, and the effects are
desirable to those who experience them.

I also know that a lot of people pass up the opportunity to have
unique experiences because they've already convinced themselves
that some things are impossible.

- Jeff
  #29   Report Post  
Dave Platt
 
Posts: n/a
Default CD burning at slower speeds (long)

In article ,
Jeff Wiseman wrote:

This is false -- there are no clock signals contained in the CD data. The
clock signal is generated by the CD player's circuitry. The CD itself
contains encoded data and correction information only, and even that
information is intentionally scattered across the disk.



I might be really missing the mark here but this statement of
yours seems to indicate that you don't have a clue what clock
recovery and jitter issues involve. Yes, the clock signal is
generated by the CD player's circuitry, but it is NOT free
running, otherwise you would be constantly overrunning or
underrunning your data buffers. The term "clock recovery" is the
standard term for this process. In reality, the clock is
"derived" from the flow of data bits being read. It is
semi-locked, in a way, with the data bits coming off of the CD.
That's how it knows how fast to run. If the average rate of music
data samples coming off of the disk is a bit slow or fast, the
clock must adjust to compensate. The clock is constantly
adjusting by design, based on the timing of the leading edges of
pits. If the edge spacing varies a lot, the clock also adjusts a
lot (jitter).

I may be mistaken in some of my understanding of these things,
and I invite correction. However, it does appear that you may
need to read up on the concepts of clock recovery, jitter, and
their effects on real-time D/A conversion.


You are mistaken in at least some of your understanding. I'd suggest
referring to a basic text on digital audio, such as one of Ken
Pohlmann's, for the details.

Briefly, though: there are two interesting cases to look at -
single-box CD players, and transport/DAC systems connected by S/PDIF
or similar path. Let's look at the single-box CD player first, as
it's the simpler case.

In a single-box player of competent design, the timing of the
conversion between the final digital samples and analog voltages is
controlled by a quartz-crystal oscillator running at a fixed speed...
period. The conversion rate is _not_ derived from a clock recovered
from the data arriving from the spinning disc - the disc speed does
not control the conversion process. It's very much the other way
around. The data is clocked out of the Reed-Solomon C1/C2 error
correction and de-interleaving chip at a very predictable rate
(controlled by the fixed-speed quartz oscillator). Raw data arriving
from the CD is fed into the de-interleaver at whatever rate it
happens to arrive.

As you point out, this can cause buffer underflow/overflow in the
correction chip, due to the difference in the clock rates between
"data arriving" and "data leaving". The CD player controller prevents
this, by adjusting the rate at which the CD is spinning (in rather
coarse increments), so that the "data arriving" rate varies as needed
to keep the buffer's fill level within acceptable bounds. The "data
leaving the chip and being converted to analog" rate does _not_
change.
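
To make that control loop concrete, here is a minimal simulation of
the scheme in Python. The tick size, the bounds, and the servo rule
are invented for illustration -- real firmware differs -- but the
division of labor is the point: the output side never varies, only
the spindle does.

    import random

    OUT_RATE = 44.1      # samples drained per 1 ms tick by the crystal clock
    BUF_LOW, BUF_HIGH = 2000, 6000   # invented fill-level bounds, in samples

    buf = 4000.0         # current buffer fill level
    spindle = 1.0        # spindle-speed multiplier, adjusted in coarse steps

    for tick in range(20000):
        # Raw data arrives at a rate set by the spindle speed, with some
        # variation (disc eccentricity, pit-length tolerances, and so on).
        buf += OUT_RATE * spindle * random.uniform(0.95, 1.02)
        # The D/A side drains at a constant rate -- it never follows the disc.
        buf -= OUT_RATE
        # Coarse servo: nudge the spindle only when the fill level drifts out.
        if buf < BUF_LOW:
            spindle += 0.05
        elif buf > BUF_HIGH:
            spindle -= 0.05

    print("final fill level: %.0f samples" % buf)

Run it and the fill level wanders but stays near the bounds, while
the drain rate -- the conversion clock -- never moves.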

As a result of this process, a single-box player of good design is (or
should be) quite immune to small variations in the pit/land timing.
The clock recovered from the raw pit/land data is used only to control
the feeding of the data _into_ the error-correction chip - when the
data comes out of the chip, it's under the control of a much more
stable, fixed-speed, as-low-in-jitter-as-you-care-to-engineer-it
oscillator. Timing jitter in the incoming data is simply
stripped out by the de-interleaving process.
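
Incidentally, the interleaving that strips out this timing jitter is
there primarily for error protection: a scratch that destroys a burst
of consecutive symbols on the disc becomes a set of isolated,
correctable errors after de-interleaving. A toy block interleaver in
Python shows the idea (the real CD format uses cross-interleaving,
CIRC, which is considerably more elaborate):

    ROWS, COLS = 8, 8
    data = list(range(ROWS * COLS))       # 64 "symbols" to record

    # Interleave: fill an 8x8 block row by row, write it out column by column.
    written = [data[r * COLS + c] for c in range(COLS) for r in range(ROWS)]

    # A scratch wipes out six consecutive symbols on the "disc".
    damaged = written[:]
    for i in range(20, 26):
        damaged[i] = None

    # De-interleave on playback.
    restored = [None] * (ROWS * COLS)
    for c in range(COLS):
        for r in range(ROWS):
            restored[r * COLS + c] = damaged[c * ROWS + r]

    # The burst is now scattered: at most one bad symbol per row of eight,
    # which is well within reach of a Reed-Solomon code.
    for r in range(ROWS):
        row = restored[r * COLS:(r + 1) * COLS]
        print("row", r, "has", row.count(None), "error(s)")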

Things get a bit more complicated in the case of a transport/DAC
two-box system. The process works much as above, until the data is
clocked out of the error-correction chip. In this case, instead of it
going right into a DAC chip, it's fed to an S/PDIF encoder/transmitter
chip, which biphase-mark encodes the data (a Manchester-like line
code) and sends it out the optical, RCA, or XLR connector.

The DAC-box must then receive this encoded data, recover a clock
signal from it, recover the data, and convert the data to analog. A
lot of DAC-boxes do a rather poor job of the clock recovery and
conversion, and the conversion timing is rather jitter-prone. This
jitter is from the S/PDIF encoding/decoding process, _not_ from the
jitter in the pit/land timings, though - as in the case of a
single-box player, the pit/land timing jitter was stripped out when
the data was fed through the error-correction and de-interleaving
process.

The jitter in the S/PDIF process occurs for a number of reasons... the
finite rise and fall time of the S/PDIF electrical or optical signal,
electrical noise, the sensitivity of certain S/PDIF receivers to the
pattern of the data arriving, and the electrical behavior of the
phase-locked loop circuits often used in clock recovery.

It's possible for an external DAC box to do a good job of clock
recovery (or re-creation), but many do not.
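
The core difficulty can be sketched in a few lines of Python: the
receiver's only timing reference is the arrival of edges, so noise on
the edge positions leaks straight into the recovered clock. A crude
first-order loop stands in for the PLL here, and the noise figures
are invented:

    import random

    CELL = 1.0             # nominal bit-cell duration, arbitrary units
    N = 5000

    # Transmit side: one cell-start transition per bit cell, each edge
    # nudged by noise (finite rise time, reflections, and so on).
    starts = [i * CELL + random.gauss(0, 0.02) for i in range(N)]

    # Receive side: track the period with a loop that pulls its estimate
    # toward each newly measured edge spacing.
    period, history = CELL, []
    for i in range(1, N):
        measured = starts[i] - starts[i - 1]
        period += 0.1 * (measured - period)    # loop "bandwidth" of 0.1
        history.append(period)

    print("recovered period wanders between %.4f and %.4f"
          % (min(history), max(history)))

A quartz crystal left alone holds its period orders of magnitude more
tightly than this; that difference is exactly why the single-box
arrangement has the easier job.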

A CD-ROM doesn't contain any "pits" -- the burning process only changes the
optical properties of the media.



I stand corrected. On a CD-ROM I believe that the laser burns a
"hole" in the reflective area instead of the pit that exists in a
pressed CD. For all intents and purposes, the function of the pit
and/or hole is the same for discussion on jitter and I carelessly
used them interchangebly.


My understanding is that in a "manufactured" CD (whether
digital-audio or CD-ROM) there actually are physical pits and lands.
They're created by an injection-molding process, with the "negatives"
of the pits and lands having been formed in the mold/stamper.

In a CD-R, the laser makes physical changes in a layer of organic dye
on the disc... it doesn't actually "burn" anything away; the heat
decomposes the dye so the spot reads as nonreflective.

In a CD-RW, the laser changes the phase of a thin layer of metal
alloy (a silver-indium-antimony-tellurium phase-change material) from
crystalline to amorphous, or vice versa, and thus changes its
reflectivity.

My understanding was always that the clock is derived directly
from pit spacing.


The _first_ clock, used to transfer the data into the Reed-Solomon
error correction chip, is derived from the pit spacing. However, this
clock is _not_ used in the actual digital-to-analog conversion
process... it plays no further role once the data has started its way
into the error corrector / de-interleaver.

The clock which controls the D-to-A step is a separate, independent
one.
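
For a sense of scale, here is the usual back-of-envelope figure
(standard textbook arithmetic, not specific to any player) for how
steady that D-to-A clock must be:

    import math

    # Sampling a sine A*sin(2*pi*f*t) at an instant that is off by dt
    # changes the value by at most A*2*pi*f*dt (the maximum slope).
    # Keep that error under one 16-bit LSB for a full-scale 20 kHz tone:
    f = 20000             # worst-case audio frequency, Hz
    lsb = 1.0 / 32768     # one LSB, relative to full scale

    dt = lsb / (2 * math.pi * f)
    print("jitter budget: about %.0f picoseconds" % (dt * 1e12))

That works out to roughly a quarter of a nanosecond -- tight, but it
is a budget on the fixed local oscillator, not on the pit timing.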

--
Dave Platt AE6EO
Hosting the Jade Warrior home page: http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!
  #35   Report Post  
Peter Larsen
 
Posts: n/a
Default Is CD burning at slower speeds better?

Jezza wrote:

I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings- is there any truth to this?


Error rate is likely to be lowest at some speed setting for some
combination of media and burner, but you need to measure the actual
error rate with your actual media and actual burner to really know.
If you have to assume, don't use the fastest setting and don't use
the slowest setting; use a "moderately slow" setting -- on an old
burner that could be 2x, on a new one it could be 8x or 12x. Note:
those examples are from thin air, not verified, except that I know a
"problem" CD player is more likely to play discs burned at 2x on my
no-longer-new Plextor than discs burned at its maximum speed, 8x.
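
One way to act on "measure, don't assume" is to script the
comparison: burn the same material at several speeds, then log an
error scan of each disc. A rough outline in Python, assuming the
cdrtools utilities are installed (cdrecord, and readcd with its
-c2scan option); the device address, image name, and log names are
placeholders:

    import subprocess

    DEV = "1,0,0"            # burner's SCSI address -- yours will differ
    IMAGE = "master.iso"     # hypothetical image to burn for the test

    for speed in (2, 4, 8, 16):
        input("insert a blank disc for the %dx test, then press Enter " % speed)
        # Burn the image in disc-at-once mode at the chosen speed.
        subprocess.run(["cdrecord", "dev=" + DEV, "speed=%d" % speed,
                        "-dao", IMAGE], check=True)
        # Scan the finished disc for C2 errors and keep a log per speed.
        with open("c2scan-%dx.log" % speed, "w") as log:
            subprocess.run(["readcd", "dev=" + DEV, "-c2scan"],
                           stdout=log, check=True)

Comparing the logs tells you where the sweet spot is for your drive
and your media, instead of guessing.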


Kind regards

Peter Larsen

--
*******************************************
* My site is at: http://www.muyiovatki.dk *
*******************************************


  #38   Report Post  
Laurence Payne
 
Posts: n/a
Default Is CD burning at slower speeds better?

On Sat, 26 Jun 2004 22:41:21 GMT, "s.stef"
wrote:

I was told that burning audio CDs at slower speeds is somehow better
and produces more reliable recordings- is there any truth to this?


In my tests, CDs burned at 8x or slower play on every CD player.

Burned at 16x or over, they don't.


That agrees with my experience.

Want to carry on? Does your burner/software/media offer slower burn
speeds? If you CAN burn at 2x (maybe even 1x), are the results good?