  #81   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

The last YIQ encoder I saw was in an RCA TK45 color camera from
around '75. All I've seen since then are equiband Y, R-Y, B-Y encoders
where the modulation axes are on the 0 and 90 degree axes rather than
rotated 33 degrees. That bandwidth filtering hasn't been done that way
in a long time. S-video _could_ have more bandwidth if a custom
encoder were built, but in fact it is simply an encoder where the
modulated subcarrier and luminance (Y) are not summed together and
are instead sent out separately on their own cables. Why would anyone
bother to make a "super" S-video encoder when the destination is a
VHS deck with limited response (compared to a broadcast VTR)?


I can see generating and recording R-Y and B-Y signals, as it makes it
easier to convert the signal to PAL and SECAM. But you can't transmit
_equal-bandwidth_ NTSC signals, as that would push the bandwidth beyond
5MHz. At some point NTSC color has to be converted to IQ.


OK, you guys talk about "500 lines" resolution on a Betamax
or an S-VHS but those numbers are pure BS.


I don't remember any of us talking about such things.

My memory is that Betamax originally had 240 lines, which equates to about
3MHz, the minimum bandwidth needed for an "acceptably" sharp picture on a
20" screen. 500 lines of resolution on American TV would require more than
6MHz of luminance bandwidth, an impossibility for Beta or VHS.
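As a rough cross-check of those figures (my arithmetic, not anything posted in the thread): NTSC horizontal resolution works out to roughly 80 TV lines per MHz of luminance bandwidth, as in this Python sketch:

# Rough NTSC resolution-vs-bandwidth arithmetic (illustrative only).
ACTIVE_LINE_US = 52.7    # active picture time per scan line, microseconds
ASPECT = 4.0 / 3.0       # "TV lines" are counted per picture *height*

def tv_lines(bandwidth_mhz):
    """Horizontal resolution (TV lines) for a given luminance bandwidth."""
    cycles_per_line = bandwidth_mhz * ACTIVE_LINE_US   # MHz x us = cycles
    return 2 * cycles_per_line / ASPECT                # 2 lines per cycle

print(round(tv_lines(3.0)))   # ~237 -- close to Betamax's quoted 240 lines
print(round(tv_lines(6.3)))   # ~498 -- "500 lines" indeed needs >6 MHz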


  #82   Scott Dorsey
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

William Sommerwerck wrote:
Since the s-video output and the composite output are both NTSC,
it is impossible for either the s-video output or the composite output
to have *more* output than the NTSC output -- they *ARE* NTSC
outputs.


This might be true in practice, but "it ain't necessarily so".


How would they not be NTSC?
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
  #83   Arny Krueger
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"William Sommerwerck" wrote in
message

Yes, it's brilliant. (It's one of the great 20th century
inventions.) But -- and I will keep repeating this ad
nauseam -- the reason color TV systems (of all sorts) can
"get away" with reduced chroma bandwidth


If we extrapolate this discussion to audio, then we have William
Sommerwerck, MP3 advocate! ;-)

has little to do
with the eye's (relatively) limited color resolution and


The eye's limited color and intensity resolution is at least an effect that
is perceivable.

We have to distinguish between sensory limits like those due to rods and
cones in the eye, and perceptual limits due to the brain's data crunching
bottlenecks and limited training.

a great deal to do with what I said -- most objects are
colored with constant saturation,


This is like saying that all objects are each well-modeled as being painted
all over with the same paint, with whatever variations in coloration exist
being related to things like the location of light sources.

which greatly reduces
the required bandwidth for the chrominance signals -- or
more precisely, lets them convey more useful information.


The "same paint" model works well for a lot of artificial objects, some
natural objects, and many more natural objects if viewed from a distance.

But it doesn't work for everything.

Many objects, both artificial and natural, don't follow the "same paint"
rule.

I contemplate the evolution of special effects in movies. In their day, I
found the special effects in the better early-1950s science fiction movies
to be compelling. Today, my brain is trained by experience to see through
many of them, and I perceive them as being hopelessly toy-like. I can do a
binary search of sorts and contemplate the special effects in the hottest
movies of the early 80s. While the better movies of those days are not so
toy-like, they are also not nearly as compelling as the latest-greatest.


  #84   Arny Krueger
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"Scott Dorsey" wrote in message

William Sommerwerck wrote:


Since the s-video output and the composite output are
both NTSC,


Only if the source is NTSC. Today we have many common video sources that
exceed NTSC limits in many ways.

it is impossible for either the s-video output or the
composite output to have *more* output than the NTSC
output -- they *ARE* NTSC outputs.


This might be true in practice, but "it ain't
necessarily so".


How would they not be NTSC?


Only broadcast video *must* be NTSC, right?


  #85   Albert Manfredi
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

On Feb 28, 8:37 am, "William Sommerwerck"
wrote:

But --
and I will keep repeating this ad nauseam -- the reason color TV systems (of
all sorts) can "get away" with reduced chroma bandwidth has little to do
with the eye's (relatively) limited color resolution and a great deal to do
with what I said -- most objects are colored with constant saturation, which
greatly reduces the required bandwidth for the chrominance signals -- or
more precisely, lets them convey more useful information.


If it's true that colors are applied with constant saturation, then
that would explain why many consider NTSC to provide cartoon-like
images. Perhaps what you're really saying is that the eyes are not as
sensitive to color intensity variations as they are to luminance
variations, in which case it's back to "limited color resolution."

Bert


  #86   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"Albert Manfredi" wrote in message
...
On Feb 28, 8:37 am, "William Sommerwerck"
wrote:

and I will keep repeating this ad nauseam -- the reason color TV systems
(of all sorts) can "get away" with reduced chroma bandwidth has little to do
with the eye's (relatively) limited color resolution and a great deal to do
with what I said -- most objects are colored with constant saturation, which
greatly reduces the required bandwidth for the chrominance signals -- or
more precisely, lets them convey more useful information.


If it's true that colors are applied with constant saturation, then
that would explain why many consider NTSC to provide cartoon-like
images. Perhaps what you're really saying is that the eyes are not as
sensitive to color intensity variations as they are to luminance
variations, in which case it's back to "limited color resolution."

I'm saying nothing of the sort.

Those who consider NTSC to provide "cartoon-like images" know nothing about
color television. PAL and SECAM use color-difference signals as well.

When you subtract the luminance signal from the color-primary signals, you
get a signal that represents ONLY the saturation of the color -- nothing
else. As most colors are of constant saturation, there is less "information"
in the color-difference signal -- the redundant information already appears
in the luminance -- so it requires less bandwidth.

If you don't understand this, think of a cube illuminated from the side, and
what the color-primary image and color-difference image would look like.
Then you will understand.
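To make the cube example concrete, here is a minimal Python sketch of the subtraction described above, using the standard NTSC luma weights (the cube-face RGB values are invented for illustration):

# One face of a uniformly painted red cube under three levels of side light.
def rgb_to_ydiff(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b    # standard NTSC luma weights
    return y, r - y, b - y                   # luma plus two difference signals

for shade in (1.0, 0.6, 0.3):                # bright, mid, and shadowed faces
    r, g, b = 0.8 * shade, 0.1 * shade, 0.1 * shade
    y, ry, by = rgb_to_ydiff(r, g, b)
    print(f"Y={y:.3f}  R-Y={ry:+.3f}  B-Y={by:+.3f}")

Running it shows the shading landing in Y, with R-Y and B-Y holding a fixed proportion to it across the three faces.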


  #87   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"Arny Krueger" wrote in message
...
"William Sommerwerck" wrote in
message


Yes, it's brilliant. (It's one of the great 20th century
inventions.) But -- and I will keep repeating this ad
nauseam -- the reason color TV systems (of all sorts)
can "get away" with reduced chroma bandwidth


If we extrapolate this discussion to audio, then we have William
Sommerwerck, MP3 advocate! ;-)


God, no. I hate compressed audio. (Dolby Digital, at least.)


has little to do
with the eye's (relatively) limited color resolution and


The eye's limited color and intensity resolution is at least
an effect that is perceivable.


In the NTSC system, this difference shows up in the bandwidth of the color
signals. The researchers determined that (for a 480-line, 30-frame system,
on a 21" screen, presumably) you could see full red/green/blue-primaries
color only up to 0.5MHz, while only colors that could be synthesized from
red-orange and blue-green primaries were visible from 0.5MHz to 1.5MHz. *
Above 1.5MHz, the eye saw only B&W. So the 0.5MHz Q signal is yellow/purple,
and the 1.5MHz I signal is red-orange/blue-green.
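For reference, the textbook RGB-to-YIQ matrix with those bandwidths attached (standard published coefficients; the sample RGB value is mine):

import numpy as np

# Standard NTSC RGB -> YIQ matrix. I and Q are the color-difference pair
# rotated ~33 degrees from (B-Y, R-Y).
RGB_TO_YIQ = np.array([
    [0.299,  0.587,  0.114],   # Y: luminance
    [0.596, -0.274, -0.322],   # I: red-orange/blue-green axis, ~1.5 MHz
    [0.211, -0.523,  0.312],   # Q: yellow/purple axis, ~0.5 MHz
])

y, i, q = RGB_TO_YIQ @ np.array([0.8, 0.4, 0.2])   # an arbitrary RGB sample
print(f"Y={y:.3f}  I={i:.3f}  Q={q:.3f}")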

But this has _nothing whatever_ to do with what I'm talking about.

One of the nice things about NTSC (and PAL for that matter -- they're
basically the same system) is that the use of color-difference signals
(rather than color-primary signals) removes any redundancy. The color
signals contain ZERO information about brightness. Which is good, because
the brightness information is already conveyed by the Y signal. It doesn't
need to be contained in the color signals.

* Note that these are roughly the early two-strip Technicolor primaries. The
overall gamut is not very broad, but "Mystery of the Wax Museum" is
surprisingly good.


This is like saying that all objects are each well-modeled as being
painted all over with the same paint, with whatever variations in coloration
exist being related to things like the location of light sources.


Not coloration (hue), but lightness (value). Otherwise, that is absolutely
correct.


The "same paint" model works well for a lot of artificial objects, some
natural objects, and many more natural objects if viewed from a distance.


Many objects, both artificial and natural, don't follow the "same paint"
rule.


But it's true for most objects, natural or artificial. If you don't believe
this, try to find any colored object -- natural or artificial -- that is
_not_ "constant saturation". You're most likely to find it in flowers and
fabric patterns.

I don't want to argue this too much, because most people don't have a good
understanding of color analysis and synthesis.


  #88   Arny Krueger
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"William Sommerwerck" wrote in
message
"Arny Krueger" wrote in message
...
"William Sommerwerck" wrote
in
message


Yes, it's brilliant. (It's one of the great 20th century
inventions.) But -- and I will keep repeating this ad
nauseam -- the reason color TV systems (of all sorts)
can "get away" with reduced chroma bandwidth


If we extrapolate this discussion to audio, then we have
William Sommerwerck, MP3 advocate! ;-)


God, no. I hate compressed audio. (Dolby Digital, at
least.)


(1) Dolby Digital is really old-old tech, predating MP3 by lots.
(2) Usual grousing about what people say and what they actually hear with
their eyes closed.

has little to do
with the eye's (relatively) limited color resolution and


The eye's limited color and intensity resolution is at
least an effect that is perceivable.


In the NTSC system, this difference shows up in the
bandwidth of the color signals. The researchers
determined that (for a 480-line, 30-frame system, on a
21" screen, presumably) you could see full
red/green/blue-primaries color only up to 0.5MHz,


Unfortunately, by the end of the NTSC era, 32 and 36 inch sets were
mainstream, even average. 42 inch sets were common. NTSC looked like $#@!!
on large screens - barely tolerable on 32 inch sets.

while
only colors that could be synthesized from red-orange and
blue-green primaries were visible from 0.5MHz to 1.5MHz.
* Above 1.5MHz, the eye saw only B&W. So the 0.5MHz Q
signal is yellow/purple, and the 1.5MHz I signal is
red-orange/blue-green.


This would all be sensory-based, no doubt tested with what amounted to
synthetic, worst-case test patterns.

But this has _nothing whatever_ to do with what I'm
talking about.


Right, you're talking about perception.

One of the nice things about NTSC (and PAL for that
matter -- they're basically the same system) is that the
use of color-difference signals (rather than
color-primary signals) removes any redundancy. The color
signals contain ZERO information about brightness. Which
is good, because the brightness information is already
conveyed by the Y signal. It doesn't need to be contained
in the color signals.


OK.

* Note that these are roughly the early two-strip
Technicolor primaries. The overall gamut is not very
broad, but "Mystery of the Wax Museum" is surprisingly
good.


OK.

This is like saying that all objects are each
well-modeled as being painted all over with the same
paint, with whatever variations in coloration exist
being related to things like the location of light
sources.


Not coloration (hue), but lightness (value). Otherwise,
that is absolutely correct.


Ah yes, I need to reload some long-unused parts of my vocabulary.

The "same paint" model works well for a lot of
artificial objects, some natural objects, and many more
natural objects if viewed from a distance.


Many objects, both artificial and natural, don't follow
the "same paint" rule.


But it's true for most objects, natural or artificial. If
you don't believe this, try to find any colored object --
natural or artificial -- that is _not_ "constant
saturation".


By this you mean constant saturation of a given color hue, no?

You're most likely to find it in flowers and
fabric patterns.


And certain trees and rocks. Water with certain lighting and/or degrees of
activity. Artificial objects with exposed frameworks. Artificial objects
designed to be highly visible. Much text.

I don't want to argue this too much, because most people
don't have a good understanding of color analysis and
synthesis.


One point is that the DVD was one of the larger beginnings of the end for
NTSC TV.


  #89   Scott Dorsey
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

Arny Krueger wrote:
"Scott Dorsey" wrote in message

William Sommerwerck wrote:


Since the s-video output and the composite output are
both NTSC,


Only if the source is NTSC. Today we have many common video sources that
exceed NTSC limits in many ways.

it is impossible for either the s-video output or the
composite output to have *more* output than the NTSC
output -- they *ARE* NTSC outputs.


This might be true in practice, but "it ain't
necessarily so".


How would they not be NTSC?


Only broadcast video *must* be NTSC, right?


Well, in terms of the fact that the FCC will only come after you if your
broadcast waveform doesn't match the NTSC specs, yes. But in fact, just
about everything in use today meets the NTSC specs, other than VHS machines
which need a time base corrector to meet timing specifications and which
are going away very fast.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
  #90   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"Arny Krueger" wrote in message
...

In the NTSC system, this difference shows up in the
bandwidth of the color signals. The researchers
determined that (for a 480-line, 30-frame system,
on a 21" screen, presumably) you could see full
red/green/blue-primaries color only up to 0.5MHz,


Unfortunately, by the end of the NTSC era, 32 and 36 inch sets
were mainstream, even average. 42 inch sets were common.
NTSC looked like $#@!! on large screens -- barely tolerable on 32"
sets.


I remember the early 25" Sony consoles. They had really weak color, though I
don't know why.

However, I own a 32" Toshiba IDTV and a Sony 36" IDTV. They display
spectacularly good NTSC images. Both digitally goose the luminance, and (as
far as I know) both have full-bandwidth chroma demodulation.

By the way, the original Advent projector had full-bandwidth color.


But this has _nothing whatever_ to do with what I'm
talking about.


Right, you're talking about perception.


No, I'm talking objective fact. Color-difference signals require less
bandwidth than color-primary signals.


Many objects, both artificial and natural, don't follow
the "same paint" rule.


But it's true for most objects, natural or artificial. If
you don't believe this, try to find any colored object --
natural or artificial -- that is _not_ "constant
saturation".


By this you mean constant saturation of a given color hue, no?


Yes. It would be meaningless to talk about different hues.


And certain trees and rocks. Water with certain lighting and/or degrees
of activity. Artificial objects with exposed frameworks. Artificial objects
designed to be highly visible. Much text.


Text? Are you referring to illuminated manuscripts? <grin>


One point is that the DVD was one of the larger beginnings
of the end for NTSC TV.


I don't want to be too quick to defend NTSC, but it can be exceptionally
good. It's not that NTSC is poor; it's that DVD is better still.




  #91   trotsky
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

Arny Krueger wrote:
"William Sommerwerck" wrote in
message
"Arny Krueger" wrote in message
...
"William Sommerwerck" wrote
in
message
Yes, it's brilliant. (It's one of the great 20th century
inventions.) But -- and I will keep repeating this ad
nauseam -- the reason color TV systems (of all sorts)
can "get away" with reduced chroma bandwidth
If we extrapolate this discussion to audio, then we have
William Sommerwerck, MP3 advocate! ;-)


God, no. I hate compressed audio. (Dolby Digital, at
least.)


(1) Dolby Digital is really old-old tech, predating MP3 by lots.


What difference does it make when it was created?
  #92   pj
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

Scott Dorsey wrote:
William Sommerwerck wrote:
Since the s-video output and the composite output are both NTSC,
it is impossible for either the s-video output or the composite output
to have *more* output than the NTSC output -- they *ARE* NTSC
outputs.

This might be true in practice, but "it ain't necessarily so".


How would they not be NTSC?
--scott


A case for S-Video in preference to Composite:

Let's consider the case of cable delivery of a
480i broadcast.

The signal originates from the station as a
digital feed from the station to the cable
company. (In San Diego, Cox maintains fiber
feeds from each 'must carry' station.) At this
point, the signal is not yet subject to the limitations
of the NTSC spec, and Cox is receiving something
better than the OTA signal.

The cable company produces two distinct products:

1) A conventional NTSC analog signal that it
delivers to the customer (via format conversions
as it travels through the cable infrastructure).
This RF signal is delivered directly to the
customer's TV receiver or, is demodulated in the
STB and presented to the customer as either an
R.F. signal, in NTSC format on Channel 3/4; or,
a composite video signal -- essentially the
baseband version of the NTSC signal; or, an
S-Video output of luminance and color. All
three of these outputs are limited in quality by
the limitations inherent in NTSC.

2) A digital signal applied, along with one or
more other signals, to an RF channel compatible
with the STB. This signal will have been
sufficiently compressed to fit in the allocated
bandwidth.

This signal is detected and made available to
the customer by perhaps four outputs: RF (NTSC),
Composite (NTSC-baseband), Component and
S-Video. The RF and Composite outputs are
subject to the limitations inherent in the NTSC
spec. The S-Video and Component outputs may be
slightly superior since NTSC wasn't imposed
between the originating station and the
customer's STB.

I've also seen this work in reverse where the
cable company heavily compressed the digital
feeds for less popular media (to fit three or
four signals into a single RF slot). The analog
signals (raw NTSC-RF) were superior to the
output from the STB.
--
pj
  #93   Arny Krueger
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"trotsky" wrote in message
news:vACxj.1219$TT4.358@attbi_s22
Arny Krueger wrote:
"William Sommerwerck" wrote
in message

"Arny Krueger" wrote in message
...
"William Sommerwerck"
wrote in
message

Yes, it's brilliant. (It's one of the great 20th
century inventions.) But -- and I will keep repeating
this ad nauseam -- the reason color TV systems (of
all sorts) can "get away" with reduced chroma
bandwidth
If we extrapolate this discussion to audio, then we
have William Sommerwerck, MP3 advocate! ;-)


God, no. I hate compressed audio. (Dolby Digital, at
least.)


(1) Dolby Digital is really old-old tech, predating MP3
by lots.


What difference does it make when it was created?


Perceptual coding was and is a work in progress. Progress was pretty rapid
at the time that DD was introduced and the decade following it.

Dolby AC-3 AKA Dolby Digital was introduced in 1991. It is a proprietary
standard, and has not changed a lot over the years.

MP3 has remained a work in progress since 1989. The rate at which MP3 coders
were improved slowed down quite a bit after ca. 1998, but improvement may
still be possible.




  #94   trotsky
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

Arny Krueger wrote:
"trotsky" wrote in message
news:vACxj.1219$TT4.358@attbi_s22
Arny Krueger wrote:
"William Sommerwerck" wrote
in message

"Arny Krueger" wrote in message
...
"William Sommerwerck"
wrote in
message

Yes, it's brilliant. (It's one of the great 20th
century inventions.) But -- and I will keep repeating
this ad nauseam -- the reason color TV systems (of
all sorts) can "get away" with reduced chroma
bandwidth
If we extrapolate this discussion to audio, then we
have William Sommerwerck, MP3 advocate! ;-)
God, no. I hate compressed audio. (Dolby Digital, at
least.)
(1) Dolby Digital is really old-old tech, predating MP3
by lots.


What difference does it make when it was created?


Perceptual coding was and is a work in progress. Progress was pretty rapid
at the time that DD was introduced and the decade following it.

Dolby AC-3 AKA Dolby Digital was introduced in 1991. It is a proprietary
standard, and has not changed a lot over the years.

MP3 has remained a work in progress since 1989. The rate at which MP3 coders
were improved slowed down quite a bit after ca. 1998, but improvement may
still be possible.



You're not making sense. Did Dolby do their homework and do sufficient
blind tests to "prove" that their codec was transparent to people?
Maybe you're a different Arny Krueger and have come to realize that
these blind tests are ineffective.
  #95   Scott Dorsey
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

In article , pj wrote:
Scott Dorsey wrote:
William Sommerwerck wrote:
Since the s-video output and the composite output are both NTSC,
it is impossible for either the s-video output or the composite output
to have *more* output than the NTSC output -- they *ARE* NTSC
outputs.
This might be true in practice, but "it ain't necessarily so".


How would they not be NTSC?


A case for S-Video in preference to Composite:


Oh, there are many strong cases for S-Video over composite. But both are
NTSC. The S-Video is also NTSC, it's just not RS-170.
--scott

--
"C'est un Nagra. C'est suisse, et tres, tres precis."


  #96   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21...

You're not making sense. Did Dolby do their homework and do
sufficient blind tests to "prove" that their codec was transparent
to people? Maybe you're a different Arny Krueger and have come
to realize that these blind tests are ineffective.


It doesn't matter. Dolby Digital is so bad that you can hear its problems
without comparing it with anything else.

Before Arny objects... I was accustomed to listening to CD-format stereo
from my LaserDiscs. I was continually surprised and pleased with the great
transparency, cleanliness, and "ease" of the sound.

The first time I decoded a Dolby Digital signal ("The Incredibles") I could
hear the difference -- flat, grainy, dry, blah sound.

The audibility of lossy codecs varies with the quality of the playback
system. Over my computer speakers (Monsoon planar magnetics), KUOW sounds
fine. Not only is it clean and transparent, but I've never heard anything
that I interpreted as an artifact. (This is the Microsoft codec.)


  #97   Arny Krueger
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21
Arny Krueger wrote:
"trotsky" wrote in message
news:vACxj.1219$TT4.358@attbi_s22
Arny Krueger wrote:
"William Sommerwerck"
wrote in message

"Arny Krueger" wrote in message
...
"William Sommerwerck"
wrote in
message

Yes, it's brilliant. (It's one of the great 20th
century inventions.) But -- and I will keep
repeating this ad nauseam -- the reason color TV
systems (of all sorts) can "get away" with reduced
chroma bandwidth
If we extrapolate this discussion to audio, then we
have William Sommerwerck, MP3 advocate! ;-)
God, no. I hate compressed audio. (Dolby Digital, at
least.)
(1) Dolby Digital is really old-old tech, predating MP3
by lots.


What difference does it make when it was created?


Perceptual coding was and is a work in progress.
Progress was pretty rapid at the time that DD was
introduced and the decade following it.

Dolby AC-3 AKA Dolby Digital was introduced in 1991. It
is a proprietary standard, and has not changed a lot
over the years.

MP3 has remained a work in progress since 1989. The rate
at which MP3 coders were improved slowed down quite a
bit after ca. 1998, but improvement may still be
possible.


You're not making sense.


Please clarify, because the questions that follow are not requests for
clarification.

Did Dolby do their homework and
do sufficient blind tests to "prove" that their codec was
transparent to people?


AFAIK, Dolby never claimed that DD was perfectly transparent. The MPEG group
coder tests in the late 1990s showed that Dolby Digital was not sonically
transparent and generally inferior to other, more modern codecs.

Maybe you're a different Arny Krueger


Nope. Just older and wiser. ;-)

and have come to realize that these blind tests are ineffective.


How so? The fact that AC-3 was a substandard codec based on the MPEG
Group's blind tests was pretty well publicized by the MPEG and the AES. This
was no doubt a bit of an embarrassment to Dolby. Dolby has been doing their
own blind tests for decades.

Dolby subsequently came out with a new multimodal system for coding and
decoding audio known as Dolby TrueHD. In some modes, TrueHD is definitely
sonically transparent.


  #98   trotsky
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

William Sommerwerck wrote:
"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21...

You're not making sense. Did Dolby do their homework and do
sufficient blind tests to "prove" that their codec was transparent
to people? Maybe you're a different Arny Krueger and have come
to realize that these blind tests are ineffective.


It doesn't matter. Dolby Digital is so bad that you can hear its problems
without comparing it with anything else.

Before Arny objects... I was accustomed to listening to CD-format stereo
from my LaserDiscs. I was continually surprised and pleased with the great
transparency, cleanliness, and "ease" of the sound.

The first time I decoded a Dolby Digital signal ("The Incredibles") I could
hear the difference -- flat, grainy, dry, blah sound.



Agreed on all counts.


The audibility of lossy codecs varies with the quality of the playback
system. Over my computer speakers (Monsoon planar magnetics), KUOW sounds
fine. Not only is it clean and transparent, but I've never heard anything
that I interpreted as an artifact. (This is the Microsoft codec.)



What are you trying to say? Are you saying the lossiness of DD would be
audible over your computer speakers or not?
  #99   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

The audibility of lossy codecs varies with the quality of the playback
system. Over my computer speakers (Monsoon planar magnetics),
KUOW sounds fine. Not only is it clean and transparent, but I've never
heard anything that I interpreted as an artifact. (This is the Microsoft
codec.)


What are you trying to say? Are you saying the lossiness of DD would
be audible over your computer speakers or not?


No, I'm saying that the Monsoons, good as they are, aren't Apogees.


  #100   Anim8rFSK
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

In article ,
"William Sommerwerck" wrote:

The first time I decoded a Dolby Digital signal ("The Incredibles") I could
hear the difference -- flat, grainy, dry, blah sound.


Ooo, nice description.

--
Star Trek 09:

No Shat, No Show.
http://www.disneysub.com/board/noshat.jpg


  #101   Alan
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

In article "William Sommerwerck" writes:

No, I'm talking objective fact. Color-difference signals require less
bandwidth than color-primary signals.


I rather doubt that. Even small changes in small areas would produce the
same bandwidth. However, the magnitude of the signal may be less.

The critical thing is that *because of the limitations of human vision*,
one can get away with reducing the bandwidth of the color difference signals.

The original color difference signal may well have full bandwidth --
since the color difference signals vary with hue variation, even when
the saturation remains the same. (They also vary with saturation change
even with hue remaining the same.)

Alan
  #102   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

"Alan" wrote in message
...
In article "William Sommerwerck" writes:


No, I'm talking objective fact. Color-difference signals require less
bandwidth than color-primary signals.


I rather doubt that. Even small changes in small areas would produce the
same bandwidth. However, the magnitude of the signal may be less.


Correct. But color-difference signals DON'T HAVE THOSE SMALL CHANGES.


The critical thing is that *because of the limitations of human vision*,
one can get away with reducing the bandwidth of the color difference
signals.

Not so. Think about it. Saturation (which is what the amplitude of the
color-difference signal represents) "never" (well, hardly ever) changes as
rapidly as luminance.


The original color difference signal may well have full bandwidth --
since the color difference signals vary with hue variation, even when
the saturation remains the same. (They also vary with saturation change
even with hue remaining the same.)


See above.

This is the typical knee-jerk reaction to something someone hasn't bothered
to think through.


  #103   trotsky
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

Arny Krueger wrote:
"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21
Arny Krueger wrote:
"trotsky" wrote in message
news:vACxj.1219$TT4.358@attbi_s22
Arny Krueger wrote:
"William Sommerwerck"
wrote in message

"Arny Krueger" wrote in message
...
"William Sommerwerck"
wrote in
message

Yes, it's brilliant. (It's one of the great 20th
century inventions.) But -- and I will keep
repeating this ad nauseam -- the reason color TV
systems (of all sorts) can "get away" with reduced
chroma bandwidth
If we extrapolate this discussion to audio, then we
have William Sommerwerck, MP3 advocate! ;-)
God, no. I hate compressed audio. (Dolby Digital, at
least.)
(1) Dolby Digital is really old-old tech, predating MP3
by lots.
What difference does it make when it was created?
Perceptual coding was and is a work in progress.
Progress was pretty rapid at the time that DD was
introduced and the decade following it.

Dolby AC-3 AKA Dolby Digital was introduced in 1991. It
is a proprietary standard, and has not changed a lot
over the years.

MP3 has remained a work in progress since 1989. The rate
at which MP3 coders were improved slowed down quite a
bit after ca. 1998, but improvement may still be
possible.


You're not making sense.


Please clarify, because the questions that follow are not requests for
clarification.

Did Dolby do their homework and
do sufficient blind tests to "prove" that their codec was
transparent to people?


AFAIK, Dolby never claimed that DD was perfectly transparent.



And your working definition for "perfectly transparent" is what, exactly?


The MPEG group
coder tests in the late 1990s showed that Dolby Digital was not sonically
transparent and generally inferior to other, more modern codecs.



At all bit rates? And again, what is the definition of "sonically
transparent"--when people with Radio Shack stereos can tell the difference?


Maybe you're a different Arny Krueger


Nope. Just older and wiser. ;-)



Yeah, I'll buy the older part. If you're saying blind test results have
to be taken with a grain of salt then I'll buy the wiser part.


and have come to realize that these blind tests are ineffective.


How so? The fact that AC-3 was a substandard codec based on the MPEG
Group's blind tests was pretty well publicized by the MPEG and the AES.



Perhaps you can show us a cite for these results, then.


This
was no doubt a bit of an embarrassment to Dolby. Dolby has been doing their
own blind tests for decades.

Dolby subsequently came out with a new multimodal system for coding and
decoding audio known as Dolby TrueHD. In some modes, TrueHD is definitely
sonically transparent.



Again, you are using a term without defining its meaning.
  #104   Arny Krueger
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"trotsky" wrote in message
news%Txj.55219$yE1.36314@attbi_s21
Arny Krueger wrote:
"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21


Did Dolby do their homework and
do sufficient blind tests to "prove" that their codec
was transparent to people?


AFAIK, Dolby never claimed that DD was perfectly
transparent.


And your working definition for "perfectly transparent"
is what, exactly?


Passes a bypass test with under any relevant test condition without audible
alternation.

The MPEG group
coder tests in the late 1990s showed that Dolby Digital
was not sonically transparent and generally inferior to
other, more modern codecs.


At all bit rates?


As typically used.

And again, what is the definition of
"sonically transparent"--when people with Radio Shack
stereos can tell the difference?


Relevant tests are used with selected, trained listeners. Listener
sensitivity is essential. Please see ITU recommendation BS 1116, which is
available through the web.
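For concreteness (my sketch, not anything prescribed by BS 1116): a forced-choice blind trial is typically scored against the guessing hypothesis with a binomial test, along these lines:

from math import comb

def abx_p_value(correct, trials):
    """Chance of scoring at least `correct` of `trials` in a forced-choice
    ABX test if the listener is only guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

print(abx_p_value(12, 16))   # ~0.038 -- conventionally significant
print(abx_p_value(9, 16))    # ~0.40  -- consistent with guessing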

Maybe you're a different Arny Krueger


Nope. Just older and wiser. ;-)



Yeah, I'll buy the older part. If you're saying blind
test results have to be taken with a grain of salt then
I'll buy the wiser part.


Let's put it this way - every test result must be considered in its context.
Sighted tests for signal quality are generally so invalid that they need not
be taken seriously at all. Blind test results are at least worth
considering.

and have come to realize that these blind tests are
ineffective.


How so? The fact that AC-3 was a substandard codec
based on the MPEG Group's blind tests was pretty well
publicized by the MPEG and the AES.


Perhaps you can show us a cite for these results, then.


Check the AES web site. They were published in the JAES some years back.

This
was no doubt a bit of an embarrassment to Dolby. Dolby
has been doing their own blind tests for decades.


Dolby subsequently came out with a new multimodal system
for coding and decoding audio known as Dolby TrueHD. In
some modes, TrueHD is definitely sonically transparent.


Again, you are using a term without defining its meaning.


Which term? I've used tons of them. I would expect that the readers of the
newsgroups we are posting to know what most common audio terms mean.


  #105   Arny Krueger
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

"Arny Krueger" wrote in message

"trotsky" wrote in message
news%Txj.55219$yE1.36314@attbi_s21
Arny Krueger wrote:
"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21


Did Dolby do their homework and
do sufficient blind tests to "prove" that their codec
was transparent to people?


AFAIK, Dolby never claimed that DD was perfectly
transparent.


And your working definition for "perfectly transparent"
is what, exactly?


Passes a bypass test with under any relevant test
condition without audible alternation.


Correction:

Passes a bypass test with under any relevant test
condition without audible alteration.




  #106   jwvm
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

On Feb 29, 7:15 am, "William Sommerwerck"
wrote:

snip

Not so. Think about it. Saturation (which is what the amplitude of the
color-difference signal represents) "never" (well, hardly ever) changes as
rapidly as luminance.


You need to be careful here. While saturation is a function of color
differences, it needs to be normalized by the intensity. Simple color
differences are functions of both saturation and luminosity. Consider,
for example, calculating saturation in the HSI coordinate system as
illustrated in this link:

http://homepages.inf.ed.ac.uk/rbf/CV....html#tth_sEc3

Similar examples for other coordinate systems can be found here:

http://en.wikipedia.org/wiki/Saturation_(color_theory)
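A minimal Python sketch of the normalization being described, using the HSI saturation formula (sample values are mine):

def hsi_saturation(r, g, b):
    """HSI saturation: color difference normalized by intensity, so scaling
    the light level (r, g, b) -> (k*r, k*g, k*b) leaves S unchanged."""
    intensity = (r + g + b) / 3.0
    if intensity == 0.0:
        return 0.0
    return 1.0 - min(r, g, b) / intensity

print(hsi_saturation(0.8, 0.1, 0.1))     # 0.70 for a red surface
print(hsi_saturation(0.4, 0.05, 0.05))   # 0.70 at half the light: same S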
  #107   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

Not so. Think about it. Saturation (which is what the amplitude of the
color-difference signal represents) "never" (well, hardly ever) changes
as rapidly as luminance.


You need to be careful here. While saturation is a function of color
differences, it needs to be normalized by the intensity.


That is PRECISELY the point. Subtracting Y from R, G, or B provides the
normalization and produces a saturation -- color-difference -- signal from
which brightness information has been removed. This is what we want.

More than 50 years ago, Electronics magazine published pictures of the NTSC
color signals, based on real scenes. The colors are completely "flat" --
they are of constant saturation, with no variation in brightness.

It's important to understand that NTSC and PAL are non-redundant systems.
None of the three signals contains information present in another.


  #108   trotsky
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

Arny Krueger wrote:
"trotsky" wrote in message
news%Txj.55219$yE1.36314@attbi_s21
Arny Krueger wrote:
"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21


Did Dolby do their homework and
do sufficient blind tests to "prove" that their codec
was transparent to people?


AFAIK, Dolby never claimed that DD was perfectly
transparent.


And your working definition for "perfectly transparent"
is what, exactly?


Passes a bypass test with under any relevant test condition without audible
alternation.



A) That sentence makes no sense grammatically, and B) you have provided
no definition for "passes a bypass test". You keep making the same
mistakes over and over.


The MPEG group
coder tests in the late 1990s showed that Dolby Digital
was not sonically transparent and generally inferior to
other, more modern codecs.


At all bit rates?


As typically used.

And again, what is the definition of
"sonically transparent"--when people with Radio Shack
stereos can tell the difference?


Relevant tests are used with selected, trained listeners. Listener
sensitivity is essential. Please see ITU recommendation BS 1116, which is
available through the web.



This is silly. I'll ask again: can you define what you're talking
about? What's a passing grade for "sonically transparent"--100%? 90%?
If 80% of the trained listeners can't tell the difference, is it then
"sonically transparent"?

Then there's Bill Sommerwerck's point: how good is the resolution of the
equipment they're using? If the speakers are mediocre, that will skew
the results.


Maybe you're a different Arny Krueger
Nope. Just older and wiser. ;-)


Yeah, I'll buy the older part. If you're saying blind
test results have to be taken with a grain of salt then
I'll buy the wiser part.


Let's put it this way - every test result must be considered in its context.
Sighted tests for signal quality are generally so invalid that they need not
be taken seriously at all. Blind test results are at least worth
considering.



Sure, if you believe in Jesus I guess you can believe in the vagaries of
blind testing.


and have come to realize that these blind tests are
ineffective.


How so? The fact that AC-3 was a substandard codec
based on the MPEG Group's blind tests was pretty well
publicized by the MPEG and the AES.


Perhaps you can show us a cite for these results, then.


Check the AES web site. They were published in the JAES some years back.



Nice try, Arny. AES charges $5 for members and $20 for non-members for
each paper on the topic. Again, please provide a cite for what you're
talking about.


This
was no doubt a bit of an embarrassment to Dolby. Dolby
has been doing their own blind tests for decades.


Dolby subsequently came out with a new multimodal system
for coding and decoding audio known as Dolby TrueHD. In
some modes, TrueHD is definitely sonically transparent.


Again, you are using a term without defining its meaning.


Which term? I've used tons of them. I would expect that the readers of the
newsgroups we are posting to know what most common audio terms mean.



You have no working definition of "sonically transparent".
Intellectually, you are about as credible as a crack addict. Are you a
crack addict?
  #109   trotsky
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

Arny Krueger wrote:
"Arny Krueger" wrote in message

"trotsky" wrote in message
news%Txj.55219$yE1.36314@attbi_s21
Arny Krueger wrote:
"trotsky" wrote in message
news:MbDxj.54004$yE1.27091@attbi_s21
Did Dolby do their homework and
do sufficient blind tests to "prove" that their codec
was transparent to people?
AFAIK, Dolby never claimed that DD was perfectly
transparent.
And your working definition for "perfectly transparent"
is what, exactly?

Passes a bypass test with under any relevant test
condition without audible alternation.


Correction:

Passes a bypass test with under any relevant test
condition without audible alteration.



You're still not speaking English.
  #110   Richard Crowley
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

"William Sommerwerck" wrote ...
Not so. Think about it. Saturation (which is what the amplitude of the
color-difference signal represents) "never" (well, hardly ever) changes
as rapidly as luminance.


You need to be careful here. While saturation is a function of color
differences, it needs to be normalized by the intensity.


That is PRECISELY the point. Subtracting Y from
R, G, or B provides the normalization and produces a
saturation -- color-difference -- signal from which
brightness information has been removed. This is what
we want.


The original reason for even matrixing R-G-B into Y-*-*
was to preserve monochrome compatibility for those who
have B&W receivers. And then to produce a color signal
that could be relatively easily compressed and encoded
onto a subcarrier and then decoded at the receiver.

This is not necessarily "what we want" for those who prefer
their video uncompressed and uncompromised. It was a
kludge workaround to fit the 15-pound color signal into the
5-pound monochrome sack (channel bandwidth).

More than 50 years ago, Electronics magazine published
pictures of the NTSC color signals, based on real scenes.
The colors are completely "flat" -- they are of constant
saturation, with no variation in brightness.


Then you were looking at the color-difference signals (Pb,
Pr, etc.) after the luminance (Y) had been removed. We
will have to disagree whether to call those signals "color".
They are maybe "color difference" at best. If you saw the
original Red, Green, Blue signals, you'd see they have plenty of contrast.

It's important to understand that NTSC and PAL are
non-redundant systems. None of the three signals contains
information present in another.


It is equally important to remember that in NTSC and PAL,
the color-difference part of the signal is artificially frequency-
limited to save bandwidth during transmission/storage. Any
time you execute this kind of lossy compression, you irretrievably
lose information. Whether you are talking about audio or video.
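A toy demonstration of that point (mine, in the spirit of the argument above): once the upper part of a signal's spectrum has been discarded, nothing downstream can reconstruct the original:

import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(256)

spectrum = np.fft.rfft(signal)
spectrum[len(spectrum) // 2:] = 0.0        # crude low-pass: drop the top half
limited = np.fft.irfft(spectrum, n=256)

print(np.allclose(signal, limited))        # False: the detail is gone for good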




  #111   Scott Dorsey
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

Richard Crowley wrote:

The original reason for even matrixing R-G-B into Y-*-*
was to preserve monochrome compatibility for those who
have B&W receivers. And then to produce a color signal
that could be relatively easily compressed and encoded
onto a subcarrier and then decoded at the receiver.


There's really no future in it. People don't really _want_ color
anyway. I'm waiting until it's perfected before I buy a color set.
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."
  #112   Richard Crowley
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

"Scott Dorsey" wrote ...
Richard Crowley wrote:
The original reason for even matrixing R-G-B into Y-*-*
was to preserve monochrome compatibility for those who
have B&W receivers. And then to produce a color signal
that could be relatively easily compressed and encoded
onto a subcarrier and then decoded at the receiver.


There's really no future in it. People don't really _want_ color
anyway. I'm waiting until it's perfected before I buy a color set.


You missed the peak. Even as screen sizes increase and we
get "High Definition" 16x9 video, lossy compression is being
cranked up and turning video into watery puddles of what
used to be pictures.

I was trying to watch "Bone Detectives" on the Discovery
Channel on DishNetwork last night and the compression
was so high that it couldn't even keep up with the guy
walking across the sand. It was almost un-watchable on
my 13-inch video monitor. It would have looked like water-
damaged wallpaper on a big-screen TV.

Given the decline of technical quality, along with the decline
of programming worth watching, I'm letting my satellite
subscription just expire.


  #113   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

"Richard Crowley" wrote in message
...
"William Sommerwerck" wrote ...
Not so. Think about it. Saturation (which is what the amplitude of the
color-difference signal represents) "never" (well, hardly ever) changes
as rapidly as luminance.


You need to be careful here. While saturation is a function of color
differences, it needs to be normalized by the intensity.


That is PRECISELY the point. Subtracting Y from
R, G, or B provides the normalization and produces a
saturation -- color-difference -- signal from which
brightness information has been removed. This is what
we want.


The original reason for even matrixing R-G-B into Y-*-*
was to preserve monochrome compatibility for those who
have B&W receivers. And then to produce a color signal
that could be relatively easily compressed and encoded
onto a subcarrier and then decoded at the receiver.

This is not necessarily "what we want" for those who prefer
their video uncompressed and uncompromised. It was a
kludge workaround to fit the 15-pound color signal into the
5-pound monochrome sack (channel bandwidth).

More than 50 years ago, Electronics magazine published
pictures of the NTSC color signals, based on real scenes.
The colors are completely "flat" -- they are of constant
saturation, with no variation in brightness.


Then you were looking at the color-difference signals (Pb,
Pr, etc.) after the luminance (Y) had been removed. We
will have to disagree whether to call those signals "color".
They are maybe "color difference" at best. If you saw the
original Red, Green, Blue signals, you'd see they have plenty of contrast.

It's important to understand that NTSC and PAL are
non-redundant systems. None of the three signals contains
information present in another.


It is equally important to remember that in NTSC and PAL,
the color-difference part of the signal is artificially frequency-
limited to save bandwidth during transmission/storage. Any
time you execute this kind of lossy compression, you irretrievably
lose information. Whether you are talking about audio or video.


I'm not going to beat this to death, because what I've said is 100% correct,
and a bit of thought will confirm it.

Simply limiting the bandwidth of a signal is not "compression" in any
ordinary sense. The real compression -- which is not lossy -- is subtracting
Y from the primary color signals. This permits the color-difference signals
to more "advantageously" use their limited bandwidth.

I've stopped discussing this. A few nights from now, when you're mulling
this over in bed, and the light goes on, you can post a "Oh, yeah... Now I
get it." response.


  #114   Richard Crowley
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

"William Sommerwerck" wrote ...
I'm not going to beat this to death, because what I've said is 100%
correct, and a bit of thought will confirm it.

Simply limiting the bandwidth of a signal is not "compression" in any
ordinary sense.


It most certainly is information compression in the truest sense.
The people that worked on the telephone system have known
it for nearly a century. Perhaps you need a bit more thought.

The real compression -- which is not lossy -- is subtracting
Y from the primary color signals. This permits the color-difference
signals to more "advantageously" use their limited bandwidth.


That is not compression at all. That is simply changing the
format of the information. Your view of this appears to be
fundamentally incorrect.

I've stopped discussing this. A few nights from now, when you're mulling
this over in bed, and the light goes on, you can post a "Oh, yeah... Now I
get it." response.


Perhaps YOU will have that experience. I don't need it.


  #115   Paul Stamler
Posted to rec.audio.pro
Subject: (now) color difference signals


"Richard Crowley" wrote in message
...
"Scott Dorsey" wrote ...

There's really no future in it. People don't really _want_ color
anyway. I'm waiting until it's perfected before I buy a color set.


You missed the peak. Even as screen sizes increase and we
get "High Definition" 16x9 video, lossy compression is being
cranked up and turning video into watery puddles of what
used to be pictures.


No kidding. I've watched CNN a couple of times in the last month
(primaries), on a large screen, and it was absolutely unwatchable. The
commentator's mouth looked like a squashed blintz constantly changing in
shape as its borders dissolved and re-formed. If I'd seen that in the
Sixties at a light show I would have thought "groovy", but as it was I got
queasy and left. Digital TV makes everything look like a bad acid trip.

It didn't help that the audio was badly out of sync.

Peace,
Paul




  #116   William Sommerwerck
Posted to rec.audio.pro
Subject: (now) color difference signals

No kidding. I've watched CNN a couple of times in the last month
(primaries), on a large screen, and it was absolutely unwatchable. The
commentator's mouth looked like a squashed blintz constantly changing in
shape as its borders dissolved and re-formed. If I'd seen that in the
Sixties at a light show I would have thought "groovy", but as it was I got
queasy and left. Digital TV makes everything look like a bad acid trip.


How large is large? I own two IDTVs (32" and 36"), and their image quality
is generally quite good.


  #117   William Sommerwerck
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

"Richard Crowley" wrote in message
...
"William Sommerwerck" wrote ...


The real compression -- which is not lossy -- is subtracting
Y from the primary color signals. This permits the color-difference
signals to more "advantageously" use their limited bandwidth.


That is not compression at all. That is simply changing the
format of the information. Your view of this appears to be
fundamentally incorrect.


Changing the format is one way of presenting the information in a
more-compact, more-useful fashion.

The "compression" produced by subtracting the luminance is the removal of
redundant information.


I've stopped discussing this. A few nights from now, when you're mulling
this over in bed, and the light goes on, you can post a "Oh, yeah... Now I
get it." response.


Perhaps YOU will have that experience. I don't need it.


You will. YOU WILL...


  #118   jwvm
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

On Feb 28, 1:56 pm, (Scott Dorsey) wrote:
In article , pj wrote:
Scott Dorsey wrote:
William Sommerwerck wrote:
Since the s-video output and the composite output are both NTSC,
it is impossible for either the s-video output or the composite output
to have *more* output than the NTSC output -- they *ARE* NTSC
outputs.
This might be true in practice, but "it ain't necessarily so".


How would they not be NTSC?


A case for S-Video in preference to Composite:


Oh, there are many strong cases for S-Video over composite. But both are
NTSC. The S-Video is also NTSC, it's just not RS-170.
--scott

--
"C'est un Nagra. C'est suisse, et tres, tres precis."


Unless the S-video signal is based on the PAL standard! :-)
  #119   G-squared
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: (now) color difference signals

On Feb 29, 1:51 pm, "William Sommerwerck"
wrote:
"Richard Crowley" wrote in message

...

"William Sommerwerck" *wrote ...
The real compression -- which is not lossy -- is subtracting
Y from the primary color signals. This permits the color-difference
signals to more "advantageously" use their limited bandwidth.

That is not compression at all. That is simply changing the
format of the information. Your view of this appears to be
fundamentally incorrect.


Changing the format is one way of presenting the information in a
more-compact, more-useful fashion.

The "compression" produced by subtracting the luminance is the removal of
redundant information.



I've stopped discussing this. A few nights from now, when you're mulling
this over in bed, and the light goes on, you can post a "Oh, yeah... Now I
get it." response.

Perhaps YOU will have that experience. I don't need it.


You will. YOU WILL...


Just FYI, the transcoding from RGB to Y, R-Y, B-Y is a lossless
transform and changes the bandwidth not one iota _until_ you run
the components through the bandpass filters on the way to the balanced
modulators. Also, there most certainly is equiband encoding going on.
I was working on a BetaCam SP just yesterday and the filters in the
encoder have the same part numbers. How could they be different? The
Sony broadcast cameras are the same story, as is the Accom D-122
digital encoder -- all of which were VERY common in Hollywood. You'd
have a much harder time finding a true IQ encoder. As I said earlier,
the last IQ encoder I saw was in an RCA studio camera from 1976.

GG
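A minimal sketch of the losslessness claim (the matrix coefficients are the standard NTSC ones; the sample value is mine): RGB to Y, R-Y, B-Y is an invertible change of basis, so the round trip is exact until the bandpass filters run:

import numpy as np

M = np.array([
    [ 0.299,  0.587,  0.114],   # Y
    [ 0.701, -0.587, -0.114],   # R - Y
    [-0.299, -0.587,  0.886],   # B - Y
])
M_inv = np.linalg.inv(M)

rgb = np.array([0.25, 0.50, 0.75])
print(np.allclose(M_inv @ (M @ rgb), rgb))   # True: nothing lost yet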
  #120   Scott Dorsey
Posted to alt.video.digital-tv,rec.arts.tv,rec.audio.pro,sci.engr.television.advanced,alt.tv.tech.hdtv
Subject: Harsh, "aliased" sound with digital TV converter box.

jwvm wrote:
On Feb 28, 1:56 pm, (Scott Dorsey) wrote:
In article , pj wrote:
Scott Dorsey wrote:
William Sommerwerck wrote:
Since the s-video output and the composite output are both NTSC,
it is impossible for either the s-video output or the composite output
to have *more* output than the NTSC output -- they *ARE* NTSC
outputs.
This might be true in practice, but "it ain't necessarily so".


How would they not be NTSC?


A case for S-Video in preference to Composite:


Oh, there are many strong cases for S-Video over composite. But both are
NTSC. The S-Video is also NTSC, it's just not RS-170.


Unless the S-video signal is based on the PAL standard! :-)


Or SECAM for that matter!
--scott
--
"C'est un Nagra. C'est suisse, et tres, tres precis."