Audio Asylum Thread Printer
In Reply to: RE: RF coax posted by Jon Risch on August 04, 2012 at 22:35:40
Good article, Jon, and I even agree, which is why I've never tried them. But sometimes things in audio work out differently than I think they will... However, I have no problem continuing to give them a pass.
While not a TV fancier myself, I used to work in the same lab with both video and TV RF guys, and I doubt they would agree with you that "the levels of distortion that become visible are orders of magnitude higher than the levels of distortion that can be heard". The phrase most commonly heard, usually expressed in agony, was 'group delay'. Even the RF guys fought it: although their bandwidth was far less than the baseband video's, they had to design incredibly tricky, high-Q filters to stay out of the adjacent channels. I was an ignorant, unbiased, uncaring guy working on avionics and so was felt to be a good 'man in the street' test case. The thing that really struck me was just how good clean NTSC video looked compared to the picture on my home TV, and while I could see the problems, they didn't seem very important. Not too surprisingly, the video group leader was an audiophile...
Rick
Follow Ups:
FWIW, I did a stint as a video design engineer some years back. I was responsible for the design of an HDTV distribution amp, back when HDTV was first happening, and I also did a few other designs, among them a video SCH phase meter module that fit into a rack-mount mother chassis, etc.
So I do know what I am talking about. I myself SAW the difference between 8 bits and 10 bits of resolution (very little, and only with training and having it pointed out to me) on a near-perfect NTSC signal generated from a live television camera (we're talking way past cable or DVD resolution of an NTSC signal), using a hand-built studio-grade TV monitor. Past 10 bits, there wasn't ANYTHING visible as any sort of difference, even to trained and experienced eyes.
Did a similar experiment with HDTV signals several years ago when I visited an old friend from that industry, and while you would think that HDTV would respond to an even higher bit depth, in fact it was not even certain that there was a noticeable difference between 8 and 10 bits. The resolution of an HDTV signal is in the bandwidth, not the bit depth; the bit depth is what provides the dynamic range, literally the "how far down you can see" capability.
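The bit-depth-to-dynamic-range relationship mentioned above can be sketched with the textbook formula for ideal quantization SNR; this is a generic illustration (a full-scale sine through a uniform quantizer), not a measurement of any of the video gear described in this thread:

```python
# Ideal signal-to-quantization-noise ratio (dynamic range) for a
# linear ADC/DAC, full-scale sine input with uniform quantization:
#   SNR_dB ~= 6.02 * N + 1.76, where N is the bit depth.

def quantization_snr_db(bits: int) -> float:
    """Approximate ideal dynamic range in dB for a given bit depth."""
    return 6.02 * bits + 1.76

for bits in (8, 10, 12):
    print(f"{bits:2d} bits -> {quantization_snr_db(bits):.2f} dB")
```

Each added bit buys roughly 6 dB of "how far down you can see" (about 50 dB at 8 bits vs. about 62 dB at 10 bits), independent of the signal bandwidth, which is consistent with the point that HDTV's extra resolution lives in the bandwidth, not the bit depth.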
Like many consumer items (such as how many megapixels a digital camera has), video units with 10-, 12-, and even higher-bit-depth ADCs and DACs are mostly marketing BS. It looks better on the spec sheet, but doesn't provide a thing you can see. Similar to what goes on in audio, many of the 12-bit video ADC/DAC chips really only have 10 clean bits of resolution, and this is plenty.
So I stand by my statement; it was not made casually, but is based on real-world experience and actual visual testing.
Jon Risch
"The resolution of an HDTV signal is in the bandwidth, not the bit depth"
That makes sense; my understanding is that the toughest distortion problems in video are usually temporal, but distortion nonetheless. Temporal while still a signal, that is; they end up mapping to spatial problems when rendered.
I think audio suffers a similar fate, but one that's harder to sift out with our senses and minds. They let us know something's wrong, but not exactly what.
Guess that makes for an interesting hobby...
Regards, Rick