In Reply to: RE: PerfectWave I2S HDMI cable claims don't make sense to me posted by skytag on July 28, 2009 at 23:41:21
While this is an interesting argument, I'll add something you may not have considered: jitter, i.e., the timing of the digital signal. Today's digital devices use edge triggering to read the data. Cables change the rise time of the edges, and they do so unevenly depending on frequency. When the rise time changes, the slope of the edge changes, and that shifts the point in time where the signal crosses the receiver's threshold. So as the signals come down the cable, the moment they are recognized as a 1 or a 0 moves with the varying rise times, and this jitters the signal. I don't have time this morning to go into depth, but this is just one (major) way a cable has a significant effect on the way digital "sounds". Trust me, it's not hype.
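The mechanism described above can be sketched numerically. This is my own minimal illustration, not anything from the post: treat an edge as a linear ramp from zero to full swing over the rise time, and assume the receiver registers the bit the instant the ramp crosses a fixed threshold. If the cable stretches the rise time differently on different edges, the crossing instant moves, and that movement is the jitter being described.

```python
def crossing_time_ps(edge_start_ps, rise_time_ps, threshold_frac=0.5):
    """Instant at which a linear 0-to-full-swing ramp crosses the threshold.

    Illustrative model only: real HDMI edges are not perfectly linear,
    and real receiver thresholds are not ideal. All numbers are made up.
    """
    return edge_start_ps + rise_time_ps * threshold_frac

# Two edges launched at the same instant, but the cable yields
# different rise times (e.g. due to frequency-dependent loss):
fast_edge = crossing_time_ps(0, rise_time_ps=100)   # crosses at 50 ps
slow_edge = crossing_time_ps(0, rise_time_ps=300)   # crosses at 150 ps

# The data is identical, yet the detection instants differ:
print(slow_edge - fast_edge)  # 100.0 (ps of timing shift)
```

The point of the sketch is that what the receiver sees is the threshold crossing, not the launch instant, so anything that changes edge shape changes timing.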
Follow Ups:
Somehow I have serious doubts that the amount of jitter introduced by what you describe in a few feet of HDMI cable would even come close to being detectable by the human eye or ear. Plus, I would expect jitter in the devices in the chain -- player, processor, and in the case of video, the TV -- to far, far exceed anything introduced by 3' or 6' of HDMI cable. So while it might be an issue at a theoretical level, I can't imagine the benefit I'd get from spending $1000+ (output from source and output from processor for video) would even be detectable, much less give me real bang for the buck over the less than $15 I have invested now.
Understood. You'll just have to see what you think if you have the chance.
I have to agree. Digital is not variance in voltages, which means there are no variations of the original signal... there are just 0's and 1's, and there's only one way to interpret each as a 1 or a 0! If you're running HDCP, then you can guarantee that the receiver is getting 100% of the original signal, sort of like parity checking... That's how digital works: 16-24 bits of 0's and 1's, not kind of a 0 or kind of a 1. It's either/or. NOW, if you happen to be missing a 0 or a 1, then yes, you have a corrupt signal; in the PC world this becomes unusable data, or you hear a dramatic hiccup, pop, or digital noise. That isn't acceptable by any means, and no one can convince me it's a common occurrence, or no one would use digital.

So once again, explain it to me... HDMI is a digital bus that carries a digital signal; you can't have loss or you don't have a signal, so forget that. But a bus that can manipulate the digital signal before it reaches the other DSP isn't just a bus then; it's not just carrying the signal. On a PC mainboard you have a northbridge that serves as a traffic cop between the RAM, CPU, PCI-e, and southbridge (before i7 it was also the memory controller). The FSB wasn't dictated by the wires between the nodes; it was dictated by the nodes. In this situation, those nodes are the BD player and the DSP in the TV.
It's the old "bits are bits" argument that has been around for a long time. What you have to remember is that bits are just bits but there's much more to the story than just reading ones and zeros. There are timing issues, extracting clocks, etc. Digital isn't the simplistic clear cut format you describe.
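To make the "timing issues" point concrete, here is a small sketch of my own, with made-up numbers: even when every bit arrives intact, a receiver that derives its conversion timing from the incoming edges inherits whatever timing error those edges carry.

```python
NOMINAL_PERIOD_PS = 10_000  # hypothetical bit-clock period, in picoseconds

# Per-edge timing error in picoseconds (purely illustrative values):
edge_jitter_ps = [0, 12, -8, 15, -10, 5]

ideal_edges = [i * NOMINAL_PERIOD_PS for i in range(len(edge_jitter_ps))]
actual_edges = [t + j for t, j in zip(ideal_edges, edge_jitter_ps)]

# The errors are tiny compared to the bit period, so every bit is still
# read correctly -- "bits are bits" holds. But a clock extracted from
# these edges wobbles by the same amounts:
peak_to_peak_ps = max(edge_jitter_ps) - min(edge_jitter_ps)
print(peak_to_peak_ps)  # 25
```

So the data and the timing are two separate questions: the bits can be perfect while the recovered clock is not.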
Do a search for Robert Harley's great article on why identical digital data sounds different. It's well thought out, technically accurate and will help you understand.