Audio Asylum Thread Printer
In Reply to: Is there a critical length for glass toslink cables? posted by MoodyDragon on July 17, 2004 at 19:45:05:
...for either coax or optical cable. The rule of thumb suggests shorter is always better for any type of cable, coax or optical :)
Follow Ups:
Here's a link to a thread that suggests that shorter is not always better for a digital RCA cable. I am not sure if the same reasoning would hold for a glass cable in terms of reflections (no impedance mismatch!).
slope
- http://db.audioasylum.com/cgi/m.mpl?forum=digital&n=81904&highlight=slope&session= (Open in New Window)
Given that the period of the digital signal is only about 200 nanoseconds long, and sometimes there are two transitions (going low to high and back to low, or vice versa) within that 200 nanosecond period because the signal is a Manchester type, it becomes obvious that a 30 nanosecond rise time might not be correct. If it were true, the signal would look more like a triangle wave than a square wave. Also, I found a post by Audioengr on audiogon.com saying that the rise time for the digital signal is only 10-12 nanoseconds, and one post (second link) saying that length for digital cables is not important. Mmmhhhhh......
http://forum.audiogon.com/cgi-bin/fr.pl?fcabl&975720190&openflup&102&4#102
http://forum.audiogon.com/cgi-bin/fr.pl?fcabl&975720190&openflup&95&4#95
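A quick back-of-envelope check of the timing argument above. The ~200 ns period, 30 ns rise time, and 10-12 ns figure are taken from the posts; treat them as rough assumptions rather than measured values:

```python
# Rise time vs. S/PDIF bit period, using the rough figures from the thread.
bit_period_ns = 200.0   # approximate period cited above (assumption)
slow_rise_ns = 30.0     # rise time questioned above
fast_rise_ns = 11.0     # ~10-12 ns figure attributed to Audioengr

# Manchester-style coding can put two transitions inside one period,
# so the shortest flat interval is about half a period.
half_cell_ns = bit_period_ns / 2

# Fraction of a half-cell consumed by one edge:
print(f"30 ns edge uses {slow_rise_ns / half_cell_ns:.0%} of a half-cell")
print(f"11 ns edge uses {fast_rise_ns / half_cell_ns:.0%} of a half-cell")
```

With a 30 ns rise and a matching fall inside a 100 ns half-cell, most of the cell is spent slewing, which is the "triangle wave" point made above; an 11 ns edge leaves the waveform much squarer.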
Either way, when it comes to cable length, you also have to take transmission line theory into account. Transmission line theory states that the longer the cable is compared with the period of the frequency it is carrying, the less the impedance mismatch matters (I wonder why Jon didn't catch that).
Also, by increasing the cable length, you increase the cable's capacitance, resistance, and exposure to EMI against the signal it is carrying :)
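For what it's worth, a common rule of thumb says a cable starts behaving as a transmission line (so reflections from a mismatch start to matter) once its length exceeds roughly one sixth of the distance a signal edge travels during its rise time. A sketch, using an assumed 0.66 velocity factor typical of coax and the two rise-time figures from the thread:

```python
# Rough "electrically short" threshold for an S/PDIF coax run.
# Rule of thumb: l_crit ~ (v * t_rise) / 6. Velocity factor is an assumption.
c = 3.0e8                 # speed of light, m/s
velocity_factor = 0.66    # typical coax dielectric (assumption)
v = velocity_factor * c   # propagation speed in the cable, m/s

for t_rise_ns in (30.0, 11.0):
    l_crit_m = v * (t_rise_ns * 1e-9) / 6
    print(f"rise time {t_rise_ns:4.0f} ns -> mismatch starts to matter "
          f"beyond ~{l_crit_m:.2f} m")
```

On these assumed numbers, a 30 ns edge gives a threshold near 1 m, while an 11 ns edge pushes it down to roughly a third of a metre, so faster edges make even short cables "long".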
Hi Tony: I'm no expert on these things, so take my comments in that context. Your last comment, "Transmission line theory states that the longer the cable is compared with the period of the frequency it is carrying, the less the impedance mismatch matters", seems to validate a longer cable for the application being discussed.
Also, your statement "by increasing the cable length, you increase the cable's capacitance, resistance, and exposure to EMI" appears to be a valid concern for an analogue signal, but would it be equally important for a digital signal??
Cheers
Hi Slope
The first statement you quoted should read: "Transmission line theory states that the SHORTER the cable is compared with the period of the frequency it is carrying, the less the impedance mismatch matters." So the shorter the cable is, the less the mismatch will matter.
The second statement is also very valid for digital signals. Since a digital signal has very sharp corners, the cable capacitance should be kept to an absolute minimum: excess capacitance will round off the corners, which will throw off the signal's timing :)
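A rough lumped-RC sketch of that rounding. This only applies to electrically short cables (a properly terminated transmission line doesn't act as a simple lumped capacitor), and the 75 ohm impedance and 100 pF/m figure are assumptions, not specs:

```python
# Corner rounding from cable capacitance, lumped-RC approximation.
# Only indicative for electrically short runs; values are assumptions.
r_source_ohm = 75.0     # assumed source/termination impedance
cap_per_m_pf = 100.0    # assumed cable capacitance per metre

for length_m in (0.5, 1.0, 5.0):
    c_farads = cap_per_m_pf * 1e-12 * length_m
    # 10-90% rise time of a single-pole RC filter: t_r = 2.2 * R * C
    t_r_ns = 2.2 * r_source_ohm * c_farads * 1e9
    print(f"{length_m:3.1f} m -> added rise time ~{t_r_ns:.1f} ns")
```

Even on these crude numbers, the added rise time grows in direct proportion to length, which is the sense in which longer cable means rounder corners.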
Yah, I was wondering if you were gonna change that.. As for corners, they REALLY have to be rounded to change signal timing.. The logic thresholds aren't near the rails, but should be closer to mid level with hysteresis, both to minimize noise susceptibility..
It's a tradeoff...sharp corners, more reflection..
BTW...what is transmission line theory?? :-)
Hi John
As for the corners... with the digital signal output from a DVD/CD player only having 0.5 V amplitude (1 V p-p), there isn't too much room before reaching the threshold voltages. So errors should be kept to a minimum.
"BTW...what is transmission line theory?? :-)"
Jn
What I am talking about is amplitude invariant.. lower signal, lower thresholds.... a percentage of peak to peak... As a percentage of full amplitude, the corners don't come into play immediately.. You would have to review the receiver spec for thresholds and hysteresis to determine where it applies..
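A small sketch of the amplitude-relative threshold point above. The +/-10% hysteresis band is an assumption for illustration; a real receiver's datasheet would give the actual figures:

```python
# Mid-level thresholds with hysteresis, as a fraction of peak-to-peak.
# Hysteresis width is an assumed illustrative value, not a spec.
v_pp = 1.0                # 1 V p-p signal, per the post above
mid = v_pp / 2
hysteresis_frac = 0.10    # assumed +/-10% of p-p

v_rising = mid + hysteresis_frac * v_pp   # threshold for a rising edge
v_falling = mid - hysteresis_frac * v_pp  # threshold for a falling edge

print(f"rising threshold:  {v_rising:.2f} V")
print(f"falling threshold: {v_falling:.2f} V")
```

Because the thresholds scale with the signal and sit near mid-level, rounding confined to the extreme corners shifts timing little; what matters is the slope of the waveform as it crosses the mid-level band.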
means no further text. Apparently I'm supposed to have no more to say, or no desire to add anything past the subject text. If this were only true.. I have a lot on my mind.. kids... cars... wife... business...
There is also a theory that some of the optical receivers are prone to overload, and that a long optical cable will provide enough attenuation of the signal to avoid this overload. I believe someone posted on this in Digital Drive. If this were the case, then longer would be better for an optical cable!
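For a sense of scale on that attenuation idea, here is a sketch of optical loss versus length. The 0.2 dB/m figure is an assumption in the range often quoted for plastic optical fibre (glass fibre loss is far lower), not a measured value:

```python
# Optical power delivered vs. cable length, given a per-metre loss.
# The dB/m figure is an assumption for plastic optical fibre.
atten_db_per_m = 0.2

for length_m in (0.5, 2.0, 10.0):
    loss_db = atten_db_per_m * length_m
    power_frac = 10 ** (-loss_db / 10)   # dB -> linear power ratio
    print(f"{length_m:4.1f} m -> {loss_db:.1f} dB loss, "
          f"{power_frac:.0%} of launched power delivered")
```

On these assumed numbers a long plastic-fibre run does shave off a meaningful fraction of the launched power, so the overload theory is at least arithmetically plausible for lossy fibre; with low-loss glass the attenuation from length alone would be tiny.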
Hmmmm... an interesting thought... That would be a nice topic to look into... I make the assumption that the follow-on electronics are well designed, and that it would be a result of non-linear behaviour in the receiver's optical chip...
I tested fiber optic receivers in the past, but I don't recall any kind of saturation issue, or a test for that matter, that would catch high-signal overload issues... but at 600 dollars a pop, I was, shall we say, scared S'less to attempt any test out of spec range.... The possibility that an EMP pulse could introduce higher than normal signals into the fiber optic cable was not addressed..
Cheers, John
I was not implying an overload in the purely optical portion of the system, nor some sort of optically based phenomenon, but rather, a case of excessive gain in the stages following the opto-coupler portion, whether in the opto-coupler section itself, or subsequent ones.
It was "clear" to me you were, and I believe it to be a rather good thought... and I am not sure that anyone has really addressed that... Whether it's the optics, the electronics, or the combo, it's still a rather interesting avenue to pursue..