Audio Asylum Thread Printer: Get a view of an entire thread on one page
In Reply to: RE: 1.5m USB Cable = More Jitter posted by Dynobot on April 10, 2012 at 13:23:08
They are using a really cheap, crappy USB cable, so all bets are off.
Read this:
http://www.positive-feedback.com/Issue14/spdif.htm
This was independently proven in AB/X tests done by UHF magazine in Canada.
Follow Ups:
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
Assuming that it is an async USB interface, the other cable parameters are still important, such as common-mode rejection, shielding, etc. The test that TI did was about jitter and the eye pattern. That's what we are talking about here.
Edits: 04/12/12
People think so highly of themselves and their deductive reasoning that they disregard facts. Fact IS, Texas Instruments has the knowledge, money, technique, and experience to perform solid experiments, and they are indeed correct.
Anyone who would rather believe a magazine article over a published experiment by a world-renowned corporation is nuts.
Steve seems to wonder why people here disregard most of his claims as pure rhetoric. Well, if you are so blind as to dismiss solid experimental data in favor of profit-seeking propaganda, then you deserve to be doubted.
This is why, when I went searching for methods to improve my Linux machine and sound, I went to reputable sources like IBM and Texas Instruments. They helped me lower my OS jitter, increase process efficiency in Linux, and reduce jitter associated with USB cables.
Dynobots Audio
Music is the Bridge between Heaven and Earth - 音楽は天国と地球のかけ橋
Edits: 04/11/12
You forget that TI developed the first Adaptive USB interface chip, and they did a really poor job of it. It took them several iterations to get it right.
> > > You forget that TI developed the first Adaptive USB interface chip, and they did a really poor job of it. It took them several iterations to get it right.
But they did get it right, and they are still doing plenty of studies on USB as well as developing new methods to reduce USB jitter... the right way.
No doubt you believe in your abilities, but TI is more than a bunch of idiot engineers with crappy test equipment and methods. Surely they spend more on R&D than most audiophile DAC makers earn in a year, and their R&D stands to make them billions.
Pride is good to an extent; you should be proud of yourself and what you have accomplished. I myself think you have done an excellent job with your business, but if TI says one thing and you say another, don't be offended if I go with TI.
Dynobots Audio
Look, you need to understand what large companies are really like. I designed and managed design engineers for 16 years at Intel and was a design team lead on Pentium 2. I have been doing digital design for 36 years for various computer and computer peripheral companies. The number of really talented designers in this large population was actually surprisingly small. After modding Sony and Pioneer products for many years, I have a really good sense of the caliber of designers they get for consumer products.
I am not saying that the TI plots are not accurate. What I am saying is that a better cable of this length would have a more open eye-pattern. Better drivers at the transmitting end would help also. These are facts.
Big companies are not a panacea by a long shot. Often what happens is the really talented designers become managers too early and then the design work gets done by junior engineers. This keeps design costs down, and they can work them to death without complaints. It's a shame and a waste, IMO. Most big companies have major difficulty with innovation and making it happen. You will notice how long we have been waiting for TI to put out another USB chip. It was XMOS who put out the latest generation, not TI. TI could have done it.
> > > > > Look, you need to understand what large companies are really like.
You worked at Intel for 16 years.
I worked at Ford Design & Engineering for 21 years. Although Ford is named after a family, it's hardly a mom-and-pop business... ya think?
I know very well how large corporations work after spending more than two decades of my life in one...
Large corps are slow movers; big ships take time to turn. On the other hand, they do have talent and they do have solid processes.
> > > > Often what happens is the really talented designers become managers too early and then the design work gets done by junior engineers.
Perhaps at Intel, but not at Ford. Talented engineers are too valuable to move out of that position. People with less engineering talent but more people talent get moved up and promoted. To compensate, the talented engineers get nice raises and bonuses and actually make nearly as much or just as much as the manager. I have spent more than 20 years working with some really talented people, the likes of which I have yet to see on these boards... except for a few...
Dynobots Audio
Edits: 04/12/12
I found the distinct shape of the two eye patterns interesting after I blew up the images so I could see them.
The connection between the USB jitter as seen on an eye pattern and the jitter in the recovered audio clock is far from clear. USB is packetized, and there must be buffering involved to get over the gaps in data caused by the protocol. This can create jitter problems in the output audio samples even if there is zero jitter on the input. Contrast that with the situation with SPDIF, where the relationship between the SPDIF clock and the DAC word clock is quite simple: if somehow, magically, there could be no jitter on the incoming waveform, a clean clock would be readily available.
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
Good SPDIF DACs have as little measured jitter as async USB DACs, or even less.
Stereophile and other measurements of the outputs show this.
What I don't like is the dependency of USB DAC sound quality on the cable. The overall sonic character changes much more than with good SPDIF cables.
we would certainly see USB as the 'standard' for digital signal transmission. Industry has a need for the cheapest possible way to transmit data... this hits their bottom line in terms of MONEY.
But we know that AES/EBU and SPDIF are the standard in the professional world, this is the case for a reason.
Dynobots Audio
"But we know that AES/EBU and SPDIF are the standard in the professional world, this is the case for a reason."
These were designed by audio engineers with little knowledge of digital design or data communications. The AES/EBU system was designed for use over existing studio wiring and in utter ignorance that jitter issues might arise that could impact sound quality. SPDIF was a cheaper consumer version. AES/EBU and SPDIF are well established by reason of history, not technical merit. (All of this was according to the general AES focus on mass market rather than quality and its subservience to large scale industry and their marketing slogans, e.g. "Perfect sound forever." IMO the AES is a trade organization with little or no scientific credibility compared to real professional associations such as the IEEE.)
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
"For this theory to work, the particles have to be in a "superposition" of every possible state before they are measured. So instead of just representing ones and zeros, quantum systems can be both at once."
Dynobots Audio
So are you saying that USB 'is' superior to SPDIF?
I understand the "we do it this way because we always have..." mentality, but is USB truly superior?
If so, then IYO will we start to see USB-to-USB take over the industry for digital transmission? Not just the audiophile world, but industry that truly depends on the lowest jitter from point A to point B.
Dynobots Audio
The issue isn't the wire type, it's the method of flow control. The right way is for the DAC to pull data out of the transport rather than have the transport push a stream of timed samples to the DAC. With USB the right way is to use a proprietary protocol and driver for block mode transfer or to use the async protocol, which can use drivers provided by the operating system. With SPDIF the right way is to run an extra wire from the DAC back to the transport to slave the transport to the DAC's clock. If the wrong way is used, heroic measures will be needed to get good results. However, even with the right way results won't be excellent without careful implementation. A good system architecture is useless unless it is well implemented.
It's the entire system that counts and the way it is put together. It is not possible to break a system down into individual components and optimize each of these separately if one wants the best possible results. There is far too much interaction between components. A great SPDIF installation will beat a poor USB implementation and conversely. Note that when a "push" architecture is used the source, the cable and the sink are all in the signal path and hence critical to sound quality. When a "pull" architecture is used only the sink is in the signal path. The source and cable are not in the signal path and if they have an effect on sound quality it is only because of second order effects at the sink, i.e. various forms of leakage such as ground bounce and poor clock design.
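The "pull" arrangement for USB can be sketched as a feedback loop. This is a deliberate simplification of asynchronous-mode flow control (real implementations report a rate estimate through a feedback endpoint in a fixed-point format; the one-sample nudge policy and all numbers here are invented for illustration): the DAC drains its buffer on its own fixed clock and periodically tells the host how much to send, so delivery tracks the DAC's clock rather than the other way around.

```python
# Toy model of asynchronous ("pull") flow control. The DAC consumes
# samples on its own clock, which is slightly off the nominal rate;
# feedback nudges the host's per-frame delivery to keep the buffer
# hovering near a target fill. Illustrative numbers only.

TARGET_FILL = 100
NOMINAL_PER_FRAME = 44       # host default: 44 samples per 1 ms frame
DAC_RATE = 44.097            # DAC's true consumption per frame

fill = float(TARGET_FILL)
request = NOMINAL_PER_FRAME
history = []

for frame in range(2000):
    fill += request          # host sends what the DAC last asked for
    fill -= DAC_RATE         # DAC drains at its own clock rate
    if fill < TARGET_FILL - 1:       # running low: ask for one more
        request = NOMINAL_PER_FRAME + 1
    elif fill > TARGET_FILL + 1:     # running high: ask for one fewer
        request = NOMINAL_PER_FRAME - 1
    else:
        request = NOMINAL_PER_FRAME
    history.append(fill)

drift = max(history) - min(history)
print("buffer drift over 2 s (samples):", round(drift, 2))
```

The point is that the cable carries only data and rate requests; no timing on the wire ends up in the audio clock.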
Note that with both technologies, jitter on the cable is typically nearly as large as it can be to avoid data errors, since jitter is typically one of the factors limiting the maximum possible data rate. If the wire is conveying audio timing information because it is being run in push mode then the cable jitter will be 10 - 100 times larger than the jitter needed for full resolution reproduction out of a DAC. Jitter down at the level needed for audio quality is not readily measurable with any test equipment, and it certainly can't be seen on the screen of even an expensive scope. DACs that run in push mode must have a phase locked loop or other heroic measures to reduce the incoming jitter down to levels that will produce tolerable sound quality and these measures are neither cheap nor completely effective.
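The claim about jitter levels needed for full-resolution reproduction can be put on a back-of-envelope footing. For a full-scale sine at frequency f quantized to N bits, the steepest slope is 2*pi*f*A and one LSB spans 2A/2^N, so a timing error of 1/(pi * f * 2^N) shifts the waveform by one LSB at the zero crossing. This simple slope argument ignores the spectral character of the jitter, but it gives the right order of magnitude:

```python
import math

def lsb_jitter_limit(freq_hz: float, bits: int) -> float:
    """Timing error (seconds) that shifts a full-scale sine at freq_hz
    by one LSB at its steepest point: 1 / (pi * f * 2**N)."""
    return 1.0 / (math.pi * freq_hz * 2 ** bits)

# Roughly 243 ps at 16 bits / 20 kHz, and far less at higher bit depths,
# versus cable-level jitter budgets measured in nanoseconds.
for bits in (16, 20, 24):
    limit_ps = lsb_jitter_limit(20_000, bits) * 1e12
    print(f"{bits}-bit @ 20 kHz: {limit_ps:.1f} ps")
```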
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
> > > > With SPDIF the right way is to run an extra wire from the DAC back to the transport to slave the transport to the DAC's clock.
What IF both the DAC and the transport were slaved to the same word clock?
Dynobots Audio
Music is the Bridge between Heaven and Earth - 音楽は天国と地球のかけ橋
"What IF both the Dac and the transport were slaved to the same word-clock?"
If the word clock were external to both the DAC and the transport and separately buffered outputs were used then the jitter on the cable from the transport to the DAC would not affect the audio quality. However, the quality of the output would depend on the quality of the clock signal received by the DAC. In other words, the sound might not vary much with changes in the transport but it wouldn't be good unless the word clock was very high quality since the word clock would definitely be in the signal path. If the word clock is a separate wire that is isolated from other wires it won't have any signal dependent jitter, which is a problem with systems where the same wire is used for both clock and data. However, there will still be noise of various sorts, so it is unlikely that the word clock as received by the DAC will be as clean as the word clock at the actual oscillator. This problem can be avoided by locating the oscillator in the DAC.
Unfortunately, there is another problem with using an external word clock that affects delta-sigma DACs and other upsampling converters. These do the actual digital to analog conversion at a higher sampling rate than the word rate. If an external word clock is used then the received word clock must be multiplied up to the final master clock rate which will be some multiple of the word clock rate. There is no way to do this frequency multiplication without employing heroic measures (expensive, not completely effective).
The correct way to implement a system synchronized by a word clock is to run a high quality fixed crystal oscillator at the master clock frequency, and send this clean signal directly to the actual digital to analog converter. At the same time, buffer this oscillator signal and send it to a divide circuit (a digital counter) to divide the high master clock rate down to the output word clock. Then send this word clock out of the DAC to the transport. A DAC that works this way will have a word clock output, but not a word clock input.
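This divide-down scheme is easy to sketch. Assuming a master clock at 256 times the word rate (the ratio mentioned later in the thread; other multiples work the same way), a counter simply selects every 256th master-clock edge as a word-clock edge. Division adds no jitter of its own, since every word-clock edge coincides exactly with a crystal edge; going the other direction, multiplying a received word clock up to a master clock, is what forces a PLL into the design.

```python
# Derive a word clock from a fixed crystal master clock by division.
# Each word-clock edge lands exactly on a master-clock edge, so the
# divided clock inherits the crystal's purity and adds no jitter.

MC_RATIO = 256                     # master clock = 256 x word rate

def word_clock_edges(master_edges, ratio=MC_RATIO):
    """Select every ratio-th master-clock edge as a word-clock edge."""
    return master_edges[::ratio]

fs = 44_100                        # word (sample) rate, Hz
mc = fs * MC_RATIO                 # 11,289,600 Hz crystal
master_edges = [i / mc for i in range(10 * MC_RATIO)]
wc = word_clock_edges(master_edges)

print(f"master clock: {mc / 1e6:.4f} MHz")
print(f"word-clock period: {(wc[1] - wc[0]) * 1e6:.4f} us")  # 1/44100 s
```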
I don't believe it is possible to tell whether a given DAC has implemented the clock architecture correctly from reading spec sheets and manuals. At the very least one has to open up the device and do a little reverse engineering, something that can be difficult these days due to a high degree of integration, programmable gate arrays, etc. One can look for crystals and ascertain their frequencies, etc. (I haven't reverse engineered any audio devices, but I have reverse engineered data comm devices as part of patent lawsuits, which involved a combination of various system tests, measurements, reading of chip numbers with a flashlight and magnifying glass, Googling, reading various company literature, technical, legal, and marketing.)
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
"the jitter on the cable from the transport to the DAC would not affect the audio quality"
True, but the cable still might affect audio quality due to common-mode noise or RFI differences from one cable to the next.
In practice, this is what happens, and I can prove it with a USB common-mode filter.
Common-mode noise on the USB cable shouldn't couple into the DAC. It should be blocked by the USB receiver circuitry and the buffering circuitry inside of the DAC that connects the USB receiver to the converter.
As to RFI, the cable might act as an antenna and radiate EMI. The DAC and other audio components and cabling should suppress this interference which isn't in the audio band.
In either case, if the cable affects the sound quality it is not the fault of the cable, it's due to a defect downstream of the cable.
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
I have the full PDF stored on my computer and will make it available on my site. Recently they have been doing some housecleaning, so the original document is a little hard to find.
I think a word-clock generator shared between the source and the DAC would really resolve the jitter issue.
Dynobots Audio
Word clock will not solve it. If you use WC to sync a system, then by definition the clock in that system must be a PLL. WC frequency is not master clock frequency. It's the MC, which is typically 256 times the WC, that must have low jitter.
This is why in my discontinued Pace-Car reclocker, I distributed Master Clock not WC. If WC is used to control a local oscillator in the DAC, the jitter will be significantly higher than using a Master Clock in the DAC and distributing it to the source.
Steve N.
Forget AB/X; we are talking about a world-renowned engineering and technology company with tons of money and staffs of engineers with way more knowledge and experience than P-Feedback. Surely you cannot think Texas Instruments is wrong or doesn't know what they are doing.
Fact IS, even IF a $1000 USB cable were used, the overall results would still be the same. Shorter runs of wire would still give small jitter numbers. IF the quality of wire were an issue we would see even more reduced jitter, but the results would still be consistent across the board.
Dynobots Audio
Edits: 04/11/12
"Fact IS, even IF a $1000 USB cable was used the overall results will still be the same."
No, they would not. The eye would be more open and the jitter component smaller. The difference between a long $1K cable and a long $20 cable is huge. The losses, dielectric absorption, metallurgical discontinuity reflections and impedance variations are the reason.
Steve N.
I'm sure that Belden has plenty of inexpensive cable that has a very nice eye pattern in lengths that fit within the USB protocol's timing budget. There is no science behind the mega-priced USB cables; it's marketing BS. The people marketing these cables undoubtedly lack sufficient knowledge of cable design, transmission line theory, driver and receiver circuitry, logic design, etc. There is real expertise in digital cables, but it resides in IC chip manufacturers such as Intel and cable manufacturers such as Belden.
Real engineers do not associate themselves with snake oil marketing.
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
Tony, you need to compare a $20 Belkin Gold (which I incidentally include with my USB converter) to the same length of Locus-Design Polestar playing music. The Polestar is not an expensive cable, and not even the best out there, but it absolutely kills the Belkin, and it's more flexible. It's not marketing BS. I don't know that they even advertise this.
Belkin cable is a lot like all of the other "by-the-foot" cables on big spools made in China. Belden 1694A 75-ohm coax is the same. I have this too. They meet the specs for these types of cables, but it does not take much design skill to make something that sounds a lot better. Metals, dielectrics, shielding, and geometries can all be improved.
Steve N.
There should be no audio signal going over a USB cable, only data. If there is an audio signal (i.e., a timed waveform that is correlated with the music), the design is broken, e.g., running in adaptive mode. If a USB cable affects the sound quality of a USB DAC, then the USB DAC is not well designed. It may be that the best available USB DACs are not up to snuff, but if so, the way to fix this is not a bunch of add-on band-aids; it is to do a better job of basic engineering. (But this is unlikely to happen because of the hocus-pocus snake-oil nature of the high-end audio market. I would look more to the pro-audio market, where the end users are technically knowledgeable, so there is less room for nonsense.)
It's the same situation as with power cables. If a power cable affects the sound of the component attached it's because of poor design on the part of that component. A power cable is not supposed to be in the signal path. The situation is completely different with interconnects or speaker cables, where these wires are definitely in the signal path and have direct effect on the audio signal as well as directly interacting with the components on either end of the cable that are processing the signal.
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
"If a USB cable effects the sound quality of a USB DAC then the USB DAC is not well designed."
This is what I thought as well when I designed in a multiple-reclocked async USB interface. I had really hoped that this would be the case, but I learned differently.
The reality is that these cables still matter, maybe not as much as when the interface was adaptive, but they still matter. I don't know if it's just impedance, shielding, or common-mode rejection, but they still matter. Every single USB converter and DAC using async or block mode still improves with a good USB cable.
I challenge you to design an async or adaptive interface that is completely immune to cable effects. Good luck.
All you have done is to indicate that you haven't been able to figure out what is going on. If changing the cable makes a real difference in the sound, why can't you track down what is going on?
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
Why doesn't somebody just fix it? Because nobody knows what the problem is. All the obvious primary effects have long been taken care of. Secondary effects have been hypothesized, tested, the ones causing problems identified and dealt with, and still it's not perfect. We are now in a situation where whatever is causing the issue is not obvious, and is almost certainly something difficult to measure.
So someone comes up with a wild idea as to what it might be and tries to see if it's possible to measure it with existing test equipment. They conclude that it can't be measured by what's currently out there. So you go to a test equipment manufacturer, and they say it will cost 20 million dollars to build such a piece of test equipment. The problem is we don't even know if the hypothesized effect is what is causing the issue. That's a lot of money just to find out whether it is or is not the problem.
So what happens is people try to use existing test equipment that was designed to measure something else and see if it's possible to coerce it into measuring what YOU want to measure. Sometimes this works and sometimes it doesn't.
What frequently winds up happening is that you give up on trying to measure whatever it is and instead try to figure out a way to mitigate it (assuming that it exists), then build that and listen: did something change? This process can take a long time and a lot of money, so you don't get to try out too many hypotheses per year, and finding the issue can take a long time.
As an aside, my day job is designing chips that handle HUGE data rates. I can't tell you how many terabits per second we are talking about, but it's a LOT of data. The cable that goes between the boards these chips are on costs $150,000, I kid you not. When you are working on the edge, cables just plain cost a lot.
But I know you are going to say that audio isn't going anywhere near as fast, so expensive cables are not necessary. BUT, as Steve mentioned, the levels of cable effects which are AUDIBLE are MUCH lower than the level which renders the link inoperable. These levels are in the same ballpark as what renders a link inoperable in the chips I work with. Why are such small levels audible? That's what we don't know yet. We do know that test equipment that works at these levels is extremely expensive, way out of reach of most audiophile companies. This makes going to the next level very slow.
It will happen, but it's going to take a lot of effort and wild goose chases to figure out.
John S.
Edits: 04/12/12
Thanks, John. I could not have said it better. At least I'm trying to resolve this. How many other manufacturers have a USB common-mode noise filter? Try zero. I'm the only one.
Steve N.
Right. The problems are elusive. And there's no real money in audio. And there are lots of scam artists, so that even if someone did happen to solve the problem there would be people out there "demonstrating" that the problem was still present, and this would diminish the already limited audiophile market. Another way of saying this is that the emperor has no clothes. Where is the little child?
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar