Audio Asylum Thread Printer
I looked for issues dealing with the USB 2.0 cable in general. I did some research into what was dealt with during the 48 kHz 16-bit implementation of the asynchronous USB DAC (the one I have is such), and then jumped to the latest 192 kHz 24-bit.

The older USB controller chips were mostly for USB 1.0, and many designs kept using these simpler controllers even when implementing 2.0 hardware with its 500 mA supply over the old 100 mA supply. That was when the 12 Mb/s (Full Speed) signaling rate was the norm over the USB.
48 kHz at 16 bits with 2 channels requires 3.072 Mb/s of the 12 Mb/s (Full Speed), which served well. Then it was also possible to go to twice that, 96 kHz at 16 bits, for 6.144 Mb/s of the 12 Mb/s, or 9.216 Mb/s at 24-bit widths. That was the ceiling, and if the cabling was not up to snuff to meet Full Speed specs, there was a chance of losing a bit here and there and skipping. Usually you lost it all rather than suffering a gradual loss of data.
To go to 192 kHz at 24 bits, a whole new controller needed to be made to go faster. Clock rates were pushed to 24 MHz or 48 MHz for the serial data transfer, though I am not sure that was ever really a standard. But it could also get there on most decent cables.
But then a new USB spec for "High Speed" was adopted for the cabling and the transceivers, with lower-voltage transitions from 0 to 1 and back, but at up to 480 Mb/s. Transitions changed from roughly -0.5 V to +2.8 V down to +/-0.4 V transmitted and +/-0.2 V received. That reduces noise output but cuts into noise immunity. A shielded twisted pair is now a requirement, I would say.
Now, the USB 2.0 cable is not built for speed; it was built for low noise emission, and a certain slowness is part of the twisted-pair idea. The FASTEST specified allowable rise and fall time at High Speed is 0.5 ns. Slower is okay!
The USB 2.0 twisted pair is a crude transmission line, with an approximate 90 ohm characteristic impedance (Zo). The spec calls for the transceivers at each end to terminate at 45 ohms per line (45 + 45 for the balanced pair) to present the proper 90 ohm differential-mode impedance. The maximum allowable insertion loss is about 6 dB, which follows from the transmit and receive level specs. The terminations give a decent 14 dB of return loss against reflections, plenty for digital safety.
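If you want to sanity-check that return-loss figure, the standard transmission-line arithmetic is only a few lines of Python. A minimal sketch; the example load impedances are made up for illustration, not spec values:

    import math

    def return_loss_db(z_load: float, z0: float = 90.0) -> float:
        """Return loss in dB for a load z_load terminating a line of impedance z0."""
        gamma = abs((z_load - z0) / (z_load + z0))  # reflection coefficient
        return float("inf") if gamma == 0 else -20.0 * math.log10(gamma)

    # A cable at the +15% edge of the USB 2.0 impedance tolerance:
    print(round(return_loss_db(90.0 * 1.15), 1))  # ~23.1 dB, reflections well down
    # A sloppy termination presenting 120 ohms differential:
    print(round(return_loss_db(120.0), 1))        # ~16.9 dB
    # The 14 dB figure corresponds to a load near 135 ohms (or 60 ohms):
    print(round(return_loss_db(135.0), 1))        # ~14.0 dB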
480 Mb/s signaling runs right up against that 500 ps rise time and would require lab-grade implementation to get there. But for 192 kHz high-rez audio, there is a good margin.
There are too many unknowns here to say a whole lot. What microcontroller? What susceptibility? What clock rate? What isolation? Etc.

But I can say that for ordinary 44.1 kHz 16-bit audio, it should not take heroic measures to be fairly immune to noise if all the marketing features and specs I hear about are actually employed.

It seems the higher sampling rates will be the ones more likely to be affected. So for CD-rate asynchronous USB DACs, cabling needs to be good quality, but not outrageous, in theory. USB started out to be cheap, not super performance. Now some expect super performance out of another junky, limited digital interface. If that's what you want, you ought to be looking at USB 3.0 or another high-performance interface, a whole new jump ahead, the next time you shop for a high-res USB DAC.
RS-232 was industrial-grade data transmission, with giant +/-12 V transitions for high noise immunity of message delivery. USB started out somewhat like that, and now we're down to +/-0.2 V? Well, again, the idea has been altered to fit quickly moving demands for more data while attempting backward compatibility at the same time.
The BW required for 500 ps rise-time edges is about 0.35/(500E-12) to first order, or about 700 MHz. The wavelength at that frequency is thus about 30 cm in the twisted pair for the sharpest edges. For best results in a short cable, anything under 1/8 of that wavelength avoids most "microwave delay" effects, or about 3.7 cm. It would have to be quite short for that. No matter; if specs are all met from computer to DAC, 5 m is designed to be okay if using nothing else (no hubs or extensions).
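Here is that arithmetic in Python. Note the 0.7 velocity factor is an assumed typical value for a twisted pair, not a spec number:

    C = 3.0e8   # speed of light, m/s
    VF = 0.7    # assumed velocity factor of the pair

    t_rise = 500e-12          # 0.5 ns fastest allowed High Speed edge
    bw = 0.35 / t_rise        # first-order knee bandwidth
    wavelength = VF * C / bw  # wavelength in the cable at that frequency

    print(bw / 1e6, "MHz")                        # 700.0 MHz
    print(wavelength * 100, "cm")                 # 30.0 cm in the cable
    print(wavelength / 8 * 100, "cm (1/8 rule)")  # 3.75 cm "electrically short" limit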
But in short, there is no "engineering cable analysis equation" you can run to determine any real differences between them. Common sense only says that if there is a real effect, expect it to be worse at higher bit rates.
What do the differences in high-end USB cables actually do? Better match? Better noise immunity? Less noisy? Less inductive? More capacitive? A more precise 90 ohm transmission line? None of the above, it just works? I don't feel I have enough high-res to bother yet. I could be wrong, but not likely $1400 worth of wrong for a library that is mostly 256 kbps AAC. :-)
-Kurt
PS Feel free to correct any errors you see.
Edits: 10/03/11

Follow Ups:
"What do the differences in high-end USB cables make happen? Better match? Better noise immunity? Less noisy? Less inductive? More capacitive? More precise 90 ohms transmission. None of the above, it just works?"
None of the above when using async USB interfaces, IMO. Async is technically immune to incoming jitter. But there is still something going on there.
It's simply the common-mode noise from the ground, +5V and data lines between the computer and the device. It can be worse if there is a ground loop adding noise, but even a ground loop is not required.
Technically, you should not even need a ground or +5 on the USB cable. The differential signals should do it, provided that the destination device can handle the common-mode noise and reject it. Herein lies the rub. This is not an ideal world and the receiver does not reject all of the common-mode noise. The noise on the ground is also propagated through the device.
My theory is that different cables sound different because they have different size ground wires in them with different inductance and resistance. Different computers sound different because the USB interface chipsets and power systems for those interfaces vary from one computer to the next. They inject differing amounts of common-mode noise on ground and the signals.
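To put rough numbers on that imperfect rejection, here is a sketch. The CMRR values and the 100 mV of common-mode hash are assumed illustrations, not measurements of any actual USB receiver:

    def leaked_noise_volts(v_common_mode: float, cmrr_db: float) -> float:
        """Differential noise that survives a receiver with finite CMRR."""
        return v_common_mode / (10.0 ** (cmrr_db / 20.0))

    v_cm = 0.100  # assume 100 mV of common-mode hash from the computer
    for cmrr_db in (40, 60, 80):
        uv = leaked_noise_volts(v_cm, cmrr_db) * 1e6
        print(f"{cmrr_db} dB CMRR -> {uv:.0f} uV leaks into the signal path")
    # 40 dB -> 1000 uV, 60 dB -> 100 uV, 80 dB -> 10 uV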
Steve N.
Even without the 5V, the interface is still sensitive to cabling. God knows what may or may not happen if the USB data cable direction is reversed.
"It's simply the common-mode noise from the ground, +5V and datalines between the computer and the device."
If this is the source of the noise it can be eliminated by appropriate circuitry in the DAC. While the cost will be far from zero, it is likely to be much less than the cost of the more obscenely priced boutique cables and will yield stable, repeatable results. Measuring the noise rejection through the DAC (all forms of noise superimposed on the ideal waveform) is not that difficult and gaining this capability at a high level of resolution is a prerequisite to any serious attempt to produce reliable, stable DACs.
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
Exactly. I already have a solution that seems to be quite good. Using it at RMAF this year.
'If this is the source of the noise it can be eliminated by appropriate circuitry in the DAC.'
This is supposed to already be the case with ground-isolated USB DACs powered by batteries or highly regulated power supplies.

Yet such DACs are still affected by USB cabling.
Even though they are all async, every DAC's USB interface is a different design. Some don't sound different with different cables. Each interface must be studied as a special case, I think.
Which would mean that either:
1. the jitter rejection circuitry is inadequate
2. the ground isolation circuitry is inadequate
or
3. there are additional layers to the onion.
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
The whole thing isn't a better idea than what is already on offer, as I have said a number of times!
Edits: 10/04/11
The noise from the GND and 5V lines is eliminated in my DAC. But cables continue to sound different. Arguments could be made that the isolation isn't totally effective, but then again, I don't really know.

Some cables are truly special in terms of sound quality and aren't all that expensive.
Edits: 10/04/11
Your DAC is no better than other AES3 DACs.
"But cables continue to sound different."
Probably more layers of the onion to be unpeeled. Or more nested dolls if you prefer. :-)
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
"It's simply the common-mode noise from the ground, +5V and datalines between the computer and the device."
All good points. My listening with Acoustic Revive SP USB cable with dual conduit has been very interesting.
I don't get your comments about cabling not mattering at the lower audio sample rates. USB uses a packet protocol, so if there is less audio data the transmission burst rate isn't reduced, rather the gaps between bursts are increased in size.
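A sketch of what that pacing looks like at CD rate, assuming plain Full Speed 1 ms framing (real asynchronous endpoints negotiate the pacing via feedback, so this only shows the arithmetic):

    def samples_per_frame(rate_hz: int, n_frames: int):
        """Integer samples sent in each successive 1 ms frame."""
        sent = 0
        for n in range(1, n_frames + 1):
            due = rate_hz * n // 1000  # samples owed after n frames
            yield due - sent
            sent = due

    print(list(samples_per_frame(44100, 10)))
    # [44, 44, 44, 44, 44, 44, 44, 44, 44, 45] -> 441 samples per 10 ms
    print(list(samples_per_frame(48000, 5)))
    # [48, 48, 48, 48, 48] -> exact fit, constant payload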
It may be more complicated than that, because there may be auto-negotiation of the clock rate between devices. I will defer to Gordon if I am incorrect on this. Gordon has undoubtedly studied the specification and is familiar with how various implementations have abused the specification, something that I have not done, as I am not, nor have I ever been a member of the Communist Party. Oops, I meant, a member of the USB Lover's Society. :-)
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
"I don't get your comments about cabling not mattering at the lower audio sample rates. USB uses a packet protocol, so if there is less audio data the transmission burst rate isn't reduced, rather the gaps between bursts are increased in size."
Well, that is another issue, isn't it? I don't understand the data transfer, the source clocking, or the DAC firmware and drivers well enough, however it is done.
I guess in order to think outside the box, we need to know about everything INSIDE the box first. That isn't going to happen.
Kurt,
Really, going through the specs would take months, as you need to look at more than what you did, and in the end none of it may have anything to do with cable differences.
Your math is a little wrong...
16-bit stereo data requires 4-byte samples (2 bytes left, 2 bytes right). At 48 kHz with the 1 ms frame rate, that is 192 bytes per frame, i.e. 192 KB/s, or 1.536 Mb/s.
There are a couple of things to know. First, USB 2.0 can be High Speed or Full Speed. 100 mA was USB 1.0, 1.1 went to 500 mA, and most Apple ports can supply 800 mA.
In Full Speed the largest packet is 1023 bytes; in High Speed it is 1024. Therefore, if we back-math this at Full Speed with 24-bit samples, i.e. 6 bytes per stereo sample, we get:

1023 / 6 * 1000 = 170.5 kHz maximum sample rate for Full Speed.
With High Speed and microframes you can go basically 8x faster, and that is what gives us 24(32)/192 capabilities.
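Gordon's back-math, sketched in Python. Simplified: it ignores the 3x high-bandwidth microframe option and all protocol overhead:

    def max_sample_rate_khz(payload_bytes: int, bytes_per_stereo_sample: int,
                            frames_per_second: int) -> float:
        """Ceiling sample rate from one max-size isochronous packet per frame."""
        return payload_bytes / bytes_per_stereo_sample * frames_per_second / 1000.0

    # Full Speed: 1023-byte packets, 1000 frames/s, 24-bit stereo (6 bytes):
    print(round(max_sample_rate_khz(1023, 6, 1000), 1))  # 170.5 kHz
    # High Speed: 1024-byte packets, 8000 microframes/s:
    print(round(max_sample_rate_khz(1024, 6, 8000), 1))  # ~1365.3 kHz, ample for 192/24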
Again, as far as cables are concerned, I have not seen anything that suggests why cables would make a difference in sound. I just don't think we have enough out-of-the-box thoughts on this.
Thanks
Gordon
J. Gordon Rankin
'Again as far as cables concerned I have not seen anything that suggests why cables make a difference in sound. I just don't think we have enough out of the box thoughts on this.'
Fred,
Nothing like SPDIF! There you have two problems. You have to account for the data, which is pretty easy, but you also have to recover the clock, which is not. The effect the cable has with SPDIF is totally different than with USB. The SPDIF receiver has to be able to deal with the cable differences, and at different rates. Jitter has been one of the various problems that is easy to pin on the cable. Well, that, and the specification for SPDIF is really broad and really not followed at all. I have seen really expensive transports where the SPDIF is picked off the drive controller board with a single wire, cap-coupled at the RCA or BNC connector. Now how the hell does that make a 75 ohm connection? I have seen some SPDIF transmitters at 4-5x the specified voltage. Claiming it sounded better... yeah, right.
USB only has to deal with data!
Thanks
Gordon
J. Gordon Rankin
What you say is NOT true for good/high-end stuff with a proper SPDIF transmitter and receiver. At least they work 100% of the time.
Who wants to spend thousands on USB cables like Mercman? Plus the time it takes to 'audition' different cables, different players, different OSes, different hard disks, and different RAM.
I'd rather listen to music on a stable, long term basis; then decide what to 'improve'.
Fred,
OK, so I look around the lab and I can see 5 SPDIF cables, all of them easily more expensive than any USB cable I use.

The problem is simple... well, complex, with SPDIF:

1) Jitter is a real issue because it is built into the interface. Look, the receiver always has to train on the incoming signal, so there is always something to fix here.

2) Impedance: mismatches cause jitter. The networks that drive the levels, everything, causes jitter, because the receiver is so sensitive.
3) Timing: Back in the early 80's, when Crystal was first doing SPDIF receivers, I was working on the first intelligent Ethernet adapter and had Crystal in because they did this stuff too. The guy who designed the CS8412 was there, and over lunch we talked about the chip. If you can reduce the variation in timing (meaning the change in frequency over a very long time), the PLL in the Crystal receivers works really well. Otherwise the damn thing will have thousands of picoseconds of jitter.
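To put "thousands of picoseconds" in perspective, the textbook jitter-limited SNR bound for sampling a full-scale sine is easy to compute. The jitter values below are illustrative, not CS8412 measurements:

    import math

    def jitter_limited_snr_db(f_signal_hz: float, t_jitter_rms_s: float) -> float:
        """SNR ceiling from sampling-clock jitter: -20*log10(2*pi*f*tj)."""
        return -20.0 * math.log10(2.0 * math.pi * f_signal_hz * t_jitter_rms_s)

    for tj in (100e-12, 1e-9, 5e-9):  # 100 ps, 1 ns, 5 ns RMS jitter
        print(f"{tj * 1e12:.0f} ps -> {jitter_limited_snr_db(20e3, tj):.1f} dB at 20 kHz")
    # 100 ps -> ~98 dB; 1000 ps -> ~78 dB; 5000 ps -> ~64 dB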
OK, so here's the point. We test these DACs over SPDIF with an AP, a Prism, or whatever. Heck, these give best-case results. In the real world that does not happen, and it all suffers.

At least with USB the damn thing delivers the data intact. We know the sample rate, the bit size, everything. Whatever we do with it after that is our own fault.
Thanks
Gordon
J. Gordon Rankin
I'm sorry you look at it like this, Fred. I'm having a good time and definitely feel I've improved the sound of my system. It's been good fun this year.

And don't forget my Crimson HS DAC that now does 192/24. I know this is old stuff for you, but not for me.
Edits: 10/04/11
If you have 5 variables, to go through the number of permutations you need to vary the setup 120 times (5 factorial). If you have 10, then 3,628,800 times!

Therefore, short-term variation of a large number of variables is very difficult to control. In the end, one doesn't know which combination is the 'best'.
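(For the record, the factorials check out:)

    import math
    print(math.factorial(5))   # 120
    print(math.factorial(10))  # 3628800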
Edits: 10/04/11
Fred,
Sorry I don't even know how to answer this??? I don't know where you are going.
Look, if SPDIF had been correctly implemented years ago, it would have been bi-directional and would by now have come with an asynchronous method.
Thanks
Gordon
J. Gordon Rankin
All I was doing was replacing an old computer with a newer, faster one. The components were upgraded to the latest technology for a Mac. In the end, it's still the same principle.

The computer was given a SATA 3 SSD, more memory, and a Thunderbolt HD for the external drive. The Thunderbolt drive matched the specs of the SATA 3 OS SSD. This is what I wanted in the beginning, but the components weren't available. I would have preferred getting everything at the same time instead of one component at a time, since they were just parts of the same computer. But I appreciate what you are saying.
Edits: 10/05/11
That changes nothing from what I said. You are changing too many variables at the same time (or in a short time) to make any kind of lasting or generalisable judgements, objective or subjective.
These are not controlled experiments, but simply observations. The same thing you do here all the time.
I must be nuts to be responding to this.
"I must be nuts to be responding to this."
Yup! :-)
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
I must be nuts to be responding to this.
You are. That's why you've been put into an Audio Asylum.
Get used to it - you're going to be here for a long, long time and the demons will always be haunting you.
But at least they will use a different argument every day. Unless, of course, it's a wrong one. In that case, you'll hear it echoing round the halls every day.
For years.
Why have you been telling everyone that the Phoenix active interface for a hard disk sounds better than a fast 3 Gb/s or 6 Gb/s SSD? Why are you also concluding that more and more RAM is the way to go? The problem is that you generalise these with respect to a fixed set of hardware and a constantly changing set of software. Hence my remarks about permutations and combinations of changes.

I make comments based on my own experience, but I do not try to generalise what I say. That is the difference.
I do not try to generalise what I say.
So who's that fellow in Ward 7 who bangs on about the Sometimes Universal Serial Bus day in and day out? And has done so for years? And brags about being the only one to do so? And pours scorn on those that don't listen to him? And says that anyone who gets it to work well has bad kit or bad teeth - or both?
Anyway, can't stop. Got to write to Josephine with the great news about Austerlitz . . .
> Got to write to Josephine with the great news about Austerlitz . . .

That fellow in Ward 7 is Frederick the Great.
Or maybe Frederick the Sour Grape.
Bill
Edits: 10/06/11
"Why have you been telling everone that the Phoenix active interface for a hard disk sounds better than a fast 3Gb/s or 6 Gb/s SSD?"
I have????
"I make comments based on my own experience, but I do not try to generalise what I say. This is the difference. "
You don't generalize? Read your posts Fred.
"What you say is NOT true for good/high-end stuff with a proper SPDIF transmitter and receiver. At least they work 100% of the time."
Only a couple of days ago, you were reporting that, unlike MS generic drivers, good manufacturer-provided drivers for USB audio devices were robust and reliable (I haven't time to find the post right now). I agreed with you but took it to mean that inter alia they worked 100 per cent of the time. Was I wrong?
Speaking personally, I'm finding the recent discussion of USB less and less pertinent the more it goes on. One sign that the arguments are running aground (it happens regularly here) is when they segue into questioning manufacturers' ethics.
D
That can be helpful. And yes, we all are still stumped about some things. Don't we just hate it when that happens.
I know: Electricity is neither a wave nor a particle!
Well, I am no Einstein for sure.
Nice post Kurt.
I would like to comment that RS-232 is (sadly) anything but an "industrial grade" interface. It is a low-speed, unisolated, unbalanced interface designed for local connection to modems. That's it. Everything else it was used for was a bastardization and fraught with problems.

But it was the ONLY thing resembling a "desktop" network that had any support for many years, and having worked with it for quite a few of them I can attest that it's a HW and SW PITA. The USB scheme is different in many respects, but the main thing it has going for it is that it was actually created to be a desktop network.

Now I'm retired, so I am just a "user", and I'm delighted with how well the scheme has been working the last decade. What a huge improvement for both users and manufacturers. Our tech-support folks used to spend about 2/3 of their time trying to resolve customers' 232-related problems...
Rick
Thanks for that historical perspective. I remember RS-232 first from operating a dumb terminal back in my college (mainframe) days, from 1979 to too much later. As a more advanced student, I got to use a local RS-232-connected dot matrix printer, printing on a continuous roll of recycled elephant-dung paper. That's right, kids! And we liked it, too!

I don't want to admit it, but I even punched out some cards to program an IBM 360 for a quarter on "FORTRAN Programming". Haha! The later courses had students using that terminal time. To me, it's industrial grade in the sense that it was for commercial industry use then, not for home use. The big voltage swings that gave big digital noise immunity also meant slower baud rates for long runs into the computer room for the HP 3000 mainframe.
Then I guess they overextended this simple system into a mess, as you said.
The good old days! When I could program things and read others' work, not someone else's badly organized C# sh!t or something like that now. It's all a nightmare! Let's go back to 64K memory!!!! I'm going to have another nervous breakdown over this GD code! I am having flashbacks again! Heeeeeeeellllllllpppp!
And in Wikipedia, it has the delightful statement:
"C# is intended to be a simple, modern, general-purpose, object-oriented programming language.[7] Its development team is led by Anders Hejlsberg."
Is Anders trying to kill us all? Huh? WTF does this mean in that same article that is so simple and modern?:
"Furthermore, C# has added several major features to accommodate functional-style programming, culminating in their LINQ extensions released with C# 3.0 and its supporting framework of lambda expressions, extension methods, and anonymous classes.[22] These features enable C# programmers to use functional programming techniques, such as closures, when it is advantageous to their application. The LINQ extensions and the functional imports help developers reduce the amount of "boilerplate" code that is included in common tasks like querying a database, parsing an xml file, or searching through a data structure, shifting the emphasis onto the actual program logic to help improve readability and maintainability.[23]"
And THAT is SIMPLE????? F.U. Anders!!!!
What f*cking Dr. Frankenstein lunatics software people have become. :-)
OTOH, f*ck that smiley!
-Kurt
Edits: 10/03/11
> The big voltage swings that had big digital noise immunity also meant
> slower BAUD rates for long runs into the computer room
> for the HP3000 mainframe.
RS-232 was designed a long time ago (think ASR-33 teletypes). Using 12V was necessary for the capabilities of the electronics available for such relatively low cost devices. Once there were terminal devices using the RS-232 interface widely available, it became a practical way to connect terminals to computers. That marketplace momentum kept RS-232 in use long after the design decisions had become archaic.
> "C# is intended to be a simple, modern, general-purpose,
> object-oriented programming language.[7] Its development team
> is led by Anders Hejlsberg."
I don't know your reasons for a rant about C#. Anders was the guy who came up with Turbo Pascal for Borland. Later he developed Delphi, which meshed Borland's quick-compiling Pascal compiler with a framework for GUI development and a very slick IDE for writing and testing GUI applications for Windows. Writing Windows applications with Delphi was heaven compared to working in C with Microsoft's Visual Studio or with Visual Basic.
Microsoft wooed Anders away, and he applied the same ideas to develop C#. Other elements were added to modernize the development environment:

- memory management,
- the common language runtime,
- interpreted rather than hard-compiled code,
- the .NET framework, which provides a much cleaner API than the bare Windows API, and
- integration with Visual Studio.
Later versions of C# got bigger as competition with Java required providing equivalents for new features in Java or proposed for Java.
From my point of view as a retired programmer, Anders was the most brilliant developer of programming tools for Windows client side development from the early 80s through the early 2000s. I miss the availability of cheap, slick tools like Delphi for developing Windows GUI apps now.
Bill
You can program in that language well; that's good. I cannot. There's TMI to cram into my brain at once that must all be focused on. I can't focus that well because my brain's short attention span and its poor short-term memory recall both conspire to derail me on every attempt.

BASIC and FORTRAN are about all I might be able to do. I could never gather enough information in my head to really write in .NUT. Therefore, I am now obsolete. But it's "SIMPLE!"
You'd have a job finding 90R screened twisted-pair cable; I suspect most are 100 to 110R instead. The only one I could find is an 'aviation' grade cable sold in 100 m lengths.
Your review raises another point I have been making: 'Sometimes Universal Serial Bus'. When audio from front ports on a cable sounds different from audio from rear ports soldered to the board (a point also made by Gordon), what chance do we have of a suitably high-quality interface for hi-res audio?
To improve matters, the first thing should be for USB audio vendors to admit these issues; then we stand a chance of curing them.

I first raised the subject two or more years ago. Vendors tried their best to silence the discussion. Now it is being admitted, because they want to continue to sell more.
"To improve matters, the first thing for usb audio vendors to admit should be these issues; then we stand a chance of curing it."Now that you thought about it for me,
I don't see 14 dB of return loss at the terminal ports out to 480 MHz when I look at these POS USB connectors on my PC. By standard concepts of electrical matching, I don't think any expensive connector and cable will overcome that part.
For $1400, I want to mod the PC motherboard and get it up to spec first. Do you think the DAC end's connector is better for a quality DAC? Maybe not.
Come to think of it again, it is all still POSSIBLE to meet the specs if they use good hardware and do it right, even with solder joints. The wavelength is NOT that small yet.
Edits: 10/03/11