A brief look into USB 2.0 cable specs and implementation

I went looking for issues with the USB 2.0 cable in general. I did some research into how the 48 kHz / 16-bit asynchronous USB DACs were implemented (the one I have is such), and then jumped ahead to the latest 192 kHz / 24-bit designs.

The older USB controller chips were mostly built for USB 1.x, and many designs kept using these simpler controllers even when implementing 2.0 hardware with its 500 mA supply in place of the old 100 mA. That was when the 12 Mbit/s (Full Speed) rate was the norm over the USB.

48 kHz at 16 bits with 2 channels is 1.536 Mbit/s of raw payload out of the 12 Mbit/s Full Speed budget, which served well. It was also possible to go to twice that at 96 kHz / 16-bit for 3.072 Mbit/s of the 12 Mbit/s, or 4.608 Mbit/s at 24-bit width. That was about the limit, and if the cabling was not up to snuff to meet Full Speed specs, there was a chance of losing a bit here and there and skipping. Usually you lost it all rather than suffering a gradual loss of data.

To go to 192 kHz at 24 bits, a whole new, faster controller needed to be made. Some designs pushed clock rates to 24 MHz or 48 MHz for the serial data transfer, though I am not sure that was ever really a standard. But it could also get there on most decent cables. At 192 kHz / 24-bit stereo the raw payload is 9.216 Mbit/s, which leaves little headroom in a 12 Mbit/s Full Speed link once protocol overhead is added.
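
For anyone who wants to check the arithmetic, here is a rough Python sketch of the raw stereo PCM payload rates against the 12 Mbit/s Full Speed and 480 Mbit/s High Speed budgets. These are payload-only numbers; USB framing and protocol overhead come on top.

    # Raw stereo PCM payload rates vs. the USB signaling budgets.
    # Payload only -- isochronous framing, bit stuffing and other protocol
    # overhead come on top of these numbers.

    FULL_SPEED = 12_000_000    # bit/s, USB Full Speed
    HIGH_SPEED = 480_000_000   # bit/s, USB 2.0 High Speed

    def pcm_rate(sample_rate_hz, bits_per_sample, channels=2):
        """Raw PCM payload in bits per second."""
        return sample_rate_hz * bits_per_sample * channels

    for fs, bits in [(48_000, 16), (96_000, 16), (96_000, 24), (192_000, 24)]:
        rate = pcm_rate(fs, bits)
        print(f"{fs // 1000} kHz / {bits}-bit stereo: {rate / 1e6:.3f} Mbit/s  "
              f"({100 * rate / FULL_SPEED:.0f}% of Full Speed, "
              f"{100 * rate / HIGH_SPEED:.1f}% of High Speed)")

The 192 kHz / 24-bit case fills roughly three quarters of the Full Speed budget before any overhead, which is why it really wants the High Speed link.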

But then a new USB spec for "High Speed" was adopted for the cabling and the transceivers, with lower-voltage transitions from 0 to 1 and back, but at up to 480 Mbit/s. The transitions changed from about -0.5 V to +2.8 V down to roughly +/-0.4 V transmitted and +/-0.2 V received. That reduces noise output but cuts into noise immunity. A shielded twisted pair is now a requirement, I would say.
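
Just to put a number on the loss of swing, here is a back-of-the-envelope comparison using only the figures above; receiver thresholds and common-mode rejection are ignored, so this is amplitude only, not real noise margin.

    import math

    # Rough comparison of drive amplitude, using the figures quoted above:
    # Full Speed swinging roughly 0 to 2.8 V, High Speed driving a nominal
    # +/-0.4 V differentially. This only compares raw swing; real noise
    # margin also depends on receiver thresholds and on how much noise the
    # differential receiver rejects as common mode.

    fs_swing = 2.8    # V, approximate Full Speed swing (figure from the post)
    hs_swing = 0.8    # V, peak-to-peak of the +/-0.4 V High Speed drive

    print(f"High Speed drive is about "
          f"{20 * math.log10(fs_swing / hs_swing):.0f} dB smaller than Full Speed")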

Now, the USB 2.0 cable is not built for speed; it was built for low noise emission, and a degree of slowness is part of the twisted-pair idea. The FASTEST allowable rise and fall times specified at High Speed are 0.5 ns. Slower is okay!

The USB 2.0 twisted pair is a crude transmission line with an approximate 90 ohm characteristic impedance (Zo). The spec has the transceivers at each end terminate with 45 ohms per leg, so the balanced pair properly sees the 90 ohm differential-mode impedance. The maximum allowable insertion loss is about 6 dB, which follows from the transmit and receive level specs. The terminations keep the energy reflected back down by a decent 14 dB or so, plenty for digital safety.
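
To put some numbers on the reflections, here is a little sketch of the classic mismatch formula; the example impedances are my own illustrations, not values pulled from the USB spec.

    import math

    # How termination mismatch turns into reflected energy on the 90 ohm pair:
    # Gamma = (Zt - Z0) / (Zt + Z0), return loss = -20*log10(|Gamma|).
    # The example impedances are illustrations, not values from the USB spec.

    Z0 = 90.0   # nominal differential impedance of the USB 2.0 data pair

    def reflection(z_term, z_line=Z0):
        gamma = (z_term - z_line) / (z_term + z_line)
        return abs(gamma), -20.0 * math.log10(abs(gamma))

    for z in (76.5, 103.5, 60.0):   # roughly -15%, +15%, and a gross mismatch
        gamma, rl = reflection(z)
        print(f"{z:5.1f} ohm termination on a {Z0:.0f} ohm line: "
              f"|Gamma| = {gamma:.2f}, return loss about {rl:.0f} dB")

On those numbers, a return loss of about 14 dB corresponds to something like a 60 ohm load on a 90 ohm line, so the 14 dB figure looks like a conservative worst case; terminations anywhere near tolerance reflect considerably less.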

Running at the full 480 Mbit/s gets uncomfortably close to that 500 ps rise-time limit and would require lab-grade implementation to exploit. But for 192 kHz high-rez audio, there is a good margin.
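
Here is the rough arithmetic behind that margin statement; it only looks at the bit period versus the edge rate, nothing like a real eye-diagram budget.

    # How much of a High Speed bit period the fastest allowed edge takes up.
    # Rough arithmetic only; real eye-diagram budgets are more involved.

    bit_rate = 480e6        # bit/s, USB 2.0 High Speed
    rise_time = 500e-12     # s, fastest allowed High Speed rise/fall time

    unit_interval = 1.0 / bit_rate
    print(f"Bit period at 480 Mbit/s: {unit_interval * 1e12:.0f} ps")
    print(f"A 500 ps edge is {100 * rise_time / unit_interval:.0f}% of that period")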

There are too many unknowns here to say a whole lot. What microcontroller? What susceptibility? What clock rate? What isolation? Etc...

But I can say that for ordinary 44.1 kHz / 16-bit audio, it should not take heroic measures to be fairly immune to noise if all the marketing features and specs I hear about are actually employed.

It seems that the higher sampling rates are the ones more likely to be affected. So for CD-rate asynch USB DACs, cabling needs to be good quality, but not outrageous, in theory. USB started out to be cheap, not super performance. Now some expect super performance out of another junky, limited digital interface. If that is what you want, you ought to be looking at USB 3.0 or another high-performance interface, a whole new jump ahead, the next time you shop for a high-res USB DAC.

RS-232 was industrial-grade data transmission with giant +/-12 V transitions for high noise immunity of message delivery. USB started out somewhat like that, and now we are down to +/-0.2 V? Well, again, the idea has been altered to fit the quickly moving demand for more data while attempting backward compatibility at the same time.

The bandwidth required for 500 ps rise-time edges is about 0.35/(500E-12) to first order, or about 700 MHz. The wavelength at that frequency is thus about 30 cm in the twisted pair for the sharpest edges. For best results in a short cable, anything under 1/8 of that wavelength avoids most "microwave delay" effects, or about 3.7 cm. The cable has to be quite short for that. No matter; if the specs are all met from computer to DAC, 5 m is designed to be okay if nothing else is used in between (no hubs and extensions).
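
The same numbers as a quick script, with the velocity factor called out explicitly as an assumption:

    # First-order numbers behind the paragraph above. The 0.7 velocity factor
    # is an assumed, typical value for an insulated twisted pair (it is what
    # reproduces the ~30 cm figure); it is not something out of the USB spec.

    C = 3.0e8             # m/s, speed of light in vacuum
    rise_time = 500e-12   # s, fastest allowed High Speed edge
    vf = 0.7              # assumed velocity factor of the pair

    bandwidth = 0.35 / rise_time        # first-order knee frequency of the edge
    wavelength = vf * C / bandwidth     # wavelength of that frequency in the cable
    short_limit = wavelength / 8        # rule-of-thumb "electrically short" length

    print(f"Bandwidth of a 500 ps edge: {bandwidth / 1e6:.0f} MHz")
    print(f"Wavelength in the cable:    {wavelength * 100:.0f} cm")
    print(f"1/8 of that wavelength:     {short_limit * 100:.1f} cm")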

But in short, there is no "engineering cable analysis equation" you can run to determine any real differences between them. Common sense only says that if there is a real effect, expect it to be worse at higher bit rates.

What do the differences in high-end USB cables actually accomplish? A better match? Better noise immunity? Less radiated noise? Less inductance? More capacitance? A more precise 90 ohm transmission line? None of the above, it just works? I don't have high-res material to bother with yet, I feel. I could be wrong, but not likely $1400 worth of wrong to me for most of my software at 256 kbps AAC. :-)

-Kurt

PS Feel free to correct any errors you see.
