In Reply to: RE: usb to spdif with BNC posted by thom70 on February 14, 2014 at 05:50:34
Some manufacturers are quite happy to fit BNCs at no extra cost; W4S being one of them. Their mini converter actually comes with one.
Alternatively, it is quite easy to fit a BNC 75R yourself.
It seems that you, like me, can hear the difference between a 200R phono and a proper 75R BNC.
It is very easy to see the change in the eye pattern when the correct jack is fitted, and it is nonsense to say that there is no difference in sound and no need for impedance matching.
Follow Ups:
Hi,
> It seems that you, like me, can hear the difference
> between a 200R phono and a proper 75R BNC.
Funny thing is, an RCA/phono connector is actually LOWER than 75R, not 200R by a long shot. Depending on the precise design it is closer to 50 Ohm constant impedance; BTW, the really thin-shelled sockets may be even higher.
And I find it amusing that you can hear a 30% mismatch (50R to 75R) over at most 20mm, but you cannot hear the >100% mismatch over many more millimetres from the cables (referring to our old friend the Musiland Monitor 01 USD).
I will not completely exclude some audibility for the connectors mismatch (though it is not easy to get it to show up on a TDR using realistic rise time, comparing the same cable/device terminated with BNC/RCA).
Yet in the context of the design approaches I commonly see in all sorts of SPDIF outputs, the RCA connector vs. BNC is really the LAST thing to worry about. Usually even the output impedance itself is nothing like 75 Ohm at any frequency.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
Go to the website you visit in another name and ask Jocko what phono socket impedances are.
It is easy to see on a proper scope setup the difference between the traces with 50R termination and 75R termination, but then I understand that you use a 150 MHz digital HP scope for your work.
As for voicing, it is also easy to hear the differences between the same cable terminated in phono and BNC75R plugs.
Fred,
> Go to the website you visit in another name and ask Jocko what phono
> socket impedances are.
They are not 200 Ohm... Ask [subdued coughing - as I know who Jocko really is] Jocko Homo.
> It is easy to see on a proper scope setup the difference between the
> traces with 50R termination and 75R termination,
This we both agree on. But we are not talking termination, but connectors. Let's not mix apples and oranges.
> but then I understand that you use a 150 MHz digital HP scope for
> your work.
I also have a Tek analogue 'scope that goes higher.
BUT, why would you suggest that 150MHz is insufficient? What is the rise time involved? What is the frequency? SPDIF does not operate at 150MHz or higher.
We may disagree on audibility, which is strictly subjective.
But if you really maintain that a true 75R source with a rise time similar to an SPDIF source, driving the same cable with a 75R termination at the far end of the 75R cable (say 1m?), can show the difference on a decent TDR, then show the traces taken at the far end and point out where the glitch from the RCA connector is and just how large its magnitude is next to a 75R BNC.
It is easy. You show real data and I will readily admit that my tests were flawed. You may talk the talk; why don't you demonstrate that you also walk the walk?
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
"BUT, why would you suggest a 150MHz is insufficient? What is the rise time involved? What is the frequency? SPDIF does not operate at 150MHz or higher."I think you must have missed that day of class in EE 101. This has nothing to do with the frequency of S/PDIF signals. It has everything to do with the edge-rates. Even rise-times of 3nsec have spectrum close to .5GHz. In order to measure these accurately, you need probes and scope combination with B/W of 1GHz at least. I use 6GHz equipment myself. My S/PDIF edge-rates are more like 300psec.
Hi,
> I think you must have missed that day of class in EE 101.
Nope, check my attendance record.
> This has nothing to do with the frequency of S/PDIF signals.
> It has everything to do with the edge-rates.
Yup, absolutely.
> Even rise-times of 3nsec have spectrum close to 0.5GHz.
And given that even at 192KHz the rise times involved are usually not much faster than 10nS, 0.15GHz is more than sufficient. Plus, if we really miss a few of the topmost harmonics, or rather if they are a bit attenuated (a 150MHz 'scope of course does not cut off abruptly at 150MHz, it merely attenuates above it), what is the result as far as practical SPDIF transmitter and receiver circuits are concerned?
Feck all, as they say in Ireland.
> In order to measure these accurately, you need probes and scope
> combination with B/W of 1GHz at least.
Depends on the precise aims you have. My aim is to evaluate practical, real-world performance only. For that, the scope I have suffices (I have since gotten faster ones BTW; you cannot work on USB with something that slow).
> My S/PDIF edge-rates are more like 300psec.
Impressive.
Not many logic or analogue circuits out there come even close to a 0.3nS rise time. Good work. Does that include 1m of SPDIF cable and is it measured at the far end (that is how I do it)?
What a shame that it all still goes into a SPDIF receiver with a 10nS or much longer trigger window, so it seems rather wasted in practice.
Plus, with such a rise-time you are of course maximising the problems from any impedance discontinuity in the connection (even from a 75 Ohm BNC) and you potentially create a lot of problems for getting FCC or other EMC approvals.
I usually look at the recovered jitter in the audio signal after an "industry standard" CS or AKM receiver (as they have such poor jitter rejection) and a DAC with sufficient dynamic range, to see the impact of tuning things ahead of the DAC, plus I listen to the results as well. I cannot say I have found any correlation with rise-time once certain levels are met. Something like JET/JISCO, on the other hand...
I think you can easily step down to a slower edge rate without causing common SPDIF receivers any trouble; of course, you lose the bragging rights.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
"Not much logic or analogue circuits out there that come even close to 0.3nS rise time. Good work. Does that include 1m SPDIF Cable and is measured at the far end"
Of course, 4 feet actually. It is terminated inside the scope with 75 ohms. The scope's 75 ohm internal terminator cost me $1200.00 alone. I don't even use a probe. This is the ONLY way to accurately measure S/PDIF, IMO. Otherwise you are missing a lot. With other measurement setups and B/W limitations, you can only make frequency measurements or qualitative observations, and those are limited. You cannot measure rise-time, fall-time, overshoot, undershoot, edge monotonicity etc...
"Plus, with such a rise-time you are of course maximizing the problems from any impedance discontinuity in the connection (even from a 75 Ohm BNC) and you potentially create a lot of problems for getting FCC or other EMC approvals."
It's the price you must pay if you want to achieve the lowest jitter.
"I think you can easily step down to a slower edge rate without causing common SPDIF receivers any trouble; of course, you lose the bragging rights."
It's not about causing "trouble", it's about achieving the lowest jitter from the receiver. And the receiver you use matters a LOT. I don't believe I brag about my rise-times anywhere in my literature.
Steve N.
Hi,
> And the receiver you use matters a LOT.
Not the way we (AMR/iFi) implement SPDIF, where we implement it at all. To me, as long as the bits are decoded correctly, that is all I need. Using the SPDIF clock for timing is suicidal; I prefer to decouple the two clock domains completely. Much better results than tinkering at the margins of transmission lines.
On the other hand, if we make a USB -> SPDIF converter for general use, then most if not all our customers have generic SPDIF receivers, usually in "textbook/datasheet" implementations with all that implies as well.
As all AMR and iFi digital products have decent USB inputs (and had since 2006), I see no need to make a USB -> SPDIF converter targeted at my own products.
So I design them to get the most from what most of our customers are likely to have, a rather different goal I would guess than your designs have.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
The S/PDIF receiver matters in a lot of designs, if your DAC has an S/PDIF input on it. Even if you re-clock after it using a PLL, the jitter coming from the receiver will matter. Only if you can slave the source device will it not matter, and that configuration is not common.
The receiver chip can generally also be used as a transmitter, generating the S/PDIF-encoded output from a USB converter or other server with S/PDIF output. This one matters a LOT.
Hi,
> The receiving S/PDIF receiver matters in a lot of designs,
> if your DAC has a S/PDIF input on it. Even if you re-clock
> after it using a PLL, the jitter coming from the receiver
> will matter.
Only morons use a PLL to clean up the jitter of another PLL. It does not work.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
"Using SPDIF clock for timing is suicidal, I prefer to decouple the two clock domains completely."
How do you deal with flow control / buffering / latency issues?
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
Hi,
> How do you deal with flow control / buffering / latency issues?
We use a buffer. It is just as long as absolutely necessary.
The details of the flow control, lock-on etc. I am afraid I consider a "trade secret". It is not the common PLL/DLL/DPLL approach, but something rather non-linear which combines a lot of interesting hardware and software solutions. It took a team of 5 on hardware and software (I was one, and arguably the lead) the better part of a year to get to production readiness, with more than 50% of the time devoted to that single project.
A lot of the solution is actually correctly defining the problem. This is where most attempts fail miserably.
In normal home playback the latency is immaterial (does not cause lip-sync issues on video). I personally feel it is still short enough for studio use as well.
During "steady state" playback I see a single clock speed update event maybe in 10 - 15 minutes, this usually a single step, around 0.04ppm per step, source PC. Initial lock takes around 10 mS, the system always settles in under 1-2 second to steady state.
After "steady state" is achieved the system is in essence totally static, there are non of the usual feed through issues due to the limited PLL Filter bandwidth etc. It is truly static, until the "fuzzy logic" determines that input clock variations accumulate a permanent change (drift), rather than random or patterned variation that eventually null out (jitter).
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
I've designed these types of systems in the past, namely the Pace-Car reclocker, which is out of production. It made a compensation about every 15 seconds, which was inaudible. However, in order to maintain the low jitter of a free-running clock, it had to be a free-running clock, pulled only a tiny amount from its nominal frequency to remain in sync.
The problem and PIA with this system was that it had to be fine-tuned to each source device. It turns out that many source devices vary in frequency quite a lot from nominal, particularly computer-based systems. People had to send me their AppleTV or whatever so I could tune the thing...
Short of doing this, I can't understand how you can be using a free-running clock, unless maybe you are using 2 clocks that are slightly higher and slightly lower in frequency than nominal and selecting between them synchronously. That must be it.
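To put a rough number on why source-frequency spread matters for any buffered scheme, here is a quick sketch (the 100ppm offset and 1024-sample headroom are illustrative assumptions, not figures from either poster):

```python
# How quickly a FIFO drifts when source and local clocks differ by some ppm offset.
SAMPLE_RATE_HZ = 44_100


def seconds_until_overflow(ppm_offset: float, headroom_samples: int) -> float:
    """Time until `headroom_samples` of slack is consumed at the given clock offset."""
    drift_samples_per_second = SAMPLE_RATE_HZ * ppm_offset * 1e-6
    return headroom_samples / drift_samples_per_second


# e.g. a computer source running 100 ppm fast, with 1024 samples of buffer headroom
print(f"{seconds_until_overflow(100, 1024):.0f} s until a correction is unavoidable")  # ~232 s
```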
Steve N.
Steve,
> I've designed these types of systems in the past,
Not really.
> namely the Pace-Car reclocker which is out of production.
> It made a compensation about every 15 seconds, which was inaudible.
I have not found such fast update rates completely inaudible. Nearly so, yes.
> However, in order to maintain the low jitter of a free-running
> clock, it had to be a free-running clock, which was pulled only
> a tiny amount from its nominal frequency to remain in sync.
Alas, there are clock technologies in these modern times that go FAR past the old style.
> The problem and PIA with this system was that it had to be
> fine-tuned to each source device. It turns out that many source
> devices vary in frequency quite a lot from nominal, particularly
> computer-based systems.
Not a problem my system has. It locks to any arbitrary clock frequency in the region from a few KHz to a few MHz, and the jitter is comparable to the better fixed-frequency clocks. Nice work, if you can get it (to work).
It required solving loads of discrete problems. For example, it works totally differently when locking and when operating in steady state, and has decision logic to determine when to shift between these two modes. But then acquiring a clock and solidly locking down a secondary clock domain are totally different problems that need totally different approaches.
> Short of doing this, I can't understand how you can be using a
> free-running clock, unless maybe you are using 2 clocks that
> are slightly higher and slightly lower in frequency than nominal
> and selecting between them synchronously. That must be it.
You got it all wrong, old man. That way you will never get this to work at all. VCXO/SAW etc. have crappy pull ranges AND crappy phase noise in a PLL or with a DAC driving them. Try moving into the 21st century.
As I wrote elsewhere, the key is in defining the problem correctly and then finding a technology that delivers the correct solution to the problem, rather than trying to bend the "usual same old same old we have been doing for decades" to somehow serve.
Don't start with a solution (either your own preference or what others have been doing [and often failing] at) and try to make it fit your problem.
Start with a blank piece of paper and fill it with whatever will solve your problem in theory and then find something that will fill the "box" on your flow diagram in reality. What you come up with may be very different to what we came up with, but it will deliver what our solution does.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
" It made a compensation about every 15 seconds, which was inaudible.
I have not found such fast update rates completely inaudible. Nearly so, yes"
I and my customers found it inaudible.
"maybe you are using 2 clocks that are slightly higher and slightly lower in frequency than nominal and selecting between them synchronously. That must be it.
You got it all wrong old man. That way you will never get this to work at all. VCXO/SAW etc. have crappy pull ranges AND crappy phase-noise in a PLL or with a DAC driving them."
Pull ranges? With 2 separate clocks, there is no "pulling", just selection between them. Both clocks are free-running.
Steve N.
"Pull ranges? With 2 separate clocks, there is no "pulling", just selection between them. Both clocks are free-running."
For this to be "inaudible" I assume you must be doing some clever timing of the precise moments where you switch. Also, the two clocks have to be close enough in frequency that pitch differences won't be audible to the most sensitive listener, yet far enough apart that the slower will be slower than the slowest usable source and the faster will be faster than the fastest usable source. Not much room to play with?
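For a rough sense of scale, the pitch error caused by a given clock offset is easy to express in cents (a sketch; the example offsets are arbitrary illustrations, not specifications of anyone's product):

```python
import math


def ppm_to_cents(ppm_offset: float) -> float:
    """Pitch shift, in cents, produced by a clock that is `ppm_offset` parts-per-million off."""
    return 1200.0 * math.log2(1.0 + ppm_offset * 1e-6)


for ppm in (50, 200, 1000):
    print(f"{ppm:5d} ppm -> {ppm_to_cents(ppm):.3f} cents")
```

Even a 1000ppm (0.1%) offset works out to under two cents of pitch shift, so the audibility constraint leaves rather more room than the requirement to bracket every real-world source.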
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
A few hundred kHz is enough. Like I said, I already did the "pull" solution and it worked with many devices without tuning, although not all.
Transitioning between two async clocks is the challenge in what I proposed. It requires some experience designing self-timed logic. Been there, done that. I even taught classes on it.
Steve N.
Hi Steve,
> Pull ranges? With 2 separate clocks, there is no "pulling", just
> selection between them. Both clocks are free-running.
You need a heck of a lot of clocks to cover the range from appx. 10 to appx. 50 MHz and do that in 0.004ppm steps.
We could do the NAIM trick of switching different load caps on different crystal oscillators and use several crystals with selected and defined frequencies. There are even ICs that allow very fine tuning of the crystal clocks in steps.
Or I could imagine using one fixed clock and one VCXO and using some clever mixing to generate harmonics that then are filtered and clipped to make a clock.
You could even combine these two techniques.
But that is not what we use.
You can keep guessing.
As I said before, instead of projecting solutions you are familiar with, start with a clean slate, define the requirements and then find the best solution.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
which is the important thing, and why some want correct BNCs.
Quote: "You may talk the talk; why don't you demonstrate that you also walk the walk?" Unquote.
Where have you walked the walk, other than asserting that the connector equation is not important?
Contrast this to iFi stressing the 90R terminations for your design of their USB devices as being 'important' (which I think it may well be). Are you now denying this?
Fred,
Let us be clear: for both SPDIF and USB, cable impedance and termination (in the technical sense of transmission-line termination, http://en.wikipedia.org/wiki/Electrical_termination ) matter greatly.
You can read more why here:
http://en.wikipedia.org/wiki/Reflections_of_signals_on_conducting_lines
Incorrect cable impedance or termination resistance will cause reflections that will corrupt the signal. A short or open circuit will reflect 100% of the signal. Less extreme mismatches reflect less signal.
Discontinuities in impedance at the connection points can also matter IF (and that is a big if) they are significant in length compared to the wavelength of the signal, or to the distance equivalent of the signal's rise time.
So we need to look at the signals.
SPDIF - uses Manchester coding, so appx. 3MHz (44.1KHz sample rate) to 12MHz (192KHz sample rate)
USB 2.0 at high speed - uses a complex bidirectional coding scheme, 480MHz maximum frequency
So USB at high speed contains 40 times the frequency of SPDIF at 192KHz, and thus its tolerance for impedance discontinuities along the line is 40 times smaller than SPDIF's.
For reference, at 12MHz we are dealing with a wavelength of 17.5 meters! The impedance discontinuity of an RCA plug is pretty much irrelevant at this frequency.
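Those wavelength figures are easy to reproduce (a sketch; the ~0.7 velocity factor for typical solid-dielectric coax is an assumption consistent with the 17.5m figure above):

```python
# Wavelength in coax for the highest SPDIF fundamental and for USB 2.0 high speed.
C_M_PER_S = 3.0e8        # speed of light in vacuum
VELOCITY_FACTOR = 0.7    # assumed typical for solid-dielectric coax

for name, f_hz in [("SPDIF @ 192KHz (~12 MHz)", 12e6),
                   ("USB 2.0 high speed (480 MHz)", 480e6)]:
    wavelength_m = VELOCITY_FACTOR * C_M_PER_S / f_hz
    print(f"{name:30s}: {wavelength_m:6.2f} m")   # ~17.5 m and ~0.44 m
```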
Measured on a TDR, a nominally 75R BNC connector can turn out to be actually 66 Ohm, causing a glitch of appx. 4.2mm total length.
RCA connectors (plug & socket) show an impedance of around 52 Ohm (this is rather more variable) and cause a glitch of around 12mm total length.
This translates to a glitch of far less than 1nS in either case, with an impedance mismatch of roughly 12% (BNC) and 30% (RCA) relative to 75R.
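The scale of those TDR figures can be cross-checked with the standard reflection-coefficient formula (a sketch; the impedances and glitch lengths are the ones quoted above, the ~0.7 velocity factor is an assumption). Note that the 12%/30% figures are impedance deviations from 75R; the reflected voltage fraction, computed below, is somewhat smaller:

```python
# Reflection coefficient and time-domain extent of the connector discontinuities above.
C_M_PER_S = 3.0e8
VELOCITY_FACTOR = 0.7    # assumed typical for coax
Z_LINE = 75.0            # nominal SPDIF system impedance


def reflection_coefficient(z_discontinuity: float, z_line: float = Z_LINE) -> float:
    return (z_discontinuity - z_line) / (z_discontinuity + z_line)


for name, z_ohm, length_mm in [("BNC", 66.0, 4.2), ("RCA", 52.0, 12.0)]:
    gamma = reflection_coefficient(z_ohm)
    transit_ps = (length_mm * 1e-3) / (VELOCITY_FACTOR * C_M_PER_S) * 1e12
    print(f"{name}: gamma = {gamma:+.2f} ({abs(gamma) * 100:.0f}% of incident voltage), "
          f"one-way transit ~{transit_ps:.0f} ps")
```

Either way the disturbance is a few tens of picoseconds long, which is why it comes out as far less than 1nS against a roughly 10nS receiver transition.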
Now, the transition time of the LMV7219 comparator (e.g. the SPDIF input often used with the ES9018) is on average around 10nS. This is much better than common SPDIF receivers, which are far slower.
If such a glitch arrives close to the signal edge that causes a transition at the receiving end, it will simply be swamped by the comparator's (or SPDIF receiver's) slow transition. If the reflections are significantly delayed, they simply do not matter.
So, bottom line: both BNC and RCA cause some reflections, and both are short enough in the time domain not to matter at SPDIF data rates, objectively speaking.
If you have objective data that proves different, make it public.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
with what is heard and seen. Who is now talking the talk?
We can't "hear" what you "hear". But we can "see" what you can see.
Is there something that you can show us?
Tony Lauck
"Diversity is the law of nature; no two entities in this universe are uniform." - P.R. Sarkar
Unfortunately my Tektronix 400MHz 'scope has just blown its power supply, so I cannot show anything right now. My historical paper records are in a different country as well.
But any serious and experienced audiophile with technical skills will know that correct termination and cabling are important in getting the best sound out of digital replay, especially for SPDIF connections.
There is no point in switching arguments between connectors and terminations, as has happened in my exchanges with Thorsten.
Fred,
> There is no point in switching arguments between connectors and
> terminations as is the case in my exchanges with Thorsten.
Termination is a clearly defined technical term in the context of high-frequency cabling, as is its use. If you change that definition and usage when it suits you, then you are the one switching arguments, or deliberately obfuscating things.
400MHz gives you around 2.5 nS time domain resolution.
The glitches caused by the RCA plug/socket together are a fifth of that. There is no way you can actually see these glitches unless you get much better gear; neither my standard set of 'scopes nor your "the dog ate my homework" Tek with the blown power supply can show the differences between RCA & BNC connectors.
If you actually see any differences using this gear, you have either changed more than just the connectors or the method has a fundamental flaw. You should really check your work.
You can ask Steve to help. He has the instrumentation needed.
Ciao T
Sometimes I'd like to be the water
sometimes shallow, sometimes wild.
Born high in the mountains,
even the seas would be mine.
(Translated from the song "Aus der ferne" by City)
Some people can supposedly hear a gnat's fart but can't prove that a gnat was ever present. I suppose others can claim to hear insignificant impedance differences even though they make no difference at all in the grand scheme of things.
At the relatively low SPDIF frequencies, we know that the VSWR mismatch between an RCA and a BNC is insignificant, especially since we don't know that there is constant impedance across the entire point-A-to-point-B circuitry AND... we're not talking microwave RF transmission lines here. ;-)
But hey, some people can claim to hear the difference between a black rock placed on their electronics vs a white rock. Go figure!
"At the relatively low spdif frequencies, we know that the VSWR mismatch between an RCA and BNC is insignificant"How do we know that?
Its totally wrong wrong wrong. This has little to do with the signal frequency. It has everything to do with the risetime. Risetimes can be 25 nsec or they can be 300 psec depending on the design. Both work, but the 300psec will deliver lower jitter. It will also be more sensitive to impedance discontinuities. I've been doing digital design and a lot of transmission-lines for 37 years. I've designed communication networks for massively parallel supercomputers and computer mainframe peripherals for the largest computer companies on the planet. I know what I'm talking about.
Steve N.
One of the points is that we do not know there is impedance continuity from point A to point B, and at these low frequencies the designer is probably not overly concerned with it. So sticking a BNC at an end point in place of an RCA isn't going to magically make things sound better.
Constant impedance and termination become much more critical into hundreds of MHz, UHF, and microwaves... not so much at 3 MHz. But yes, I understand about rise time.
This is digital, and therefore the rise time is the important metric to be concerned with for impedance discontinuities in a transmission line.