Audio Asylum Thread Printer
In Reply to: Re: Jitter at what point in the chain? posted by Charles Hansen on March 4, 2007 at 10:16:29:
Howdy
I guess I didn't note the irony in your previous posts :)
I was flabbergasted when I changed the power cord on my SACD transport (at that time a modified Philips SACD 1000) and on the other end of fiber optic cables my DAC6e produced louder and more defined bass. What the heck?
A plausible explanation? You're the hardware guy, not me :) I still don't know if you are serious or not, but looking at digital signals with a scope and seeing how analog things really are, you definitely see things like input clocks and signals affecting local ground levels, which in turn affect output clocks and signals on chips...
< < you definitely see things like input clocks and signals affecting local ground levels which in turn affect output clocks and signals on chips... > >
So when the two boxes are connected with fiber-optic links, and the clock is in the DAC (running upstream to the transport), how in the heck can you explain that changing power cords on the transport could make any difference in the jitter?
I honestly can't think of any jitter-related explanation. So something is clearly going on here, but nobody knows what. At this point, I think that Peter Belt's explanations make as much sense as anybody else's...
Howdy
It's the other way around. The differences in the transport (power cords, mechanical isolation, etc.) cause differences in jitter in the output clock(s) and data from the transport. This jitter in the optical cable is the mechanism for transmitting these differences in the transport to the DAC...
We seem to be talking about two different things. I am talking about what some people call "Clock Link". The master clock is in the DAC box. It feeds the DAC chip directly. A copy of this clock is sent upstream to the transport box. The incoming data stream from the transport is reclocked by the master clock in the DAC box.
A block diagram of this is linked below.
If the cables between the two boxes are optical cables, it would seem that there is no way for any jitter from the transport to make it past the re-clocking module (FIFO) in the DAC box.
Please give a plausible explanation for how jitter from the transport makes it to the DAC chip in this case.
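The expectation behind that argument can be sketched as an idealized toy model (my own illustration, with made-up function names and clock rate, not anything from the actual products): if the FIFO is deep enough that it never under- or over-runs, the output edge times come from the clean clock alone.

```python
# Idealized FIFO reclocker (a toy sketch under ideal assumptions, not a
# real design): samples arrive at jittered times, but are read out
# strictly on clean master-clock edges. In this ideal model, arrival
# timing is discarded entirely, so no input jitter reaches the output.

CLOCK_PERIOD_NS = 10.0  # hypothetical 100 MHz master clock

def reclock(arrival_times_ns, samples):
    """Pair each sample with an ideal output clock edge."""
    # A deep-enough FIFO preserves order but forgets arrival times.
    return [(i * CLOCK_PERIOD_NS, s) for i, s in enumerate(samples)]

# Input edges wander by up to a nanosecond; output edges are uniform:
jittered_arrivals = [0.0, 10.8, 19.4, 31.0]
out = reclock(jittered_arrivals, [1, 0, 1, 1])
# out == [(0.0, 1), (10.0, 0), (20.0, 1), (30.0, 1)]
```

The rest of the thread is about why real flip-flops and FIFOs fall short of this ideal.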
Howdy
Well, that's exactly what I'm trying to talk about (except that EMM Labs uses two return cables instead of S/PDIF for the returning clock and data, avoiding the obvious problems with jitter in S/PDIF decoders.)
Note the legend in your diagram (nice diagram, BTW), especially "dirty clock". The jitter I'm talking about is the dirt on the dirty clock and S/PDIF signal. Note in particular that your diagram is correct in that the input to the DAC isn't the clean clock: it's been polluted by the dirty clock and data coming into the reclocking circuit. That pollution is caused by things like local ground bounce that I was trying to talk about. See the links to various papers in the posts below Christine's post.
< < nice diagram, BTW > >
There is actually an error in the diagram. The blue line (dirty clock) connecting the reclocking module to the DAC module should be violet (clean clock).
But my point is the same. You could keep doing things to keep the "dirt" out of the DAC box. For example:
a) Use separate clock and data lines (like the Meitner) instead of S/PDIF.
b) Use FIFOs to get rid of incoming "dirt".
c) Use opto-isolators to eliminate any ground loops or ground bounce from the incoming "dirty" signal.
d) Re-clock the "dirty" clock leaving the transport box with the "clean" clock coming from the DAC box.
And any other number of things you can think of. But no matter what people have tried, they have not been able to eliminate the sonic effect of tweaks on the transport.
My conclusion is that there is something else going on here that we don't understand yet besides simple jitter issues. It appears that you believe there is still some "secret" path for the introduction of jitter into the master clock for the DAC chip.
If you will allow me to interject here, there IS a secret jitter path: it's the reclocker itself. So far nobody has come up with a reclocker that is completely immune to the jitter on the DATA input.
For reclocking everybody is using a CMOS flip-flop. In any logic chip, all the inputs cause current spikes on the VDD and VSS (power and ground) internal traces in the chip, package, etc. These current spikes cause voltage drops across the resistances and inductances of the traces. The "threshold" where the chip senses the change in the clock signal (it's actually much more complicated than that, but for simplicity's sake I'll call it a simple threshold) varies with these voltages, so the point at which the flop sends the value through will shift in time based on these signal-induced variations in on-chip power. This is jitter.
Now you may think "well, if the input signal doesn't switch anywhere near the clock edge it doesn't matter." That is quite true, BUT the internal spikes frequently excite resonances in the chip which cause ringing on the rails that lasts quite a bit longer than the transition of the signal. The upshot is that reclocking fast-changing signals (such as highly oversampled or upsampled signals) is not good. The slower the signals being reclocked, the better. Going to faster logic just makes it worse; it causes even more internal ringing.
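A back-of-envelope sketch of that threshold mechanism (my own illustrative numbers, not measurements from any real part):

```python
# If rail noise moves the effective sensing threshold by dV, then an
# edge with slew rate dV/dt crosses that threshold earlier or later by
# dt = dV / (dV/dt). That timing wobble is jitter. Numbers illustrative.

def crossing_shift_ps(slew_v_per_ns, threshold_shift_mv):
    """Shift of the threshold-crossing time, in picoseconds."""
    return (threshold_shift_mv / 1000.0) / slew_v_per_ns * 1000.0

# A 1 V/ns clock edge with 50 mV of rail-induced threshold wobble
# shifts its crossing point by about 50 ps:
shift = crossing_shift_ps(1.0, 50.0)
```

The same arithmetic shows why a slower clock edge (smaller slew rate) turns a given amount of rail noise into more timing error.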
Then there is how you do the reclocking. If you use a chip with multiple flops per package, and use it to reclock multiple signals, the jitter from all those inputs combines. The best is to use a single flop per package, in a very tiny package, to cut down on the package inductance.
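The package-inductance point can be put in rough numbers (illustrative values, not measured from any particular device):

```python
# Ground bounce is V = L * di/dt: a current spike di over a time dt
# through the package inductance L lifts the chip's internal ground.
# All numbers here are illustrative.

def ground_bounce_mv(inductance_nh, delta_i_ma, delta_t_ps):
    """Bounce voltage in millivolts for a current step di over time dt."""
    l = inductance_nh * 1e-9    # henries
    di = delta_i_ma * 1e-3      # amperes
    dt = delta_t_ps * 1e-12     # seconds
    return (l * di / dt) * 1e3  # volts -> millivolts

# 1 nH of package inductance and a 10 mA spike in 100 ps gives on the
# order of 100 mV of bounce; halving L halves the bounce, which is the
# argument for tiny single-flop packages.
bounce = ground_bounce_mv(1.0, 10.0, 100.0)
```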
OK, so what about feeding the ultra-low-jitter clock directly to the DAC and not reclocking the other signals? Well, you wind up with the same issues inside the DAC chip itself.
The best way is to do both: run the slowest signals you can get through a reclocker AND feed the direct clock signal into the DAC. Of course this assumes synchronous reclocking (no PLLs, etc.), with the low-jitter clock right next to the DAC chip and controlling the rest of the system.
Note that the above says absolutely nothing about galvanic isolation from the source; this is about jitter on the data lines, which will pass through any type of isolation scheme. As a matter of fact, most isolation schemes actually increase the jitter on the signals, which is why I don't use them in my system.
So all these DACs that are running super upsampled, multi whatever high speed signals (which sound impressive in marketing blurbs) actually make the system MORE sensitive to jitter than the old slow systems.
And don't get me started on jitter and FIFOs; this has gone on long enough. But I hope you get the idea: there are REAL, measurable ways jitter can still get through a synchronous resampled system.
If the designer takes all this into account and does everything just right, the amount of jitter that can get through is pretty small, but it's still definitely there.
Howdy
But that is exactly my point: that isn't a mistake, it's reality. A FIFO doesn't make things perfect, tho it usually helps. A FIFO is a jitter filter, not a jitter eliminator. The point the author of the diagram ( http://audio.peufeu.com/node/7 ) is trying to make is that using a DAC-sourced clock to drive the transport, and also to reclock the returning data, helps to solve a lot of the S/PDIF problems (but does not eliminate them entirely.)
Anyway I don't believe in gremlins and it's obvious to me that jitter is the cause of the problems that you are attributing to gremlins :)
I was encouraged, as I looked for the links I gave, to see that more and more universities are adding sections about jitter to their EE courses. I was taught a little about it back in the late 70's, but I haven't run across too many hardware people with formal training in jitter mitigation.
< < A FIFO is a jitter filter not a jitter eliminator > >
So just to make sure we are on the same page here, please explain to me what you think is the mechanism for jitter to be passed through an *asynchronous* FIFO.
< < I don't believe in gremlins > >
I didn't used to either. But I've seen so many things make a clearly audible difference that (according to all currently understood electronic theory) *shouldn't* make a difference, that I've resigned myself to the fact that there are things we just don't understand yet. If you want to call those "gremlins", that is certainly your prerogative.
Howdy
I'm not sure why you highlight asynchronous in this case, but as I've stated before: things like local ground levels being affected by the input clock and data, which in turn affect the output clock's levels/timing.
Anyway we've beaten this to death, I'm just as baffled about your point of view as you undoubtedly are about mine :)
Perhaps this is similar to the situation where some claim that cables can't make a difference. It's clear to me that they do, and I believe that in principle we can explain those differences, but in practice predicting the sound of a given cable with, say, a given amp and given speakers is fairly intractable, especially since we don't have very accurate models, let alone knowledge of how specific measurable features sound. Still, I don't believe it's magic, just difficult.
< < I'm just as baffled about your point of view as you undoubtedly are about mine > >
Referring to the block diagram I previously linked to, I guess what you are saying is that there is NO WAY to make the master clock immune to jitter on the incoming data signal, and that it is this transfer of jitter that causes various transports to sound different.
And I guess what I am saying is that it IS possible to make that master clock be totally immune to jitter on the incoming data signal. And that furthermore, we would still be able to hear the differences between transports because there is some other factor involved besides simple jitter.
I can't really think of a way to test either theory. All I can say is that whenever I found a really good way to test things that didn't make any sense whatsoever, they still made a difference. To the point that I'm pretty sure there are *lots* of things we still don't understand. Hence my original statement that I wish the "answer" were as simple as "jitter".
Thanks for keeping things civil!
Howdy
"Thanks for keeping things civil!" and to you too. I sometimes dread reading and responding to posts when I'm challenged. I thank you especially for your last post and I agree with your synopsis of our positions :)
*** Note in particular that your diagram is correct in that the input to the DAC isn't the clean clock: it's been polluted by the dirty clock and data coming into the reclocking circuit. ***
Well done!
Arguably even the "clean clock" lines are polluted by ground bounce and power rail fluctuations. Maybe "cleaner clock" is a better term :-)
I quite like the diagram, and yes, it's a good way of synchronizing the clock between transport and DAC using dual S/PDIF in/out lines. Didn't Sony do something similar a while ago (also using S/PDIF)?
I read a very plausible explanation, but unfortunately can't find the link, otherwise I would include it here.
Basically, as we all know, logic transitions in digital circuits are edge-triggered through level changes. I.e. the precise point in time at which a 1 becomes a 0 is approximately halfway through a voltage transition from, say, 5V to 0V.
Therefore the precise timing of logic transitions is heavily influenced by the stability of the voltage rails.
So the theory is that anything that can cause micro voltage fluctuations can cause jitter: even things like power cords, connecting cables from one device to another, variations in the speed of a fan or motor, etc.
Anyway, the theory seems plausible to me. The link which I can't find at the moment provides some empirical data around typical voltage fluctuations caused by logic induced modulation, and corresponding impact on jitter.
Howdy
http://members.chello.nl/~m.heijligers/DAChtml/PLL/PLL1.htm
http://www.xilinx.com/xlnx/xweb/xil_tx_display.jsp?sTechX_ID=al_vias
States the case reasonably well, without going into too much jargon. Jon Risch is a poster around here, perhaps he can chime in as well.
I am not an expert on switching power, but it just occurred to me that it may be possible to design a switching supply that tries to minimise logic induced modulation and back EMF (for example, by altering the switching frequency).
Any comments?
May help explain why some designers are using switching power supplies in high end players.
Howdy
I'm no expert there either, but it doesn't seem likely to me. Good ground planes, trace routing, trace impedance management, etc. as well as local power supply filtering and appropriate bypass caps, etc. make all the difference. (Once again I'm talking as a software guy that spent too much time in the lab helping with system debugging, not as a hardware guy.)
Well, the reason I'm speculating is based on some comments from Bruce Candy regarding the design of the Halcro amps.
Anyway, I'm just wondering whether it's possible to design a switching power supply that switches at a rate synchronized to the master clock, and draws power from different parts of the cycle for different parts of the circuit. Some of the more sophisticated PC power supplies already do similar things to stabilise the load to the CPU vs the graphics card.
I wouldn't like to be the person doing such a design - I don't need the headache!
Howdy
Yep, syncing the switching supply freq to the important local clocks can be useful (for example we did it in a video monitor to hide the power supply noise in the retraces.)
But if the switching supply freq isn't perfectly synced with the clock(s) in question, you have to worry about beating, and sometimes it's hard to vary it as fast as you might need to if the incoming clock rate is changing... Also, you obviously need to handle the edge cases, like making sure that unplugging the DAC input doesn't cause the power supply to shut down :)
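The beating worry is just difference-frequency arithmetic (illustrative frequencies, not taken from any real product):

```python
# Two periodic sources at f1 and f2 intermodulate into products at the
# difference frequency |f1 - f2|, which can land squarely in the audio
# band when a free-running switcher sits near a clock-related tone.

def beat_hz(f1_hz, f2_hz):
    return abs(f1_hz - f2_hz)

# A 100 kHz switcher against a 101 kHz clock-related tone beats at
# 1 kHz, right in the most audible range:
beat = beat_hz(101_000.0, 100_000.0)
```

Locking the two together forces the difference to zero, which is why syncing the supply to the clock helps.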
Howdy
I found this doing a quick search, and tho I'm pretty sure it isn't the article you were talking about, it has some practical jitter info. (Don't forget to go to the second page, where it talks more about some grounding issues, including the ground bounce I alluded to in this thread.)
http://www.elecdesign.com/Articles/Index.cfm?AD=1&ArticleID=4476
That wasn't the article I was thinking about, but it's a good article nevertheless.
The interesting thing about logic induced modulation is that it explains why there is jitter coming out of an optical drive, even though, as some people love to point out, there's a RAM buffer in there.
First of all, the memory is probably clocked at the same rate as the chip controlling the motor and laser, which is usually not a multiple of the audio clock. The clock source itself is probably synthesized through PLL. It's also likely to share power with the mechanical components of the drive. Result: jitter.
It also explains why a PC is typically not a good place to do audio. It's possible to get good audio (and let's face it, most studios these days use PCs for DAWs), but some care needs to be taken.