Home Propeller Head Plaza

Technical and scientific discussion of amps, cables and other topics.

Re: Reducing Noise by Choosing Impedances

John Curl wrote:
Charles, I wish to say that your 'second guessing' amp design only confuses the situation. Do you see a THX logo on your amp? This means that the amp follows THX guidelines. One guideline is the voltage gain of the amp must be calibrated to: .1V=1W or 1V=100W. We have no choice in this.

Thanks for spelling this out! I knew that THX required a specific gain, but I had always heard it in terms of dB voltage gain. Now I see the intuitive formula behind it.
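For anyone who wants to see the arithmetic, the 1V=100W calibration pins down the amp's voltage gain once you assume a nominal 8-ohm load (the load impedance is my assumption here; the figures above don't state it). A quick sketch:

```python
import math

# THX calibration: 1 V input -> 100 W output, assuming a nominal 8-ohm load.
P_out = 100.0    # watts
R_load = 8.0     # ohms (assumed nominal impedance)
V_in = 1.0       # volts

V_out = math.sqrt(P_out * R_load)    # RMS output voltage needed for 100 W
gain = V_out / V_in                  # linear voltage gain
gain_db = 20 * math.log10(gain)      # the same gain expressed in dB

print(f"V_out = {V_out:.2f} V, gain = {gain:.2f}x = {gain_db:.1f} dB")
```

This works out to about 28.3x, i.e. roughly 29 dB — which matches the dB figure usually quoted for THX gain. Note that 0.1V=1W gives the same ratio (sqrt(8)/0.1 ≈ 28.3), so the two calibration points are consistent.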

Thanks also for the additional info about my amp! So there's also a 1M ohm resistor load behind the pot... When do shunt resistors add noise?

John, I believe you are one of the best amplifier designers. Within established constraints, you do better than anyone, and I understand that adhering to established constraints (standards and conventions) is essential for commercial success.

However, here in my house, I am under no obligation to follow established standards. I do look for the THX label when buying HT receivers to avoid getting garbage. But when buying or modifying a hifi amplifier, I don't need to. (In fact, given the choice of THX or "designed by John Curl", I would choose the latter.) For hifi stereo amplification, the only standards I need to adhere to here are my own (CPX ?).

And I wonder how THX decided on 1V for 100 watts. If anyone knows, I'd like to hear it. But to me, it simply seems very much like the codification of bad practices and conventions coming out of the distant past, like the RCA connector. I believe it would make a lot more sense for 100 watts output (which in practice is very rarely needed) to be achieved with a much higher input voltage level. As I argued before, the typical "average" level at a power amplifier's input is on the order of 10mV to 100mV. In an electrically noisy environment, with conventional circuitry (transistor, and especially tube), it is hard to get hum and noise comfortably below an average level of 100mV, let alone 10mV. Also, there is no good reason to cap the effective "maximum" level at 1 or 2 volts: it is no particular trouble to achieve 8 volts even with cheap opamps, let alone tubes. This extra headroom is wasted, and the result is unnecessarily high hum and noise.

You might argue that this is unimportant, but I beg to differ. With single-ended connections and these low-level interface voltage standards, it's hard to avoid hum and noise that are easily audible when you get close to a speaker.

This would be significantly reduced with my proposed standard, which would be something more like 8V for 400W into 8 ohms. (Based on the reasoning: you'll hardly ever need 400W, even if it's available, which is pretty unlikely, and on the other hand, it's pretty easy to achieve 8V input even with cheap circuits. High end equipment can go to 16V for even more headroom.)

Sure, I could reduce hum and noise by getting: a balanced preamp (actually, I have one already), a balanced crossover (readily available in pro audio, but quite rare built to audiophile standards), and a balanced amplifier. All that would cost a lot more money. I believe I can get a similar improvement simply by removing two pots and putting in a -9dB low-resistance voltage divider (5K or less total resistance) at the input of my amp. In one fell swoop, I'll improve S/N by about 9dB. The only thing I'll lose is a little superfluous line-level headroom.
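The divider values follow directly from those two figures (-9dB, 5K total). A minimal sketch of the arithmetic — the exact resistor choice would of course depend on the actual amp's input circuit:

```python
import math

# -9 dB resistive divider with 5 kohm total series resistance
# (both figures from the post above; everything else is derived).
atten_db = -9.0
r_total = 5000.0                   # ohms, top leg + bottom leg

ratio = 10 ** (atten_db / 20)      # fraction of input voltage passed on, ~0.355
r_bottom = r_total * ratio         # shunt (bottom) leg
r_top = r_total - r_bottom         # series (top) leg

# Source impedance the amp input sees: the two legs in parallel.
z_out = r_top * r_bottom / r_total

print(f"ratio    = {ratio:.3f}")
print(f"r_top    ~= {r_top:.0f} ohm")
print(f"r_bottom ~= {r_bottom:.0f} ohm")
print(f"z_out    ~= {z_out:.0f} ohm")
```

This gives roughly 3.2K over 1.8K; the nearest standard E24 pair (3.3K over 1.8K) lands within about 0.05 dB of the -9dB target, and the divider's roughly 1.1K output impedance keeps its own thermal-noise contribution small.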

Charles



