Audio Asylum Thread Printer
Hello all,
I have a Thorens TD-124 with a Shure M3D cartridge, which has an output impedance of 47K ohms. It feeds a Hagerman "Bugle" phono stage with an input impedance of 47K ohms and an output impedance of 330 ohms. This is followed by a DIY version of the Decware Taboo amplifier, which has an input impedance of 47K ohms. My question is this: should I change the input impedance of the Taboo amp to 330 ohms? Thanks in advance.
Jim
Follow Ups:
Keeping the load impedance well above (say 4-5 times) the source impedance is good practice when coupling VOLTAGE amplifier stages (which is what we're talking about here). This ensures a maximum UNDISTORTED output voltage, and greater linearity in general. Remember, we are not trying to develop any real amount of power (lots of current) here, just to cleanly and quietly amplify voltages. Matching loads (such as loudspeakers) to POWER amplifiers is a different story: much smaller impedance ratios, with some large variations depending on the nature of the power stage (solid state, pentode, triode, etc.) -- another discussion.

Just remember, it is good to have the load resistance higher than the source resistance in voltage amplifying stages -- NOT vice-versa.
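To make the voltage-divider reasoning above concrete, here is a small Python sketch (the numbers are illustrative, taken from the figures quoted in this thread, not from any spec sheet): the fraction of the source voltage that actually reaches the load is Z_load / (Z_src + Z_load), so a high load over a low source loses almost nothing.

```python
# A source with output impedance z_src driving a stage with input
# impedance z_load forms a voltage divider: the load sees
# z_load / (z_src + z_load) of the source voltage.
def transfer(z_src, z_load):
    return z_load / (z_src + z_load)

# Bridging: load much higher than source -> almost no voltage lost.
print(transfer(330, 47_000))   # ~0.993 of the signal survives
# "Matched" impedances -> only half the voltage reaches the load.
print(transfer(330, 330))      # 0.5
```

This is why the rule of thumb is stated as a ratio: once the load is several times the source, the divider loss becomes negligible.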
T.M.
Am I understanding your response correctly? Are you saying that the input impedance of the Bugle phono stage should be 4-5 times 47K (the output impedance of the Shure M3D)? Thanks for responding.
Jim
I question whether the output impedance numbers quoted by most audio manufacturers do in fact meet the definition for such. It's been my experience that the numbers just as frequently represent the manufacturer's recommended load impedance. Not the same thing at all, of course.
Normally it does no harm to feed a low impedance into a higher one. There should be no need to change the input impedance of your amp.
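For the specific chain in the original question, the point can be put in dB. A quick sketch (assuming the quoted figures: the Bugle's 330 ohm output into the Taboo's 47K input):

```python
import math

def insertion_loss_db(z_src, z_load):
    """Voltage loss, in dB, from the z_src/z_load divider."""
    return 20 * math.log10(z_load / (z_src + z_load))

# Bugle (330 ohm out) driving the Taboo's stock 47K input:
print(insertion_loss_db(330, 47_000))   # about -0.06 dB: negligible
# If the Taboo's input were changed to 330 ohm as proposed:
print(insertion_loss_db(330, 330))      # about -6 dB: half the signal gone
```

So leaving the amp's input at 47K is not just harmless, it is the better choice.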
Thank you!
Jim