Audio Asylum Thread Printer
Get a view of an entire thread on one page
I picked up a new Onkyo M5010 stereo power amp from Best Buy ($250 plus state sales tax).
One channel has 15 mV and the other channel has 32 mV DC at the output.
The source has a DC blocking capacitor at its output, so the DC measured at the power amp's output is the amp's own offset.
Is 32 mV an acceptable level of DC offset for a hi-fi amp (well, that model is not really hi-fi)?
I have seen catalogue specs for some hi-fi amps listing DC offset of less than 10 mV (Quad, Creek, BGW, etc.).
In the case of a Parasound Halo amp, the manual claims 0.00 V, meaning the DC offset at the output is less than 5 mV.
There is no DC adjust.
If the DC exceeded 100 mV, I would consider replacing the differential input pair for that channel.
Test it with no load and after it has warmed up for about 5 minutes.
Honestly, even my cheaper amps can be set to 0 mV. Sounds like some poor QC on the part of the factory line person making that adjustment.
Check it while the stereo is "cold" and then again after it has been on for an hour. If there is a significant swing, you've got some problems.
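The cold/warm check and the thresholds floated in this thread can be sketched as a simple script. The 10 mV, 100 mV, and 20 mV drift limits below are assumptions drawn from the numbers posters mention, not manufacturer specs:

```python
def classify_offset(mv):
    """Classify a measured DC offset (in mV) against thresholds
    assumed from this thread: <10 mV typical hi-fi spec,
    <100 mV livable, above that worth investigating."""
    mv = abs(mv)
    if mv < 10:
        return "fine"
    if mv < 100:
        return "acceptable"
    return "investigate"

def drift_significant(cold_mv, warm_mv, limit_mv=20):
    """Flag a cold-to-warm swing larger than an assumed 20 mV limit."""
    return abs(warm_mv - cold_mv) > limit_mv
```

For example, the 32 mV channel reported above would classify as "acceptable", and a reading that climbs from 32 mV cold to 40 mV warm would not count as a significant swing under the assumed limit.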
I found a service manual on the internet. I could find what seems to be a bias adjustment pot, but no DC offset adjustment pot.
The Adcom GFA-1 has a protection circuit that kicks in at 50 mV. I could live with 100 millivolts.
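As a sanity check on why 100 mV is livable: the DC power an offset dissipates in the speaker is P = V²/R, which is tiny at these levels. A quick sketch, assuming a nominal 8 Ω load:

```python
def dc_power_mw(offset_v, load_ohms=8.0):
    """DC power dissipated in the load from an offset voltage,
    P = V^2 / R, returned in milliwatts. 8 ohms is an assumed
    nominal speaker impedance."""
    return offset_v ** 2 / load_ohms * 1000.0

# 32 mV into 8 ohms -> 0.128 mW; 100 mV -> 1.25 mW.
```

Either figure is orders of magnitude below what would heat a voice coil, which is why the concern here is really about drift and what the offset says about the input stage, not about speaker damage.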