I've had my M80s for 3 weeks now. Tonight, being alone, I finally got an opportunity to push them to see how loud they could get. I have the Denon AVR-890, which is rated at 105W/channel. Depending on what I was playing, the sound started distorting anywhere from +3dB to +8dB on the volume dial. In a previous thread I asked why some people use an external amp (usually a more powerful one) instead of their AVR's built-in amp: what the benefit was, and whether it was to prevent distortion at high levels. JohnK was kind enough to respond with the following:
Bruno, there's no audible difference (including distortion) when an amplifier with a higher maximum output capacity is used unless that higher max is actually used; unused headroom is simply that: unused.
For a comfortably loud average level in the mid 80s of dB level, only about 1 watt is needed. A 1000 watt amplifier won't be different from a 10 watt amplifier there. Brief split-second peaks will use much more, of course, but not likely anything that typical receivers (such as your 890) rated anywhere in the 100 watt area can't handle with audible transparency. You, not the amplifier, are in control of the loudness level.
So now my question is: if 100W/ch is enough, why are my M80s starting to distort? My receiver's volume goes up to +18dB; why can't I push them to that level without distortion?
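To put rough numbers on this (just a sketch, assuming the usual rule of thumb that each +3dB of level requires roughly double the amplifier power, and using a purely hypothetical reference wattage for the 0dB dial setting):

```python
# Rule of thumb: power scales as 10^(dB/10), so +3 dB ~ 2x power,
# +10 dB = 10x power. "reference_watts" is a hypothetical figure for
# how much power peaks draw at the 0 dB setting on the volume dial.
def required_power(reference_watts, db_boost):
    """Watts needed to play db_boost dB louder than the reference level."""
    return reference_watts * 10 ** (db_boost / 10)

# If peaks at 0 dB on the dial already drew, say, 50 W (hypothetical),
# a ~105 W/ch receiver runs out of clean power right around +3 dB:
print(round(required_power(50, 3)))    # ~100 W
print(round(required_power(50, 18)))   # ~3155 W, far beyond any AVR
```

So the dial going to +18dB doesn't mean the amp can deliver the power that +18dB would demand; clean output runs out much earlier, and what you hear past that point is clipping.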
Don't get me wrong, +3dB is more than I need for my room size, and I will very rarely have them at that level; however, if I had a much bigger room, it would be a problem.
Also, I played at +5dB for about 45 minutes (neighbor, if you're reading this, I apologize) and even though the receiver didn't shut off, it got very hot. Is this normal?