Regrettably, the answer is "depends on the receiver".

For any given input signal and level setting, an amplifier tries to drive the speaker at a constant voltage -- constant in the sense that with two slightly different speakers the voltage would be the same with each one, even if their impedances differed slightly.

Note that because we're talking about AC signals and music, the voltage isn't really "constant", just "predictable and repeatable". We're also talking about AC impedance, not DC resistance... in other words I'm simplifying all over the place, but it doesn't affect the outcome.

If the receiver is rated at 100 W RMS per channel, then it can put out enough voltage to drive 100 watts into 8 ohms. Since power = voltage times current and current = voltage over resistance (impedance), power = V squared over R. So 100 = V squared over 8, V squared = 800, V = about 28 volts RMS (call it 40 volts peak, 80 volts peak-to-peak). In case you're interested, the current required here would be 28/8, or about 3.5 amps RMS.
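If you want to play with the numbers, here's the same arithmetic as a little Python sketch (the 100 W and 8 ohm figures are just the example above):

import math

power = 100.0      # rated output in watts RMS
impedance = 8.0    # nominal speaker impedance in ohms

v_rms = math.sqrt(power * impedance)   # from P = V^2 / R
v_peak = v_rms * math.sqrt(2)          # peak of a sine wave at that RMS level
i_rms = v_rms / impedance              # from I = V / R

print(f"{v_rms:.1f} V RMS, {v_peak:.1f} V peak, {i_rms:.1f} A RMS")
# prints: 28.3 V RMS, 40.0 V peak, 3.5 A RMS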

Note that a good amplifier will actually be able to put out quite a bit more voltage AND quite a bit more current in order to properly handle peaks and transients... but the distortion will probably be higher, and the amplifier will overheat if you try to draw that extra voltage or current continuously.

Replace the 8 ohm speaker with a 4 ohm speaker and what happens? The amplifier puts out the same constant voltage, so instead of (28 squared / 8) or 100 watts you get (28 squared / 4) or 200 watts -- ASSUMING THE RECEIVER CAN PUT OUT ENOUGH CURRENT TO MAINTAIN THAT CONSTANT VOLTAGE.
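Here's that comparison as a quick Python sketch, holding the voltage fixed at the 28 volt figure from above:

v_rms = 28.3   # the amp tries to hold this voltage regardless of the load

for impedance in (8.0, 4.0):
    power = v_rms ** 2 / impedance   # P = V^2 / R
    current = v_rms / impedance      # I = V / R
    print(f"{impedance:g} ohms: {power:.0f} W, {current:.1f} A RMS")
# 8 ohms: 100 W, 3.5 A RMS
# 4 ohms: 200 W, 7.1 A RMS -- if the amp can actually source that current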

Here's where the fun starts. Most receivers can't put out enough current to drive the same voltage into 4 ohms as they did into 8 ohms... and the ones that can will overheat if you try to run at 200 W output instead of 100 W for very long. Part of the problem is that the energy lost in a wire (or in an amplifier's output stage) goes up with the square of the current, not with the voltage -- this is why long distance power lines step the voltage up by a factor of 1000 or more, cutting the current by the same factor and the resistive losses by the square of that factor.
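A quick sanity check on that square-of-the-current point, using completely made-up numbers for the line resistance and delivered power:

line_resistance = 10.0     # ohms, made-up figure for a long transmission line
power_delivered = 1.0e6    # 1 MW delivered to the load, also made up

for volts in (10.0e3, 10.0e6):   # step the voltage up by 1000x
    current = power_delivered / volts
    loss = current ** 2 * line_resistance   # I^2 * R
    print(f"{volts:.0e} V: {current:.1f} A, {loss:.1f} W lost in the line")
# 1000x less current, 1,000,000x less I^2*R loss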

In other words, 100 W into 8 ohms is about 28 volts and 3.5 amps, while 100 W into 4 ohms is 20 volts and 5 amps. That's roughly 40% more current, which by that square-of-the-current rule means roughly twice the heat wasted in the output stages of the power amp, so it runs hotter as a result.
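The same arithmetic in Python (a rough sketch -- treating the output-stage loss as simply proportional to current squared is a simplification, but it shows the trend):

import math

power = 100.0
i_8 = math.sqrt(power / 8.0)   # I = sqrt(P/R) into 8 ohms
i_4 = math.sqrt(power / 4.0)   # I = sqrt(P/R) into 4 ohms

ratio = i_4 / i_8
print(f"{ratio:.2f}x the current, about {ratio**2:.1f}x the resistive loss")
# prints: 1.41x the current, about 2.0x the resistive loss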

Receivers that have enough current and power reserves will typically put out 50% more power into a 4 ohm load than their 8 ohm rating, but will run a bit hotter as a result. Cheaper receivers will overheat and crap out, sometimes at pretty low power levels. It's not that 4 ohms is "bad", it just happens that 8 ohms is the most common standard for low cost home audio.

Note that most big-ass power amps are designed to drive a 4 ohm load happily all day, and are usually factory rated for something like 50% more power into 4 ohms than into 8 ohms. Denon and HK receivers also seem to have no problem driving 4 ohm loads -- they do NOT carry factory ratings for 4 ohms, but anecdotal evidence suggests they can deliver 30-50% more power into a 4 ohm load.

