Hello delbianco!

It's really a shame that audio equipment manufacturers can't (or, more precisely, WON'T) conform to a universal standard when it comes to rating amplifier power (and many other specs as well).

One of the best ratings, which many manufacturers have used over the years, gives continuous amplifier power output into a specified load (usually 8 ohms), over a specified frequency range (normally 20 Hz - 20 kHz, +/- 3 dB [and the +/- part is very important]), with a maximum specified harmonic distortion level (such as no more than 0.5% or 0.08% THD).

If you can find such ratings, it's easier to compare amp to amp. Unfortunately, some manufacturers choose to over-emphasize power at the expense of load rating (impedance), distortion, or frequency response, and that makes it hard for the consumer to make fair comparisons!

Every manufacturer faces the same cost/power limitations in design and construction. These days it is fairly simple to design and build a reasonably priced, good-performing amplifier up to about 100 W per channel. Once you go over that magic number of 100, costs increase drastically for minor increases in power. Given the logarithmic relationship between SPL and power output, your best bang for the buck is more efficient speakers driven by an amp in the 70 W - 100 W range, rated something like this:
Power output: 90 W per channel (continuous or RMS) into 8 ohms, 20 Hz - 20 kHz +/- 3 dB, with no more than 0.05% THD.
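To put a number on that "logarithmic" point, here's a minimal sketch (the function name and wattage figures are just illustrative) of the standard dB formula for a power ratio, 10 * log10(P2/P1):

```python
import math

def spl_gain_db(p_new_watts, p_ref_watts):
    """Change in output level (dB) when amplifier power changes,
    assuming the same speaker. Standard formula: 10 * log10(P2/P1)."""
    return 10 * math.log10(p_new_watts / p_ref_watts)

# Doubling amplifier power from 100 W to 200 W buys only about 3 dB:
print(round(spl_gain_db(200, 100), 1))  # ~3.0 dB
```

So a big (and expensive) jump in wattage yields a surprisingly small jump in loudness, while a speaker that is 3 dB more efficient gets you the same gain for free.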

If the 90-watt amp above is well designed, it will put out more watts into lower impedance loads (i.e. 6 or 4 ohms), but usually with a higher THD rating, such as 0.5%.
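Here's a rough sketch of why that happens, assuming an idealized amplifier that behaves as a constant voltage source (real amps fall short of this because of power-supply and output-current limits, so treat the result as an upper bound; the 90 W figure is just the example rating above):

```python
def ideal_power_into_load(p_8ohm_watts, load_ohms):
    """Upper-bound power into a given load, scaling the 8-ohm rating.
    From P = V^2 / R: at fixed output voltage, power rises as
    impedance falls. Real amplifiers deliver less than this."""
    return p_8ohm_watts * 8 / load_ohms

# An ideal 90 W (8-ohm) amp would double into 4 ohms:
print(ideal_power_into_load(90, 4))  # 180.0 (ideal; real amps deliver less)
```

A well-designed amp gets closer to this ideal doubling; a weaker power supply means the 4-ohm number sags well below it, which is why the low-impedance rating is worth checking.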

Bottom line: when comparing the amplifier portion of receivers or integrated amps, always compare oranges to oranges.

Good Luck!

Randyman