Hi RatPack,

It's really complicated, and there is no direct comparison between the volume dB display on your receiver, which refers to electrical changes in dB (it takes 10 times as much amplifier power in watts for a volume increase to subjectively sound "twice as loud"), and the actual acoustic Sound Pressure Level (dB SPL) at your listening area as measured by the Rat Shack SPL meter.
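If it helps to see the arithmetic behind that, here is a quick sketch in Python (the function name is just illustrative, not anything from your receiver's manual):

```python
import math

def power_change_db(power_ratio: float) -> float:
    """Change in electrical level, in dB, for a given ratio of amplifier power."""
    return 10 * math.log10(power_ratio)

# Doubling amplifier power is only about a 3 dB electrical change...
print(round(power_change_db(2), 2))   # 3.01
# ...while "twice as loud" subjectively takes roughly +10 dB,
# which means ten times the power in watts:
print(power_change_db(10))            # 10.0
```

That 10x-power-for-twice-as-loud relationship is why a big jump in watts buys less extra loudness than you'd expect.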

If engineers were observing strict theory, they would calibrate the volume dB display (assuming speakers of a given sensitivity) so that when you cranked up the control to the "0 dB" setting, the amplifiers would be ready to clip. The amplifiers would be producing their maximum rated output just before distortion set in. But loudspeakers vary considerably in their rated sensitivity (how efficiently they convert electrical watts to acoustic output), which will affect how much power is drawn from the receiver.

You can't compare the dB readouts from one brand of receiver to the next. There is a consensus, however, that most engineers calibrate the volume display so that the amplifiers will be nearing their maximum output somewhere between -10 dB and +5 dB on the display. What that translates to in actual volume levels in acoustic dB (C weighting scale, measured on the Rat Shack dB meter at your listening seat in your particular room) will depend on a host of variables:

1) the rated sensitivity of the speakers;
2) the absorptive or reflective qualities of the room and its furnishings;
3) the physical dimensions of the room;
4) how far away from the speakers you are sitting;
5) how many speakers are operating (just stereo, 5.1, 7.1) from the same A/V receiver;

and other variables too numerous to go into.
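To give a feel for how the first and fourth of those variables interact, here is a rough free-field estimate in Python. It deliberately ignores room reflections, multiple speakers, and everything else on the list, so treat the numbers as ballpark only; the function name and the example figures are mine, not a standard:

```python
import math

def estimated_spl(sensitivity_db: float, watts: float, distance_m: float) -> float:
    """Rough free-field SPL from a single speaker: rated sensitivity
    (dB at 1 watt / 1 meter) plus 10*log10 of the power driving it,
    minus 20*log10 of the listening distance in meters."""
    return sensitivity_db + 10 * math.log10(watts) - 20 * math.log10(distance_m)

# A 90 dB/1W/1m speaker driven with 100 watts, heard from 3 meters away:
print(round(estimated_spl(90, 100, 3), 1))  # 100.5 dB SPL -- very loud
```

In a real room, reflections typically add a few dB back, which is one reason the same receiver setting sounds different from one room to the next.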

In the days of analog recording and broadcasting, "0 dB" or "+3 dB" indicated when the signal being recorded (or broadcast) was at near-distortion levels. Analog tape recorders "clipped" fairly softly, so you could push recording peaks beyond 0 dB to +3 dB or more without gross audible distortion. The "0 dB" reading was a means of standardizing recording levels on tape among different studios, and also of preventing radio and TV station transmitters from overmodulating (distorting).

But digital recorders clip in a nasty fashion and you must never let the recording signals get so loud that they go over 0 dB on the input meters. Even so, it still happens.
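The nastiness is easy to picture: a digital recorder simply has no numbers above full scale, so anything over 0 dB gets its top sliced flat. A minimal sketch (the function name and sample values are just for illustration):

```python
def hard_clip(samples, full_scale=1.0):
    """Digital clipping: any sample beyond full scale (0 dB on the input
    meters) is flattened outright, which is what makes digital overs
    sound so harsh compared with analog tape's gentle saturation."""
    return [max(-full_scale, min(full_scale, s)) for s in samples]

# A peak that exceeds full scale gets its top chopped square:
print(hard_clip([0.5, 0.9, 1.3, 0.9, 0.5]))  # [0.5, 0.9, 1.0, 0.9, 0.5]
```

That squared-off waveform is rich in harsh harmonics, unlike tape, which rounds the peak over gradually.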

I hope this is somewhat clearer. If you have everything set up OK with all your relative channel levels, and you are using speakers of average or better rated anechoic sensitivity (89 or 90 dB/1 watt/1 meter), then you can assume that by the time your volume dB display gets to the -10 dB setting or higher, things will be getting very, very loud and the amplifiers may be nearing their full rated output on peaks.

Finally, if your receiver display indicates "-30 dB", you can assume that you are quite a ways from pushing the receiver's amplifiers into distortion.

Regards,


Alan Lofft,
Axiom Resident Expert (Retired)