The gain settings on many amps tend to confuse novice users. The long and short of it is that at some point, amp designers realized that when an amp reached rated output power with less drive signal (the level from your mixer), users assumed it had more power and output.

The input attenuator has always been there because it lets you tune the system's dynamic range for the best signal-to-noise ratio. With a high input sensitivity, any little bit of noise at the input becomes a lot of noise at the output of the amp. If you turn the attenuator down, the noise floor goes down too; the potential output of the amplifier is unchanged, it just takes that much more drive level to reach full output. With a low-sensitivity input, you inherently have a lower noise floor. You will have to send a higher drive level to reach full output, but as a rough analogy it isn't much different from turning down the attenuator.

The problem is that not all mixers and sources have enough output to drive a low-sensitivity amp to full output. Some mixers can only produce +12 dBu before they clip, so if the amplifier requires +18 dBu to achieve full rated power, you will clip the mixer before the amp is clipping. Conversely, if you have a mixer capable of +24 dBu of output, an amp with a high input sensitivity will be clipping LONG before the mixer does. This is why many amps come with adjustable input sensitivity settings: you can adjust the amp to work best for your needs and situation.
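If it helps, the headroom arithmetic above is just a subtraction. Here's a minimal sketch using the numbers from the examples (the function name and structure are mine, not from any particular amp spec):

```python
def headroom_db(mixer_max_dbu, amp_sensitivity_dbu):
    """Headroom in dB between the mixer's maximum output and the
    drive level the amp needs for full rated power.
    Negative means the mixer clips before the amp reaches full output."""
    return mixer_max_dbu - amp_sensitivity_dbu

# Mixer clips at +12 dBu, amp needs +18 dBu: mixer runs out 6 dB early
print(headroom_db(12, 18))   # -6

# Mixer good to +24 dBu, amp needs +18 dBu: 6 dB of spare drive,
# so a high-sensitivity amp clips well before the mixer does
print(headroom_db(24, 18))   # 6
```

So a positive number means the amp (not the mixer) is the first thing to clip, and that's when backing off the amp's input attenuator buys you a better noise floor without giving up output.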