and also a sensitivity adjustment, so there is now an option for a more normal amount of gain to drive them to full output, rather than the "pro input level" they previously required (26 dB of gain instead of the more common 32 dB). It only took four years to fix the sensitivity issue.
I wonder how many sales they lost because they sounded 6dB quieter than all of their competitors in GC side-by-side comparisons.
Several decades ago, there was a large music store in the DC area that was convincing musicians that Brand X amplifier watts were louder than Brand Z amplifier watts.
They would prove it as well.
Let's say both amplifiers were rated at 500 watts/channel (a common range back then).
If you fed the same signal into each amp and listened on the same speaker, Brand X would be louder than Brand Z. So it had "more powerful watts," at least in the simple minds of many musicians.
Of course, the real reason was input sensitivity. It simply took less input level to drive Brand X to full output.
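A quick sketch of the arithmetic behind that trick. All the numbers here are made up for illustration (the gains borrow the 26 dB vs. 32 dB figures mentioned earlier, and the clipping point is arbitrary): at the same source level, the higher-gain amp simply plays louder, regardless of its power rating.

```python
def output_dbu(input_dbu: float, gain_db: float, max_out_dbu: float) -> float:
    """Output level for a given input, clipped at the amp's maximum output."""
    return min(input_dbu + gain_db, max_out_dbu)

# Two hypothetical amps with identical 500 W ratings but different voltage gain.
# "Brand X": 32 dB gain; "Brand Z": 26 dB gain (pro-level sensitivity).
SAME_INPUT = -10.0   # dBu, one source feeding both amps
MAX_OUT = 30.0       # dBu, arbitrary clipping point for the sketch

x = output_dbu(SAME_INPUT, 32.0, MAX_OUT)
z = output_dbu(SAME_INPUT, 26.0, MAX_OUT)
print(f"Brand X output: {x:+.0f} dBu")   # +22 dBu
print(f"Brand Z output: {z:+.0f} dBu")   # +16 dBu
print(f"Apparent loudness advantage: {x - z:.0f} dB")  # 6 dB
```

A fair comparison would match the input levels (or drive both amps to just below clipping) before listening; the store's demo skipped that step.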
Back then (it has changed now), I used to fix a number of Brand X amplifiers, and none of them ever met spec. So in reality, Brand X watts were "less powerful" than Brand Z's. But the way the test was set up, it easily convinced many.
Also, the store made higher profits on the sale of Brand X, so of course they pushed them any way they could.
I have seen the same type of faulty test used to claim that Brand B's console had less noise than Brand A's.
They would "prove it" by turning up all the gains and faders and simply listen to the noise.
Never mind that the console producing the greater hiss ALSO had more gain; that little fact was conveniently overlooked.
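The same arithmetic exposes the console comparison. Referring the noise back to the input (output noise minus gain, the usual equivalent-input-noise figure) is what makes the test fair. The brand names and every number below are hypothetical, chosen only to show how the console with the louder hiss can actually be the quieter design:

```python
# Hypothetical consoles: "Brand A" hisses more at the output but has more gain.
consoles = {
    "Brand A": {"output_noise_dbu": -60.0, "gain_db": 70.0},
    "Brand B": {"output_noise_dbu": -64.0, "gain_db": 60.0},
}

for name, c in consoles.items():
    # Equivalent input noise: output noise referred back through the gain.
    ein = c["output_noise_dbu"] - c["gain_db"]
    print(f"{name}: {c['output_noise_dbu']:.0f} dBu of hiss at "
          f"{c['gain_db']:.0f} dB gain -> EIN {ein:.0f} dBu")
# Brand A: EIN -130 dBu; Brand B: EIN -124 dBu.
# Brand A sounds noisier in the showroom yet is 6 dB quieter where it counts.
```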
But if you don't understand what is going on, it is easy to be fooled.