An induction machine isn't a constant-power device; its power rating is specified at a specific voltage and frequency. Decrease the voltage and you decrease the power.
It's a common misconception, from looking at the basic power equation P = V × I, that a device with a given power rating will somehow balance this equation and draw its rated power at any voltage. It's the same misconception that a 100 W light bulb will always draw 100 watts, when in fact power ratings apply at a specific voltage and frequency. At the most basic level we can remember that most loads behave as a roughly fixed impedance.
Transpose Ohm's law (V = I × R) to get I = V / R, then substitute into the basic power equation P = V × I to give P = V² / R, which demonstrates how the power decreases with the square of the voltage.
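The numbers are easy to check. Here is a minimal Python sketch of the fixed-resistance case; the 529 Ω value is my own illustrative assumption (a nominal 100 W bulb at 230 V), not a figure from the question:

```python
# Power drawn by a fixed resistance at nominal vs reduced voltage.

def power(voltage, resistance):
    """P = V^2 / R, from substituting I = V/R into P = V * I."""
    return voltage ** 2 / resistance

R = 529.0              # ohms: assumed resistance of a 100 W bulb at 230 V (230^2 / 100)
print(power(230, R))   # 100.0 W at rated voltage
print(power(220, R))   # ~91.5 W at reduced voltage: falls with the square of V
```

A ~4 % voltage drop gives a ~8.5 % power drop, which is exactly the square-law behaviour described above.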
This is the sales pitch for voltage optimizers and in fact the reason that they save energy.
That is looking at things in the most basic of ways. I have done a lot of work with the dynamic modeling of induction machines, and the dq equations for power and electromagnetic torque clearly show how both will decrease if the voltage decreases; check them out!
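For reference, the standard textbook dq-frame expressions (general forms, not copied from any particular model) are:

```latex
% Input power and electromagnetic torque in the dq frame.
% P is the number of poles; lambda terms are flux linkages.
P_{in} = \tfrac{3}{2}\left(v_{ds} i_{ds} + v_{qs} i_{qs}\right)
\qquad
T_e = \tfrac{3}{2}\,\frac{P}{2}\left(\lambda_{ds} i_{qs} - \lambda_{qs} i_{ds}\right)
```

Since the steady-state flux linkages scale with the applied voltage (roughly λ ≈ V/ω), lowering the stator voltage pulls both expressions down.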
How much current an induction machine draws is indeed a complex (pun intended!) question: applying load torque will increase the input current, and vector drives work like magic.
There is no doubt that decreasing the stator voltage while keeping all other parameters constant will decrease the speed, the line currents and hence the power drawn. It was an old method of speed control using variacs, and even enormous resistors, once upon a time!
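The steady-state equivalent circuit tells the same story. A rough Python sketch, with made-up machine parameters and the magnetising branch neglected (so the stator and rotor branch currents are equal), shows current falling linearly with voltage and torque falling with its square at a fixed slip:

```python
import math

def stator_current(v_phase, slip, r1=0.5, x1=1.5, r2=0.6, x2=1.5):
    """Current magnitude from the series equivalent circuit (magnetising branch neglected)."""
    z = math.hypot(r1 + r2 / slip, x1 + x2)
    return v_phase / z

def torque(v_phase, slip, ws=2 * math.pi * 50, r1=0.5, x1=1.5, r2=0.6, x2=1.5):
    """3-phase electromagnetic torque; ws is synchronous speed (2-pole, 50 Hz assumed)."""
    i2 = stator_current(v_phase, slip, r1, x1, r2, x2)
    return 3 * i2 ** 2 * (r2 / slip) / ws

s = 0.04
print(stator_current(230, s))               # current at rated phase voltage
print(stator_current(207, s))               # 10 % less voltage -> 10 % less current
print(torque(230, s) / torque(207, s))      # ratio = (230/207)^2, about 1.235
```

All parameter values here are illustrative assumptions, but the scaling is general: at a given slip, current goes as V and torque as V².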
I have also done a lot of practical work with induction machines. I have a good laboratory setup at home where I have done lots of what you would call amazing practical work testing speed observers and different modulation strategies, and I know without doubt that if I decrease the DC bus voltage on a V/Hz drive, the speed will decrease and the line current will decrease too.