Why are higher frequencies used rather than 50Hz & 60Hz?

Thomas Edison, the most prominent early figure in electric power, actually pushed direct current (DC), not alternating current (AC). If I remember my history right, 60Hz was chosen as a frequency that would produce no visible lamp flicker. There were (and still remain) 25Hz, 50Hz, 133Hz, & 400Hz power systems.

The initial investment in one frequency and in early generators (hydro, i.e. low speed and low frequency) would have made any sudden change very costly. But some of the reasons have to do with lamp flicker (which frequency gives you continuous-looking light?), commutator AC motors (rapid changes in current direction causing inductance issues), etc. Modern technology has now provided solutions to some of these problems, at least for switching between 50Hz and 60Hz, or even 400Hz.

There is nothing sacred about 60Hz. I have always wondered why there are 60 seconds per minute and not 100. Like many things, a selection is made and, like the QWERTY keyboard, it stays with us forever after.

Higher-frequency power systems are used on aircraft (400Hz is the standard). At a higher frequency a transformer needs less core flux for the same voltage, so it stays further from saturation and can use less iron, making it lighter.
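The weight saving follows from the transformer EMF equation, E = 4.44 f N B A: for a fixed voltage and turns count, the required peak flux (and roughly the iron cross-section) scales as 1/f. A minimal sketch of that ratio (the 60Hz/400Hz figures are just the frequencies discussed above):

```python
def core_flux_ratio(f_base, f_new):
    """From E = 4.44 * f * N * B_max * A_core: for fixed voltage E and
    turns N, the product B_max * A_core scales as 1/f, so this ratio
    approximates how much core flux (and roughly iron) the higher
    frequency needs relative to the lower one."""
    return f_base / f_new

ratio = core_flux_ratio(60.0, 400.0)
print(f"A 400Hz transformer needs about {ratio:.0%} of the core flux of a 60Hz one")
```

This is only a first-order scaling argument; real designs also trade off core losses, which rise with frequency.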

For domestic systems, higher frequencies are not used because they would require generators to run at much higher speeds or to carry more rotor poles - both expensive propositions.
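The speed/pole trade-off comes from the synchronous-machine relation n = 120 f / p (speed in rpm, frequency in Hz, p poles). A quick sketch with illustrative numbers:

```python
def synchronous_speed_rpm(freq_hz, poles):
    """Synchronous speed of an AC machine: n = 120 * f / p."""
    return 120.0 * freq_hz / poles

# A 2-pole generator spins at 3600 rpm for 60Hz output,
# but would need 24000 rpm for 400Hz - or many more poles
# to stay at a mechanically practical speed.
print(synchronous_speed_rpm(60, 2))    # 3600.0 rpm
print(synchronous_speed_rpm(400, 2))   # 24000.0 rpm
print(synchronous_speed_rpm(400, 16))  # 3000.0 rpm
```

Either option (very high shaft speed, or a rotor with many more poles) drives up cost, which is the point made above.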

For the case of a 50Hz motor run on 60Hz, looking at the equivalent circuit it can be seen that current (and hence torque) will be lower, because winding impedance rises with frequency, while no-load speed will be higher. The torque loss can be compensated for by increasing the voltage to restore the current. Smaller motors often carry dual 50Hz/60Hz ratings on the nameplate.
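A minimal sketch of both effects, assuming example numbers (a 400V/50Hz rating and a 0.1H winding inductance are hypothetical, not from any specific motor): reactance scales with frequency as X = 2πfL, and raising voltage in proportion to frequency keeps the volts-per-hertz ratio, and so the air-gap flux and torque capability, roughly constant.

```python
import math

def reactance(freq_hz, inductance_h):
    """Inductive reactance X = 2 * pi * f * L."""
    return 2 * math.pi * freq_hz * inductance_h

def vf_compensated_voltage(v_rated, f_rated, f_new):
    """Voltage needed at a new frequency to keep the volts-per-hertz
    ratio (and hence air-gap flux and torque capability) constant."""
    return v_rated * f_new / f_rated

L = 0.1  # hypothetical winding inductance, henries
x50 = reactance(50, L)
x60 = reactance(60, L)
print(x60 / x50)                              # reactance up ~20% at 60Hz
print(vf_compensated_voltage(400, 50, 60))    # 480V keeps V/f constant
```

This is the same constant-V/f rule that variable frequency drives apply automatically below their base frequency.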

Note that even low-cost, garden-variety variable frequency drives typically provide some speed control above rated frequency, so running a 50Hz motor at 60Hz, 75Hz, or even 100Hz is not unheard of.
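Above the rated frequency the drive cannot raise voltage any further, so V/f and flux fall and torque capability drops off. A simplified torque-envelope model (a textbook constant-power approximation, not the behaviour of any particular drive):

```python
def torque_capability(f_hz, f_base=50.0):
    """Simplified torque envelope of a VFD-driven motor:
    full rated torque up to the base frequency, then roughly
    constant power (torque falling as f_base / f) in the
    field-weakening region, since voltage is capped at rated."""
    if f_hz <= f_base:
        return 1.0
    return f_base / f_hz

for f in (50, 60, 75, 100):
    print(f"{f}Hz: ~{torque_capability(f):.0%} of rated torque")
```

So a 50Hz motor at 100Hz runs at roughly double speed but with only about half the torque available, which is why over-frequency operation suits fans and pumps better than constant-torque loads.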
