Why is the generator voltage 11 kV at 50 Hz, or 13.8 kV at 60 Hz?

The real reason behind the choice of the 11, 22, 33, 66 and 132 kV levels is not clear from the responses to date, and it may have passed into history unknown. Most likely it was a decision arising from a standards committee, perhaps an optimum transformation ratio chosen to minimize total losses. Someone still alive may yet know the reason. Then again, it may have a legacy resembling that of the standard rail gauge.

These numbers were chosen somewhat arbitrarily in the past and have since evolved into industry standards, helped by the fact that the voltage levels are integral multiples of 11, which simplifies the voltage transformation ratios. In the American standard, 13.8 kV, 34.5 kV, 69 kV and 138 kV have likewise evolved as standard voltage levels, much as 12 inches to the foot and 3 feet to the yard evolved into standard units of measure.

To put this issue in proper perspective, the choice of generation voltage will always depend on the dominant standard adopted by a particular country or region. For Europe, 11 kV and 50 Hz are the dominant standard generator voltage and frequency; for North America, they are 13.8 kV and 60 Hz. However, large generator manufacturers can design and build a wide range of generator voltages (11 kV to 28 kV) at either standard frequency (50 Hz or 60 Hz), based on existing generator design, construction and insulation technology.

It is cheaper to generate at a relatively low voltage and then step it up for transmission; hence most power generating plants are designed to operate at 11 kV. To generate at 33 kV directly, the generator might need to be twice the size of an 11 kV machine, so it is better to use multi-stage step-up for transmission if need be.
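The economic case for stepping up rests on simple arithmetic: for a fixed power, doubling the voltage halves the line current, and resistive losses scale with the square of the current. The sketch below uses illustrative numbers (100 MW delivered, 0.5 Ω of line resistance, power factor ignored) that are assumptions for the example, not figures from the article:

```python
# Compare I^2*R line losses for the same delivered power at two voltages.
# All numbers are illustrative assumptions, not values from the article.
P = 100e6   # delivered power, watts
R = 0.5     # total line resistance, ohms

for v_kv in (11, 132):
    V = v_kv * 1e3
    I = P / V          # line current (single-phase, unity power factor assumed)
    loss = I**2 * R    # resistive loss dissipated in the line
    print(f"{v_kv:>4} kV: I = {I/1e3:6.2f} kA, loss = {loss/1e6:7.3f} MW")
```

Stepping up from 11 kV to 132 kV (a factor of 12) cuts the loss by a factor of 144, which is why generation stays low while transmission goes high.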

One plausible explanation I have encountered is that the apparently standardized voltage values of 3.3 kV, 6.6 kV, 11 kV, etc. are not necessarily, or intended to be, multiples of 11; any fortuitous relationship to form factor is purely coincidental. These values have their basis in the formative years of the electrical supply industry, when the rounder figures of 3 kV, 6 kV and 10 kV were adopted. During that era it was pessimistically assumed that the transmission lines would drop around 10% of the input voltage through losses in the cable. To compensate, the primary generation voltage was set to the required nominal voltage plus the expected transmission loss, e.g. 3,000 V + 300 V = 3.3 kV, and hence the off-load generated voltages became 3, 6 and 10 kV + 10%.
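The "+10%" rule above can be checked in one line: applying the margin to each of the old round nominal values reproduces the familiar series exactly.

```python
# Apply the 10% off-load margin described above to the old round
# nominal voltages (kV) and recover the familiar 3.3 / 6.6 / 11 series.
nominal_kv = [3, 6, 10]
generated_kv = [round(v * 1.10, 1) for v in nominal_kv]
print(generated_kv)  # [3.3, 6.6, 11.0]
```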

11 kV (50 Hz) is the dominant generator voltage in Europe, while 13.8 kV (60 Hz) is the dominant generator voltage in America. Large generator manufacturers such as General Electric can design and build generators with voltages ranging from 11 kV to 28 kV and a frequency of either 50 Hz or 60 Hz.
The choice of generation voltage has nothing to do with form factor; it depends on the requirements or dominant standard adopted by a given country or region. Likewise, transmission voltage depends on the standard voltages adopted in that country or region: it could be 138 kV, 230 kV or 500 kV in the USA, 400 kV in the EU, or 800 kV in South Africa. Together, the choice of generation voltage and transmission voltage determines the voltage ratio of the generator step-up (GSU) transformer.
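The GSU ratio follows directly from the two chosen voltages. As a sketch, pairing the standard generator voltages from this article with matching transmission levels (the pairings themselves are illustrative assumptions) gives:

```python
# Hypothetical GSU (generator step-up) pairings: generator kV -> grid kV.
# The pairings are illustrative; actual plant designs vary.
pairings = [(13.8, 138), (11.0, 132)]

for gen_kv, grid_kv in pairings:
    ratio = grid_kv / gen_kv   # nominal step-up voltage ratio
    print(f"{gen_kv} kV -> {grid_kv} kV : step-up ratio {ratio:.1f}:1")
```

So a North American 13.8 kV machine feeding a 138 kV grid needs a 10:1 GSU, while an 11 kV machine feeding 132 kV needs 12:1.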
