What is % impedance of transformer?

The % impedance is formally referred to as impedance voltage. It is the supply voltage, expressed as a % of rated voltage, that is required to circulate rated current through the transformer.

It is measured in the factory by a short circuit test at rated frequency. With the low-voltage winding shorted, the supply voltage to the high-voltage winding is increased until rated current flows in the transformer. The supply voltage magnitude that results in rated current is then referred to as the impedance voltage. When it is divided by the rated voltage of the transformer, it becomes the %-impedance voltage, or more commonly, the %-impedance.

For example, if you have a transformer with a positive sequence impedance of 8% stamped on the nameplate, this means that 8% of the rated winding voltage is required to produce rated transformer current. If the high-voltage winding is rated at 220 kV, then 0.08*220 kV = 17.6 kV is required to produce rated full-load current when the low-voltage winding is shorted. This test is typically conducted under normal cooling conditions (i.e., natural air and oil; no forced air fans or forced oil pump cooling).
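The arithmetic above can be sketched as a short Python snippet. The 8% impedance and 220 kV rating are the figures from the example; the function name is just for illustration.

```python
# Sketch: impedance voltage from the nameplate %Z.
# Uses the article's example values (8% Z, 220 kV HV winding).

def impedance_voltage_kv(percent_z: float, rated_kv: float) -> float:
    """Supply voltage (kV) needed to circulate rated current
    through the transformer with the other winding shorted."""
    return (percent_z / 100.0) * rated_kv

print(impedance_voltage_kv(8.0, 220.0))  # 17.6 kV, as in the example
```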

The positive sequence impedance is determined by a three-phase short circuit test, and the zero sequence impedance is determined by a single-phase short circuit test.

The %-impedance is also a good indicator of how a transformer will fit into a soft or stiff system, in terms of fault level and voltage regulation. A weak (high-impedance) system is vulnerable to voltage regulation problems, and installing a high-%Z transformer will make that situation worse. We used to stipulate low-%Z transformers (2-3%) at the remote ends of long distribution systems; otherwise starting large motors was problematic because of the voltage drop. Conversely, on very stiff systems with low impedance and high prospective short-circuit current (PSCC), high-%Z transformers are useful for reducing fault levels and the associated PSCC to within the design ratings of the system switchgear.
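The trade-off described above can be illustrated numerically. This is a simplified sketch that assumes an infinitely stiff upstream source (zero source impedance) and treats %Z as the only limit on fault current; the 10 MVA rating and the two %Z values are hypothetical, not from the article.

```python
# Illustrative trade-off: low %Z gives small voltage drop but high
# fault level; high %Z limits fault level at the cost of regulation.
# Assumes zero source impedance; values below are hypothetical.

def fault_mva(rated_mva: float, percent_z: float) -> float:
    """Approximate secondary fault level for a bolted three-phase
    fault, neglecting upstream source impedance."""
    return rated_mva * 100.0 / percent_z

def max_voltage_drop_percent(percent_z: float, load_pu: float = 1.0) -> float:
    """Crude worst-case % voltage drop at the given per-unit load,
    treating the whole %Z as contributing to the drop."""
    return percent_z * load_pu

for z in (2.5, 8.0):
    print(f"%Z = {z}: fault level ~ {fault_mva(10.0, z):.0f} MVA, "
          f"worst-case full-load drop ~ {max_voltage_drop_percent(z):.1f} %")
```

Note how halving-and-more of %Z (8% down to 2.5%) more than triples the available fault MVA, which is exactly the cost-benefit balance the article describes.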

Transformer, FAQ



2/13/2020 6:08 AM
I'm just wondering why they call it percentage impedance in the first place. Terms such as 'impedance voltage' seem absolutely ridiculous. I can understand defining an impedance based on the percentage of primary voltage that causes rated secondary current to flow, but I seriously doubt that this percentage should be called a percentage impedance.
10/8/2016 7:23 AM
In general terms, not only for generator transformers, transformers for industrial purposes have a design percentage impedance which can vary according to each specification. In my case, I always analyze the influence of this impedance on the starting of large motors across the line, to verify whether the voltage drops are high enough to prevent the motor from starting successfully. In many situations, among other steps, decreasing the transformer impedance is highly desirable for this purpose. On the other hand, there is an increase in the short-circuit level which may affect the equipment. It is a cost-benefit analysis.

In Brazil, we had an old code, no longer valid, which defined "recommended" transformer impedances according to power rating in kVA, as a way to avoid extra design costs. Such "normal" impedances can also be found as default values in software programs, but the values vary by transformer design and country. There is another "recommended" impedance table by kVA for dry-type transformers, which have slightly higher impedance than oil transformers of the same kVA, due to the larger internal distances between windings required in air or other insulation media and the consequent dispersion of magnetic flux, which increases the transformer reactance.
10/8/2016 7:08 AM
I have come across different percentage impedances for transformers according to the application. For example, GTs (generator transformers) always seem to have higher percentage impedance. Why? Is it to limit the fault current?