Difference between Current Transformer and Voltage Transformer

Transformers are essentially two coils with different numbers of turns linking a common time-varying flux, so that different voltages can appear on the two sides. Two broad applications arise:

1. Using a transformer to drive a load with a voltage rating different from that available at the source. This is power transformer action.

2. Using it for instrumentation, i.e., measuring currents of hundreds or thousands of amperes with a low-range ammeter. The instrument (say, an ammeter) is expected to draw essentially no power through it. The construction of such a current transformer (CT) is therefore special: a thick conductor with a small number of turns (even a single turn, or a bar) on the high-current side, and many more turns on the other side. The design is optimized for accuracy rather than power transfer. A similar argument applies to voltage transformers (VTs).

Because of this special construction, a CT secondary must always be kept short-circuited (as if through an ammeter); an open secondary can develop dangerously high voltages. Otherwise innocuous CTs and VTs for HV applications (hundreds of kV) are heavily shrouded by insulator bushings (to isolate them from surroundings, earth, other equipment, and working personnel), which occupy much more space than the skeleton transformer itself.
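The instrumentation idea above can be sketched numerically. Assuming an ideal CT governed by ampere-turn balance (N1·I1 = N2·I2), a bar-primary CT (one primary turn) with 200 secondary turns gives a 1000:5 ratio, so a 5 A ammeter can read a 1000 A line. The values below are illustrative, not from any particular device.

```python
def ct_secondary_current(i_primary, n_primary, n_secondary):
    """Ideal CT: ampere-turn balance N1*I1 = N2*I2."""
    return i_primary * n_primary / n_secondary

# Bar primary (1 turn), 200 secondary turns -> 1000:5 ratio.
i_sec = ct_secondary_current(1000.0, 1, 200)
print(i_sec)  # 5.0 -> a 5 A ammeter reads a 1000 A primary
```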

Current transformers have two functions:

  1. Step current down for metering; a burden resistor across the secondary converts the scaled current into a measurable voltage.
  2. Step current up for high-current applications (such as testing current transformers, furnace applications, desalination plants, etc.).

For case 1), the flux density is kept low to keep the error voltage within acceptable limits. The transformer's self-impedance (magnetizing impedance) is kept high relative to the burden, which further reduces the error current shunted through the magnetizing branch (Ix). Ampere-turn balance, NI(in) = NI(out), then holds to the accuracy of the design.
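The shunting effect can be sketched as a simple current divider: the ideal secondary current splits between the magnetizing branch (Zm) and the burden (Zb), so keeping Zm much larger than Zb keeps the error small. The impedance values here are illustrative assumptions, not from any standard.

```python
def burden_current_fraction(z_magnetizing, z_burden):
    """Fraction of the ideal secondary current that reaches the burden
    when the magnetizing branch shunts part of it (current divider)."""
    return z_magnetizing / (z_magnetizing + z_burden)

# High Zm relative to the burden: nearly all current reaches the meter.
print(burden_current_fraction(500.0, 0.5))  # ~0.999 (0.1% ratio error)
# Low Zm: a noticeable fraction is shunted, so the reading runs low.
print(burden_current_fraction(10.0, 0.5))   # ~0.952 (nearly 5% error)
```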

For case 2), flux density is not as critical and can be run up near material limits, since you only need enough secondary voltage to force the required current through the load.
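The sizing logic for case 2) is just Ohm's law: the secondary voltage requirement is set by the target current and the load impedance, which is why very low voltages suffice for huge currents into near-short loads. The figures below are illustrative.

```python
def required_secondary_voltage(i_target, z_load):
    """Secondary voltage needed to force i_target through z_load (Ohm's law)."""
    return i_target * z_load

# Forcing 2000 A through a 0.005-ohm furnace/test loop needs only ~10 V.
print(required_secondary_voltage(2000.0, 0.005))  # 10.0
```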

The voltage transformer:
Is usually designed with a higher flux density for the input voltage in order to reduce size, because the customer never gives you the amount of space that you want. The flux density is limited by the material type and, inevitably, by the core watt loss that still yields an acceptable temperature rise.

On a voltage transformer, too low a burden impedance (which is the same as too much load) will cause the voltage to dip. IEEE defines burden classifications (W, X, Y, Z, ZZ, M) that specify different amounts of VA, and a VT is specified for an accuracy class that can be achieved at any of the burdens that may be expected (as specified). For example, a 0.3 W, X, Y and 0.6 Z, ZZ unit provides 0.3 accuracy-class performance at W, X, and Y loadings but 0.6 at Z or ZZ loadings. The manufacturer may provide graphs showing the ratio correction factor and phase angle under the burden test conditions.
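Reading such a rating can be sketched as a simple lookup. The class mapping below mirrors the "0.3 W, X, Y and 0.6 Z, ZZ" example in the text; the VA figures follow the commonly cited IEEE C57.13 standard burden designations, but verify them against the standard for real work.

```python
# IEEE C57.13 standard burden VA values (commonly cited figures).
STANDARD_BURDEN_VA = {"W": 12.5, "X": 25, "Y": 75, "Z": 200, "ZZ": 400}

# Hypothetical VT rated "0.3 W, X, Y and 0.6 Z, ZZ" (from the text's example).
RATED_CLASS = {"W": 0.3, "X": 0.3, "Y": 0.3, "Z": 0.6, "ZZ": 0.6}

def accuracy_class(burden_designation):
    """Accuracy class (max % ratio-error band) at the given standard burden."""
    return RATED_CLASS[burden_designation]

print(accuracy_class("Y"))   # 0.3
print(accuracy_class("ZZ"))  # 0.6
```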

By basic definition there is no difference between voltage and current transformers except purpose. For a designer, points 1) and 2) above apply.
