Q: Why is the rating of a transformer taken in kVA?

The rating of a transformer is given in kVA, not kW, because its copper loss depends on current and its iron loss depends on voltage. Total transformer loss therefore depends on the volt-ampere (VA) product and not on the phase angle between voltage and current; that is, it is independent of the load power factor. That is why the rating of a transformer is in kVA and not in kW.
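A minimal numerical sketch of this point (the winding resistance, iron-loss constant, voltage and current below are illustrative assumptions, not nameplate data): copper loss varies with the square of the current and iron loss with the square of the voltage, so sweeping the load's phase angle while holding |V| and |I| fixed leaves the total loss unchanged.

```python
import math

def transformer_losses(v_rms, i_rms, r_winding=0.05, k_iron=1e-4):
    """Total loss from |V| and |I| alone; the load phase angle never enters."""
    copper_loss = i_rms ** 2 * r_winding   # depends only on current magnitude
    iron_loss = k_iron * v_rms ** 2        # depends only on voltage magnitude
    return copper_loss + iron_loss

# Same |V| and |I| at three different load power factors:
for pf in (1.0, 0.8, 0.1):
    phase = math.acos(pf)   # computed but never used by the loss model
    print(f"pf={pf}: loss={transformer_losses(230.0, 40.0):.1f} W")
# All three lines print the same loss, so the rating must be in VA, not W.
```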


Wiki User

14y ago

More answers

A transformer has two types of losses:

a. copper loss

b. core loss

Copper loss depends entirely on the transformer's current, while core loss depends on its voltage.

This can also be verified from the no-load (open-circuit) and load (short-circuit) tests.

So the rating of a transformer is given in VA.
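As a rough sketch of how those two tests separate the losses (all readings below are made-up illustrative numbers, not real test data): the open-circuit (no-load) test wattmeter reading gives the core loss, the short-circuit test gives the full-load copper loss, and efficiency at any load follows from the two.

```python
def efficiency(kva_rating, load_fraction, pf, p_core_w, p_cu_full_load_w):
    """Efficiency from open-circuit (core loss) and short-circuit
    (full-load copper loss) test results.

    Core loss is constant at rated voltage; copper loss scales with
    the square of the load fraction.
    """
    p_out = kva_rating * 1000 * load_fraction * pf    # watts delivered
    p_cu = p_cu_full_load_w * load_fraction ** 2      # copper loss at this load
    return p_out / (p_out + p_core_w + p_cu)

# Illustrative 100 kVA unit: 400 W core loss, 1200 W full-load copper loss.
print(f"{efficiency(100, 0.75, 0.8, 400, 1200):.4f}")   # about 0.9824
```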

Dhiraj Shringi

Alternative Answer

A transformer's iron losses depend on the magnitude of the flux which, in turn, is proportional to voltage, while its copper losses depend on the winding currents. As both iron and copper losses contribute to the maximum operating temperature of the transformer, it follows that a transformer must be rated in terms of voltage and current. In alternating current systems, the product of voltage and current is apparent power, expressed in volt amperes.

As a transformer's secondary voltage is kept approximately constant, it is its 'volt ampere' rating that determines its maximum (secondary) load current.

Expressing a transformer's rating in watts (i.e. true power) would be completely meaningless because, with a highly-reactive load, it will be supplying practically zero watts while still possibly having to supply its rated current.
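That last point can be sketched numerically (the voltage and rated current below are assumed for illustration, roughly a 10 kVA unit): with an almost purely reactive load the transformer carries its full rated current, and therefore its full rated heating, while delivering almost no watts.

```python
V_RMS = 230.0     # assumed secondary voltage
I_RATED = 43.5    # assumed rated secondary current (about 10 kVA)

for pf in (1.0, 0.05):              # resistive load vs. highly reactive load
    s_va = V_RMS * I_RATED          # apparent power: fixed by the rating
    p_w = s_va * pf                 # real power: collapses at low power factor
    print(f"pf={pf}: S={s_va:.0f} VA, P={p_w:.0f} W")
# The windings heat identically in both cases, but the watts differ wildly,
# so a watt rating would say nothing about the transformer's true limit.
```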

Wiki User

13y ago

It indicates the power-handling capacity of the transformer.


Wiki User

13y ago

Well, smaller transformers are rated in VA, bigger ones in kVA, and huge ones in MVA. The amps (the 'A' in kVA) are what matter most: they generate heat and copper losses in the transformer. The manufacturer has no way of knowing what the power factor of the load will be, so it cannot easily put a kW rating on the transformer.

Suppose the transformer is rated at 100 kVA. If it is supplying a kiln with a resistive heating element, it can deliver 100 kW to the load. But if it is feeding a computer data center, with many (reactive) computer power supplies resulting in a low power factor, it will be able to deliver only a fraction of 100 kW. The problem is that the reactive current still flows in the transformer windings and still causes heating and losses, but contributes no real power to the load.

It is up to the electrical system designer to characterize the anticipated load, determine what the power factor will most likely be, and size the transformer accordingly.
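That 100 kVA example works out as follows (the data-center power factor below is an assumed, typical-looking value): the deliverable real power is simply the kVA rating scaled by the load's power factor.

```python
KVA_RATING = 100.0

loads = {
    "resistive kiln": 1.0,   # unity power factor, as in the answer above
    "data center": 0.7,      # assumed low power factor for illustration
}

for name, pf in loads.items():
    kw = KVA_RATING * pf     # real power the transformer can deliver
    print(f"{name}: {kw:.0f} kW of the {KVA_RATING:.0f} kVA rating")
# The data center draws full rated current but converts only 70 kW of it into
# real power; the rest circulates as reactive current, heating the windings.
```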

Wiki User

16y ago

Commonly we say that a piece of hardware (e.g. a transformer or a speaker) is rated at XX watts, which corresponds to real (active) power, usually the only quantity registered by traditional energy meters. But there are also apparent power and reactive power, the latter occurring when the current lags (or leads) the voltage, and neither is registered by a traditional spinning-disc meter. kVA (kilovolt-ampere) is used to express the capacity of a power transformer because it is the more exact way of expressing the power the transformer must handle.


Wiki User

13y ago

Because kVA (or MVA) is apparent power, whereas kW (or MW) is true power, the two being related by the power factor, which is a function of the reactance of the load.
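In symbols, with \(\varphi\) the phase angle between voltage and current:

\[
S = VI \ \text{(apparent power, VA)}, \qquad P = S\cos\varphi \ \text{(true power, W)},
\]

so the nameplate fixes \(S\), while the watts actually delivered depend on the load's power factor, \(\cos\varphi\).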

Wiki User

13y ago

The load power factor depends on the type of load. The kVA rating is fixed, but the kW delivered depends on the power factor as well.

Wiki User

7y ago

kVA means thousands of VA (volts times amperes). Apart from the power factor (which is often close to 1), volt-amperes are basically the same as watts.
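For example, a 2 kVA transformer feeding a load with a power factor of 0.95 delivers about 2 × 0.95 = 1.9 kW.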

Wiki User

7y ago
