There is often confusion about the terms apparent power and real power, and their units, the Volt-Ampere (VA) and the Watt (W). Power supply unit manufacturers typically specify their device ratings in VA, while actual loads such as bulbs, fans, TVs, and fridges are specified with power ratings in watts (W).

Real power is the power actually used by the load, such as a TV, fridge, fan, or bulb. Real power is measured in watts (W).

Apparent power is the total power the power supply unit draws in order to deliver real power to the load. Apparent power is measured in Volt-Amperes (VA).

In simple words, the power supply unit draws apparent power from the line supply or battery, and delivers real power to the load, such as a fan or a bulb.

Apparent power is the vector sum of the real power delivered to the load and the reactive power associated with reactive components such as capacitors and inductors: S = sqrt(P² + Q²). The reactive power itself is not consumed as useful work; it circulates between the source and the reactive components, but the extra current it causes increases the resistive heat losses in the power supply unit and wiring.
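As a quick numerical sketch of that relationship (the 400 W and 300 VAR figures below are illustrative, not from a real device):

```python
import math

# Illustrative figures (assumed for this example):
real_power_w = 400.0        # P, real power delivered to the load, in watts
reactive_power_var = 300.0  # Q, reactive power, in volt-amperes reactive

# Apparent power S is the vector sum: S = sqrt(P^2 + Q^2)
apparent_power_va = math.hypot(real_power_w, reactive_power_var)

print(apparent_power_va)  # 500.0
```

So a load doing 400 W of useful work with 300 VAR circulating draws 500 VA from the supply, not 700 VA; the two components add as the sides of a right triangle.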

If the apparent power is much higher than the real power, the power supply is being used inefficiently: much of the current it carries does no useful work.

## Power Factor

The relationship between apparent power and real power is expressed by the power factor (PF).

The power factor of an AC electric power system is defined as the ratio of the real power flowing to the load to the apparent power, and is a dimensionless number between 0 and 1 (frequently expressed as a percentage, e.g. 0.5 PF = 50% PF).
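A one-line check of this ratio, using the assumed figures of 400 W real power drawn at 500 VA:

```python
real_power_w = 400.0       # real power delivered to the load (assumed figure)
apparent_power_va = 500.0  # apparent power drawn from the supply (assumed figure)

# PF = P / S, a dimensionless number between 0 and 1
power_factor = real_power_w / apparent_power_va

print(power_factor)  # 0.8, i.e. 80% PF
```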

In an electric power system, a load with low power factor draws more current than a load with a high power factor for the same amount of useful power transferred.
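This follows from I = P / (V × PF). A small sketch comparing the current drawn at two power factors (the 230 V supply and 1000 W load are assumed figures):

```python
# Same useful power from a 230 V RMS supply at two power factors
# (the 230 V and 1000 W figures are assumed for illustration)
voltage_v = 230.0
real_power_w = 1000.0

for pf in (1.0, 0.5):
    current_a = real_power_w / (voltage_v * pf)  # I = P / (V * PF)
    print(f"PF {pf}: {current_a:.2f} A")
```

Halving the power factor doubles the current for the same useful power, which is why low-PF loads need heavier wiring and stress the supply more.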

To convert as much of the apparent power as possible into real power, the power supply unit should be designed with a power factor approaching unity.

If you buy an inverter with a 500 VA rating, it will not drive a 500 W load. From the power factor of the inverter, you can calculate the real power it can actually deliver to the load.

For example, if an inverter is rated at 1000 VA and designed for a power factor of 0.8, it can handle loads up to 800 W.
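The calculation above is simply the VA rating multiplied by the power factor:

```python
va_rating = 1000.0   # inverter apparent power rating, in VA
power_factor = 0.8   # design power factor of the inverter

# Maximum real power the inverter can deliver to the load
max_load_w = va_rating * power_factor

print(max_load_w)  # 800.0 W
```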