# Watt vs VA

I thought 1 V⋅A = 1 W, so why are UPSs rated, for example, 600 VA / 320 W?

VA (volt-amperes) is a measure of apparent power — the voltage times the current the UPS must supply.

Watts are a measure of real power — what the connected load actually consumes.

The Watt rating will never be higher than the VA rating, because of the load's power factor: https://en.wikipedia.org/wiki/Power_factor

Always use the Watt rating when choosing a suitable UPS; ignore the VA figure.
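As an illustration of how the two ratings relate, here is a minimal sketch using the 600 VA / 320 W figures from the question; the power factor is simply derived from them, not a manufacturer specification:

```python
# Real power (W) = apparent power (VA) x power factor,
# so the ratio of the two ratings implies an assumed power factor.
apparent_power_va = 600   # UPS VA rating from the question
real_power_w = 320        # UPS Watt rating from the question

power_factor = real_power_w / apparent_power_va
print(f"Implied power factor: {power_factor:.2f}")  # ~0.53
```

A 600 VA unit can therefore only support about 320 W of real load, even though 600 W of "volt-amperes" flow through it.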

•

The answer is actually more complicated than just power factor. A UPS uses an AC source to charge a battery (i.e. converts AC to DC) and, when the mains fails, uses the battery to supply the connected devices (converting DC back to AC). Power factor has some effect on the input/output ratings, but the main factors are battery and inverter sizing.

Just something to keep in mind: the amount of time you want connected devices to run is as important as total power draw when sizing a UPS. Refer to the manufacturer's runtime curves/charts.

• -1 vote

VA (kVA) vs W (kW)

Depending on a few factors, the basic calculation is:

VA × 0.8 = W rating

E.g. 2 kVA = 1.6 kW
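The rule of thumb above can be sketched as a one-line conversion; the 0.8 figure is the assumed power factor from this answer, not a universal constant:

```python
def va_to_watts(va, power_factor=0.8):
    """Rule-of-thumb conversion: real power = apparent power x power factor."""
    return va * power_factor

print(va_to_watts(2000))  # 2 kVA -> 1600.0 W, i.e. 1.6 kW
```

If the UPS datasheet states a different power factor (many modern units use 0.9 or 1.0), pass that instead of relying on the 0.8 default.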

If you are trying to work out how many amps you need for a solution, match the voltage and provide a current (amp) rating greater than or equal to what is required.

E.g. a notebook PC charger: if it says it uses 19.5 V, then match the power supply to 19.5 V.
If it states it wants a current of 6.9 A, then ensure the supply can deliver 6.9 A or more.

The system will only draw what it needs, provided the minimum is being delivered. If the voltage is too high or too low, you risk burning something out; the same goes for a supply whose current rating is too low.

For a UPS, you need to add up the draw of all the devices you are planning to run from it.
For a home setup with standard outlets, a single-phase solution is what you want.
Then work out the expected run time.

Most UPS providers' websites have a calculator to help you with this.
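The sizing steps above can be sketched as a small script; the device list and the 25% safety margin are hypothetical examples, not figures from this answer:

```python
# Hypothetical wattages for a small home setup (assumptions, not measured values).
devices_w = {"desktop PC": 250, "monitor": 30, "router": 15}

total_w = sum(devices_w.values())
headroom = 1.25                   # hypothetical 25% safety margin
required_w = total_w * headroom

# Convert to a minimum VA rating using the 0.8 power-factor rule of thumb.
required_va = required_w / 0.8

print(f"Total draw: {total_w} W")
print(f"Size UPS for at least {required_w:.0f} W / {required_va:.0f} VA")
```

Runtime still has to be checked separately against the manufacturer's runtime charts for the chosen model.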

Otherwise, I'm not too sure what your question is, or what info you are after.

Hope this helps.

•

Where did you pull 0.8 from?