The phrase power factor frequently is used in the electrical and power electronics industry. For example, home, office, and industrial electrical equipment often is fitted with power factor-corrected power supplies. Many current electronics dictionaries define power factor as the cosine of the phase angle between voltage and current, or cos θ, but this definition will lead to errors and inaccuracies if applied to measurements on modern equipment.

So what is power factor? Why is it important, and how is it measured? Most of us remember that power factor relates watts to volt-amperes:

W = VA × PF or

PF = W / VA

where: W = true power (sometimes called active power or real power) in watts, measured with a wattmeter

VA = apparent power, the product of rms volts and rms amps

PF = power factor

Power factor is defined by IEEE and IEC as the ratio of true power to apparent power: PF = W / VA. Power factor takes into account both the phase and wave-shape contributions to the difference between true and apparent power. At the input to an off-line, switched-mode power supply, for example, the current is not sinusoidal.
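Expressed this way, power factor can be computed directly from simultaneous voltage and current samples, with no assumption about wave shape. A minimal Python sketch (the function name and test waveforms are illustrative, not from any standard):

```python
import math

def power_factor(v, i):
    """True power factor from simultaneous voltage/current samples.

    W  = average of instantaneous power v * i
    VA = rms(v) * rms(i)
    PF = W / VA
    """
    n = len(v)
    true_power = sum(vk * ik for vk, ik in zip(v, i)) / n
    vrms = math.sqrt(sum(vk * vk for vk in v) / n)
    irms = math.sqrt(sum(ik * ik for ik in i) / n)
    return true_power / (vrms * irms)

# Sanity check: a sinusoidal current lagging a sinusoidal voltage by
# 60 degrees should give PF = cos(60 deg) = 0.5.
N = 1000
v = [math.sin(2 * math.pi * k / N) for k in range(N)]
i = [math.sin(2 * math.pi * k / N - math.pi / 3) for k in range(N)]
print(round(power_factor(v, i), 3))  # 0.5
```

For purely sinusoidal waveforms this reduces to cos θ, which is why the dictionary definition appears to work until the current is distorted.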

A large capacitor smoothes the full-wave rectified supply *(Figure 1)*. The capacitor charges for a short period of time near the peak of the voltage waveform. For the rest of the supply cycle, the diodes are reverse biased, and no current flows from the supply. The current waveform consists of short pulses near the voltage peak.

It is not easy to measure the phase angle on these waveforms. If both the voltage and the current waveforms were sinusoidal, it would be possible to measure phase angle simply by reading the time difference between zero crossings of the waveforms on an oscilloscope. In fact, the phase angle now is defined as the angle between the fundamental voltage and current. In this example, cos θ would be close to 1, yet the overall power factor of a typical power supply might be 0.6 *(Figure 2)*.
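The gap between the phase term and the overall power factor can be shown numerically. The sketch below builds a pulsed, rectifier-style current centred on the peaks of a sinusoidal voltage; the pulse width and amplitude are arbitrary illustrative values, not measured data:

```python
import math

N = 10000
t = [k / N for k in range(N)]          # one supply cycle, normalised
v = [math.sin(2 * math.pi * x) for x in t]

def pulse(x):
    """Short current pulses centred on the voltage peaks (illustrative)."""
    for peak, sign in ((0.25, 1), (0.75, -1)):
        if abs(x - peak) < 0.02:
            return sign * 1.0
    return 0.0

i = [pulse(x) for x in t]

def rms(w):
    return math.sqrt(sum(s * s for s in w) / len(w))

true_power = sum(a * b for a, b in zip(v, i)) / N
pf = true_power / (rms(v) * rms(i))

# Fundamental component of the current (first Fourier-series harmonic)
a1 = 2 / N * sum(s * math.cos(2 * math.pi * x) for s, x in zip(i, t))
b1 = 2 / N * sum(s * math.sin(2 * math.pi * x) for s, x in zip(i, t))
phase = math.atan2(a1, b1)             # angle relative to the voltage
print(f"cos(theta) = {math.cos(phase):.3f}, PF = {pf:.3f}")
```

The fundamental of the current is exactly in phase with the voltage, so cos θ comes out at 1.0, yet the true power factor is roughly 0.4: the shortfall is entirely wave-shape distortion.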

Power factor is an important measurement for two main reasons. First, an overall power factor of less than 1 means that an electricity supplier has to provide more generating capacity than actually is required.

For example, consider an office building drawing 200 A at 400 V. The supply transformer and backup UPS must be rated at 200 A × 400 V = 80 kVA. But if the power factor of the loads is only 0.6, then only 80 kVA × 0.6 = 48 kW of real power is being consumed. In other words, if the power factor were 1, the supply capacity (transformers, cables, switchgear, UPS) could be considerably smaller.
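The sizing arithmetic in the example is simply:

```python
# Supply must be rated for the apparent power, not the real power.
supply_va = 200 * 400            # 200 A at 400 V -> 80,000 VA (80 kVA)
pf = 0.6
real_power_w = supply_va * pf    # 48,000 W (48 kW) actually consumed
print(supply_va, real_power_w)   # 80000 48000.0
```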

Secondly, the current waveform distortion that contributes to reduced power factor is a cause of voltage waveform distortion and overheating in the neutral cables of three-phase systems. Primarily for this reason, international standards such as IEC 61000-3-2 have been established to control current waveform distortion by introducing limits for the amplitude of current harmonics.

To comply with these standards, designers use circuits that force the current waveform to be near sinusoidal and in-phase with the voltage. These circuits are known as power-factor correction circuits.

Remember that power-factor correction circuits are not perfect, sometimes only providing a power factor close to 1 when the power supply is at or near rated full load. Since power supplies often are derated in use, power-factor measurements must be made on the equipment under test (EUT) across a range of working conditions.

Crest factor, another measurement often used when describing these waveforms, is defined as the ratio of peak to rms current. For this type of power-supply current waveform, a crest factor of 3 is common. A crest factor of 10 or more is possible for a very light load. The crest factor of a sine wave is √2, or 1.414.
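Crest factor is straightforward to compute from sampled data. In this sketch, the sine wave comes out at √2 ≈ 1.414, while a narrow pulse train (5% duty cycle, chosen arbitrarily to mimic the rectifier current) lands well above 3:

```python
import math

def crest_factor(w):
    """Ratio of peak to rms value of a sampled waveform."""
    peak = max(abs(s) for s in w)
    rms = math.sqrt(sum(s * s for s in w) / len(w))
    return peak / rms

N = 1000
sine = [math.sin(2 * math.pi * k / N) for k in range(N)]
print(round(crest_factor(sine), 3))    # 1.414

# Narrow pulse train, 5% duty cycle, like a lightly loaded rectifier input
pulses = [1.0 if (k % 500) < 25 else 0.0 for k in range(N)]
print(round(crest_factor(pulses), 3))  # 4.472
```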

Whether the EUT uses a power-factor correction circuit or not, how is power factor measured? Since power factor is defined as W/VA, power factor is measured using a wattmeter. The wattmeter will measure true power and usually provide a direct measurement of volt-amperes as well as power factor.

However, the waveform no longer is a sinusoid. The current waveform is distorted and will contain harmonics at relatively high frequencies.

To properly measure the power factor, the wattmeter must have a bandwidth that encompasses these harmonic frequencies. A few kilohertz bandwidth usually is sufficient, but meters designed to respond at up to 60 or 440 Hz might miss a significant amount of information.

Also, the original requirement may have been to limit harmonic amplitudes, not just to measure power factor. If this is the case, it is necessary to measure the harmonics and compare them to the limits. Even when a power factor-corrected power supply is used and the power factor is close to 1, the amplitude of a high-order harmonic could exceed the limits.
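Checking individual harmonics amounts to extracting Fourier-series coefficients from one full cycle of the current waveform and comparing each magnitude against the applicable limit. The limits below are placeholders, not actual IEC 61000-3-2 values, which depend on equipment class and rated power:

```python
import math

def harmonic_rms(i, n):
    """RMS amplitude of the n-th harmonic of one full cycle of samples
    (Fourier-series coefficients computed by direct summation)."""
    N = len(i)
    a = 2 / N * sum(s * math.cos(2 * math.pi * n * k / N) for k, s in enumerate(i))
    b = 2 / N * sum(s * math.sin(2 * math.pi * n * k / N) for k, s in enumerate(i))
    return math.hypot(a, b) / math.sqrt(2)

# Example current: fundamental plus a 3rd harmonic (illustrative amplitudes)
N = 2000
i = [math.sin(2 * math.pi * k / N) + 0.3 * math.sin(6 * math.pi * k / N)
     for k in range(N)]

# Placeholder limits in amps rms, NOT the real IEC 61000-3-2 figures
limits = {3: 0.30, 5: 0.20, 7: 0.15}
for n, limit in limits.items():
    amp = harmonic_rms(i, n)
    verdict = "PASS" if amp <= limit else "FAIL"
    print(f"H{n}: {amp:.3f} A rms, limit {limit} A -> {verdict}")
```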
