In: Electrical Engineering
Hello,
Please find the answer below. Please give a thumbs-up rating if you find the answer useful! Have a rocking day ahead!
The power factor is the ratio of the actual (real) power delivered to the load to the apparent power flowing to the load. A high power factor means a value close to unity, i.e. anything equal to or greater than about 0.9 (commercial consumers with a power factor below 0.9 are usually penalised); a low power factor means a value below about 0.9. Since the power factor gives the fraction of the apparent power that is actually transferred to the load as useful power, a high power factor is desired. In industry, a power factor close to 0.98 is typically maintained.
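As a quick sketch of the definition above, the snippet below computes the power factor as real power divided by apparent power and checks it against the 0.9 penalty threshold. The load values are hypothetical, chosen only for illustration:

```python
def power_factor(real_power_w: float, apparent_power_va: float) -> float:
    """Power factor = real power / apparent power (dimensionless, 0 to 1)."""
    return real_power_w / apparent_power_va

# Hypothetical load: 8.8 kW of real power drawn at 10 kVA apparent power
pf = power_factor(8800.0, 10000.0)
print(f"pf = {pf:.2f}")                       # 0.88
print("penalised" if pf < 0.9 else "within limits")
```

A utility meter effectively performs this same division over a billing period when deciding whether to apply a low-power-factor surcharge.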
Most inductive loads have poor power factors. Examples include induction motors, grinders and mixers that employ induction motors, electric discharge lamps, and transformers. In each of these appliances the current lags behind the voltage, creating a phase difference between the two. The larger the phase difference, the smaller the actual power delivered to the load for the same apparent power. The lagging effect is caused by the inductance, which opposes rapid changes in current.
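To make the lagging effect concrete, an inductive load can be modelled (very roughly) as a series R-L circuit. The current then lags the voltage by a phase angle φ = arctan(X_L/R), and the power factor is cos φ. The component values below are assumptions picked purely for illustration, not data for any particular motor:

```python
import math

# Hypothetical series R-L model of an inductive load
R = 10.0    # ohms, winding resistance (assumed)
L = 0.05    # henries, inductance (assumed)
f = 50.0    # Hz, supply frequency (assumed)

X_L = 2 * math.pi * f * L     # inductive reactance in ohms
phi = math.atan2(X_L, R)      # angle by which current lags voltage, radians
pf = math.cos(phi)            # lagging power factor

print(f"X_L = {X_L:.2f} ohm")
print(f"current lags by {math.degrees(phi):.1f} degrees, pf = {pf:.3f}")
```

With these assumed values the lag is over 50 degrees and the power factor falls well below 0.9, which is why such loads attract correction capacitors in practice.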
Loads which are purely resistive generally present the highest power factor. Examples include ovens with resistive heating elements, resistive water heaters, resistance soldering machines, and filament lamps. Since these elements have only resistive impedance (practically no L or C), the current is always in phase with the voltage. The power factor is therefore unity: the apparent power equals the real power, and all of the power supplied is actually delivered to the load.
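The unity-power-factor case can be sketched the same way: with zero phase difference, real and apparent power coincide. The supply voltage and resistance below are assumed values, roughly representing a ~1 kW heating element on a 230 V supply:

```python
# Hypothetical purely resistive load (values assumed for illustration)
V_rms = 230.0   # volts, supply voltage
R = 52.9        # ohms, heater element resistance

I_rms = V_rms / R                  # current is in phase with the voltage
real_power = V_rms * I_rms         # watts
apparent_power = V_rms * I_rms     # volt-amperes: identical for a pure resistance
pf = real_power / apparent_power   # exactly 1.0

print(f"P = {real_power:.0f} W, S = {apparent_power:.0f} VA, pf = {pf}")
```

This is the limiting case the answer describes: nothing circulates as reactive power, so every volt-ampere drawn does useful work.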