In: Physics
I read an article that lists the power consumption of many devices.
It says that a desktop computer (computer plus monitor) uses 400 to
600 watts.
But when I checked my computer and monitor with a meter, the reading
was about 60 + 60 = 120 watts (computer + 17" CRT monitor) after loading
Windows XP and running an application. The mains voltage here is
220 V.
Which figure is correct? How much power does the computer actually consume?
I sincerely doubt that a computer uses 400+ W under normal circumstances. That figure is the typical rating of the power supply unit (PSU) inside the computer, so at peak the computer could draw that much power, but not continuously.
In practice, this peak consumption means moments like boot-up, when all the drives spin up at once, or when the CPU and graphics card are under full load at the same time.
Also, IBM-compatible motherboards and power supplies are built to support many different hardware configurations, so the PSU rating has to cover the most demanding one: extra hard drives, optical drives, add-in cards, and so on.
You can think of it this way as well. Servers normally have two PSUs (for redundancy). Both are plugged in at the same time, and either one could theoretically power the server by itself. So if the server's maximum peak draw is 500 W, you have two 500 W PSUs plugged in simultaneously, yet the server's consumption still never exceeds 500 W.
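To put the rated-versus-measured gap in perspective, here is a quick back-of-the-envelope sketch using the numbers from the question (120 W measured, 600 W as the article's upper figure, 220 V mains); the 8-hours-a-day usage pattern is just an assumption for illustration:

```python
# Compare the measured draw with the article's claimed maximum.
measured_w = 120   # computer + CRT monitor, as measured with the meter
rated_w = 600      # upper end of the article's claim (roughly a PSU rating)
mains_v = 220      # mains voltage from the question

# Current drawn from the wall: I = P / V
measured_amps = measured_w / mains_v
rated_amps = rated_w / mains_v

# Energy over a month of 8-hour days: kWh = W * h / 1000
hours = 30 * 8
measured_kwh = measured_w * hours / 1000
rated_kwh = rated_w * hours / 1000

print(f"Measured: {measured_amps:.2f} A, {measured_kwh:.1f} kWh/month")
print(f"Rated:    {rated_amps:.2f} A, {rated_kwh:.1f} kWh/month")
```

The measured figure works out to roughly half an amp and about 29 kWh a month, versus about 2.7 A and 144 kWh if the machine really drew 600 W continuously, which is why the article's number should be read as a peak rating rather than typical consumption.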