Cost of running your computer




Has it ever crossed your mind how much that humming box you call a computer actually costs you in electricity? Even if money is no issue for you, it's still interesting to know how much your computer system adds to your electric bill each month.

From the calculations we're going to do here, you will have a good idea of not only how much power your computer consumes, but also the power consumption for every other gadget in your house that runs on electricity.

Although we will base our calculations on computer users within the USA, the same principle can be applied easily, no matter where you live. All you need is the unit charge of electricity usage, which can be obtained from your power company, or you may even already have it on your electric bill.

So the question is, how much does it cost to run your computer each month?

That depends on where you live. Costs in places like Hawaii, New York, California, and New Hampshire are among the highest, at around 12 cents per kilowatt-hour (if you're not familiar with what a kilowatt-hour is, don't fret over it. We'll take a closer look later). The cost of power is lowest in places like Washington, Idaho, and Wyoming, at around 4 cents per kilowatt-hour.

The amount of power that your computer consumes is measured in watts. Every piece of electronic equipment has a power rating on the back. Look at the back of your computer and monitor and note the power rating on each one. The power supply on most computers is rated at 250W. Some may be less, some may be more, but 250 watts is about average.

So assuming you
  • have a 250W power supply
  • have a 100W monitor
  • leave the system on 24/7, including the monitor
  • live in California, and
  • leave the computer and monitor running for a month...

First, let's talk a moment about the kilowatt-hour.

kilowatt-hour:
A unit of work or energy, equal to the energy expended in one hour at a steady rate of one kilowatt. In other words, it's the amount of energy consumed in one hour by a piece of equipment rated at 1000 watts.
Total kilowatt-hours (kwh) consumed by one 100-watt light bulb burning for 150 hours can be calculated as follows:

100 watts x 150 hours = 15,000 watt-hours = 15 kwh.
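As a quick sanity check, the light-bulb calculation can be written as a few lines of Python (the numbers are the ones from the example above):

```python
# Energy used by a 100-watt light bulb burning for 150 hours
watts = 100
hours = 150

watt_hours = watts * hours     # 15,000 watt-hours
kwh = watt_hours / 1000        # convert watt-hours to kilowatt-hours
print(kwh)                     # 15.0
```

The only step beyond multiplication is dividing by 1000 to go from watt-hours to kilowatt-hours, since your power company bills in kWh.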

So for a typical computer 

total watts = 350 watts, (ie. 250W for the computer + 100W for the monitor)

total hrs per month: 24 hrs x 30 days = 720 hrs

kilowatt-hour: 350 watts x 720 hours = 252,000 watt-hrs = 252 kwh

Cost/month in california = 252 x 12 cents = $30.24

Cost/month in Washington = 252 x 4 cents = $10.08
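Putting the whole calculation together, here's a small Python sketch using the same assumed figures (250W computer, 100W monitor, 24/7 usage, 12 and 4 cents per kWh):

```python
# Monthly cost of running a computer + monitor around the clock
total_watts = 250 + 100              # 250W computer + 100W monitor = 350W
hours_per_month = 24 * 30            # 24 hrs/day x 30 days = 720 hrs
kwh = total_watts * hours_per_month / 1000   # 252 kWh

# Rates are the approximate cents-per-kWh figures quoted above
for state, cents_per_kwh in [("California", 12), ("Washington", 4)]:
    cost_dollars = kwh * cents_per_kwh / 100
    print(f"{state}: ${cost_dollars:.2f}")
# California: $30.24
# Washington: $10.08
```

To adapt this to your own situation, swap in the power ratings from the backs of your own equipment and the rate from your electric bill.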

The cost we calculated above is as bad as it can get. As you know, most computers have a power management system whereby the computer and monitor are turned off automatically during periods of inactivity. So if we take all the idle time into account, the cost of running the computer per month could drop by as much as 70%. You get the idea...
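To see what that savings could look like, here's a quick estimate in Python. The 70% figure is just the rough upper bound mentioned above, not a measured value:

```python
# Rough estimate of savings from power management
full_cost = 30.24          # always-on monthly cost in California (from above)
savings_fraction = 0.70    # assumed reduction from idle-time power-down

reduced_cost = full_cost * (1 - savings_fraction)
print(f"${reduced_cost:.2f}")   # $9.07
```

So with aggressive power management, the California figure could fall from about $30 a month to under $10.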

Now that you know how to calculate the cost of electricity consumption, you can easily find the cost of running any equipment you may have. Just look at the back of that equipment for the power rating, then ask your power company for cost per kilowatt-hour. Your electric bill should also carry that information, so look there first.


Send me feedback on this tutorial so I can improve it. Use the form below. Thanks.


