We recently bought a Kill A Watt for work so we could measure the actual power draw in some of our switch closets and more accurately predict what our UPSs could do in the event of a power outage. The idea is we need to keep the closets running for at least a half hour because that’s how all the phones are powered (PoE… EoP… OeP…). I decided to bring the device home and do some math on my own computer setup.
First some specs on my home computer setup and what draws power at my desk.
The Actual Computer
- Processor: Intel Core i7 860 (2.8 GHz)
- Memory: 16GB (overkill, but I likes to run VMs)
- Video Card 1: GeForce GTX 660 Ti
- Video Card 2: GeForce GTS 250 (I have it set up to just do PhysX processing, but honestly don’t use it and should just yank it out)
- Networking: Dual Intel gigabit adaptors, but one is turned off in the BIOS
- Drive(s): One 128 GB SSD (connected at SATA III), one 500 GB mechanical hard drive (connected at SATA II), and a DVD RW of some kind (SATA, not sure the actual bus spec, I doubt it even scratches SATA I) that billows dust whenever I open it.
- Sound Card: Whatever came on the board… 7.1 capable, optical out, etc.
- Other Ports: All turned off. If it ain’t in use it’s disabled.
- External eSATA hard drive for storing VMs
- Samsung 22 inch monitor
- Acer 22 inch monitor
- Logitech 5.1 surround speaker system
- LED desk lamp
- Micro USB phone charger
Everything is plugged in to an APC 650 UPS (which is actually broken, so I’m not sure why I bother). The peripherals are plugged in to a power strip with a hard on/off switch that gets turned off when the PC isn’t in use. The PC plugs directly into the UPS and, even when “off,” is never hard-off via the power supply switch on the back.
With everything turned on and my PC running a game I spiked around 220 W (watts) with a load of around 237 VA (volt-amps).
According to my ComEd bill the rate is $0.04640 per kWh. That comes out to about $0.010208 per hour to operate my computer. THIS site was extremely useful in calculating these energy costs.
Assuming I sit on my rump, like I am now, and use my computer for 4 hours, that comes out to $0.040832 per day to operate my computer.
Well, I like to game on the weekend, so let’s say I spend 5 hours each weekend day on my computer. Plus the 20 from the week, that makes 30 total hours on the computer in a week, which comes out to $0.30624 per week.
This means a total cost of $15.92 for the year.
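The arithmetic above is simple enough to sketch in a few lines of Python (the wattage and rate are the measured figures from this post):

```python
# Measured/billed figures from above.
WATTS_RUNNING = 220       # gaming load measured on the Kill A Watt
RATE_PER_KWH = 0.04640    # ComEd rate in $/kWh

cost_per_hour = (WATTS_RUNNING / 1000) * RATE_PER_KWH
cost_per_week = cost_per_hour * 30       # 20 weekday hours + 10 weekend hours
cost_per_year = cost_per_week * 52

print(f"per hour: ${cost_per_hour:.6f}")  # ~$0.010208
print(f"per week: ${cost_per_week:.5f}")  # ~$0.30624
print(f"per year: ${cost_per_year:.2f}")  # ~$15.92
```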
This is all just with the thing running. We haven’t gotten into sleep mode versus turning it off yet.
This cost does not include taxes and other bullsh*t fees that come from the electric company. For instance my average electricity bill is around $45 and $20 of it is a fee that is otherwise unaffected by my power usage, I could go on vacation for 3 months and I think I would have a $20 bill waiting for me each month.
Now, my testing showed that my UPS, just by being on and plugged in with zero other devices attached to it, draws 3 W with a total load of around 6 VA. Those 3 W are already included in the figure above, so let’s just say the remaining 138 hours in the week that my UPS is on will cost me $0.019210 per week, or an additional $1.00 per year. (Conversely, I can cut $1 off my energy bill each year by switching to a straight-up surge protector!)
What if I leave my computer in sleep mode?
Excluding the drain of the UPS (already calculated above) the PC while asleep alone draws 4 W with a total load of 14 VA. Crunch crunch crunch that comes out to an additional $1.33 per year to leave my computer asleep all the time it’s not running and $2.33 per year to leave it asleep with the UPS on.
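To double-check those standby figures, here’s a small helper. It’s just a sketch; the wattages and the 138 idle hours per week come straight from the measurements above:

```python
RATE_PER_KWH = 0.04640      # ComEd rate in $/kWh
IDLE_HOURS_PER_WEEK = 138   # 168 hours minus 30 hours of actual use

def annual_standby_cost(watts):
    """Yearly cost of a constant standby draw during idle hours."""
    return (watts / 1000) * IDLE_HOURS_PER_WEEK * RATE_PER_KWH * 52

print(f"UPS alone (3 W):   ${annual_standby_cost(3):.2f}")  # ~$1.00
print(f"PC asleep (4 W):   ${annual_standby_cost(4):.2f}")  # ~$1.33
print(f"Sleep + UPS (7 W): ${annual_standby_cost(7):.2f}")  # ~$2.33
```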
You hear talk about how even when off devices draw power. Is it true?
Taking the UPS completely out of the picture, I plugged my computer in to the Kill A Watt for the not-so-surprising result of a 2 W draw with a load of 11 VA. To make sure I wasn't using drugs, I flipped the switch on the back of the computer power supply to see the draw drop to 0 W.
You can already see above that just leaving my computer asleep when it’s not being used costs me $1.33 per year; turning it off reduces that cost to only $0.67, which is barely worth mentioning considering the convenience of hitting a button on the keyboard and having the thing light up in seconds.
So a quick bullet point summary:
- PC and peripherals running -- 220 W for 30 hours a week works out to $15.92 per year.
- PC asleep, peripherals shut off via switch, UPS turned on -- 7 W for 138 hours a week works out to $2.33 per year.
- Total cost of my computer works out to about $18.25 per year.
- Oh and if I left my computer and all peripherals on 24 hours a day 365 days a year that would work out to $89.42 per year!
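Running the summary numbers through the same math (same measured wattages and rate as above):

```python
RATE_PER_KWH = 0.04640  # ComEd rate in $/kWh

def cost(watts, hours):
    """Cost in dollars for a constant draw over a number of hours."""
    return (watts / 1000) * hours * RATE_PER_KWH

active = cost(220, 30 * 52)      # 30 h/week of actual use
standby = cost(7, 138 * 52)      # PC asleep + UPS the rest of the week
always_on = cost(220, 24 * 365)  # hypothetical: never turn anything off

print(f"active:    ${active:.2f}")            # ~$15.92
print(f"standby:   ${standby:.2f}")           # ~$2.33
print(f"total:     ${active + standby:.2f}")  # ~$18.26 (the ~$18.25 above rounds each figure first)
print(f"always on: ${always_on:.2f}")         # ~$89.42
```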
What did I learn from all of this?
- Appliances do use energy when “off,” and if there are enough of them it can add up. When it’s convenient to put an easy-to-access switch/surge protector on the equipment, do it. The savings aren’t great, but the polar bears will thank you.
- Sometimes it’s not convenient to cut power altogether. For instance, my cable box, when powered off, takes 10+ minutes to get the programming guide back when it’s turned on again.
- The difference between sleep mode and even “totally off” is negligible for the gain in convenience of just being able to tap the button and have it come back quickly.
- I did a little testing with having my monitors left on and in power save mode (with the computer asleep). Each sleeping monitor added about another 2 W of load, so across 2 devices I’m saving myself maybe $1.30 a year by cutting their power entirely… still, I feel better about it.
- While the personal cost isn’t terribly great, it’s important to remember these are electric costs the average household wasn’t incurring 30+ years ago. Back then appliances turned off when you flipped the switch; it wasn’t until the idea of “standby” modes that drawing current while “off” started to take shape. And computers, prior to effective sleep modes, were simply shut off when not in use, because only a crazy person would leave one running while not there!!! Right??? To see the real impact of these appliances you have to start multiplying.
- There are approximately 115M households in the US. Assuming each house has 4 appliances that draw a total of 8 W of power while “off,” that comes out to 920 MW of standby draw across the country.
- Assuming these devices are left “off” for 20 hours out of the day, and assuming everyone pays a similar electricity rate to me, that comes out to roughly $312M in wasted energy across the country every single year… just to reiterate, on stuff that’s technically “off.”
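Re-running that national back-of-the-envelope estimate (every input here is one of the assumptions stated above, not a measured figure):

```python
HOUSEHOLDS = 115_000_000  # approximate US households
WATTS_PER_HOUSE = 8       # assumed standby draw across 4 appliances
HOURS_OFF_PER_DAY = 20
RATE_PER_KWH = 0.04640    # assuming everyone pays the author's ComEd rate

total_mw = HOUSEHOLDS * WATTS_PER_HOUSE / 1_000_000
annual_kwh = HOUSEHOLDS * (WATTS_PER_HOUSE / 1000) * HOURS_OFF_PER_DAY * 365
annual_cost = annual_kwh * RATE_PER_KWH

print(f"{total_mw:.0f} MW of standby draw")           # 920 MW
print(f"${annual_cost / 1e6:.0f}M wasted per year")   # ~$312M
```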