[ale] 0.15c an hour

Greg Freemyer greg.freemyer at gmail.com
Mon Dec 4 18:14:47 EST 2006


On 12/4/06, Calvin Harrigan <charriglists at bellsouth.net> wrote:
> Christopher Fowler wrote:
> > http://jebediah.brown.googlepages.com/
> >
> > That article states that it costs 0.15c an hour to run a computer.  Is
> > that true?
> >
> > By my calculations
> >
> > 15 pennies * 24 hours * 31 days = 11,160 pennies.
> > Divide the number of pennies by 100 and I get about $111 per month.  I
> > tend to leave my computer on 24x7, but I do not think I pay 15c per
> > hour to run my computer.
> >
> > If that calculation is right then we are ripping off Quality Services
> > because our servers would be using more power than what we are paying in
> > our colo rack.
> >

Seems high, but if my calculations are right it is not off by much.
If you have real numbers you can make this more accurate.
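
For reference, the monthly arithmetic quoted above checks out.  A quick
sketch in Python (assuming a 31-day month, as in the original post):

    cents_per_hour = 15                       # rate claimed in the article
    monthly_cents = cents_per_hour * 24 * 31  # 11,160 cents
    print(monthly_cents / 100.0)              # about $111.60 per month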

===
I think electricity is about 7 cents per kilowatt hour in GA.
(Someone have a power bill handy?)

So if a 350 watt PS is only 50% efficient, it is pulling 700 watts max.
So that is 0.7 kilowatt-hours per hour max for the computer.

Or 0.7 * 7 cents = 4.9 cents/hour max for the basic computer.
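
The same arithmetic as a quick Python sketch; the 7 cents/kWh rate and
50% efficiency are just the rough assumptions above, not measured numbers:

    rate_dollars_per_kwh = 0.07   # assumed GA electricity rate
    ps_rating_watts = 350         # power supply rating
    efficiency = 0.50             # worst-case efficiency assumption
    wall_draw_kw = ps_rating_watts / efficiency / 1000.0  # 0.7 kW at the wall
    print(wall_draw_kw * rate_dollars_per_kwh)            # about $0.049 per hour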

In reality, the PS is probably better than 50% efficient and probably
not running at max load, so you could easily be at 2 or 3 cents an hour
for the computer.

The monitor likely draws about the same, so 5 cents an hour is a good
guess for a workstation.  (But there would not be a monitor at a colo.)

But then you have to cool it: it normally takes more electricity to
cool than it does to heat, so I'm at a little over 10 cents an hour for
a workstation and 5 cents for a headless unit.
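
A rough tally of those guesses in Python (box at the 2-3 cent midpoint,
monitor drawing about the same, cooling assumed to cost as much again
as the heat it removes):

    computer = 0.025   # 2-3 cents/hour for the box itself, midpoint
    monitor = 0.025    # monitor assumed to draw about the same
    cooling_overhead = 1.0                                        # cooling costs as much again
    workstation = (computer + monitor) * (1 + cooling_overhead)   # ~10 cents/hour
    headless = computer * (1 + cooling_overhead)                  # ~5 cents/hour
    print(workstation, headless)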

Now, in your house or small business during the winter, you're
typically running the heater.  It turns out most electronics are almost
100% efficient at producing heat, so whatever it costs to run the
computer is made up for in reduced heating bills.

Greg
-- 
Greg Freemyer
The Norcross Group
Forensics for the 21st Century


