leaving the computer on all day - power consumption


Midnight

I'm trying to strike things off a list of possible energy drains in the apartment. Anyone know if leaving the computer on all the time consumes too much electricity? Everywhere that has computers seems to leave them on all the time, and I'm wondering if turning mine off at the end of the day and before I go to class will save me money?
 
I remember long ago hearing that it is more efficient to leave a computer on rather than to turn it on and off, however this was a VERY long time ago and I have no idea if there is any basis behind it or not
 
gaara said:
I remember long ago hearing that it is more efficient to leave a computer on rather than to turn it on and off, however this was a VERY long time ago and I have no idea if there is any basis behind it or not

I find that hard to believe..... Personally, I turn mine off when not using it, like I do with the lights, the TV, or any other electrical items in the house. They all add up....
 
Yeah, the efficiency is in terms of the life of your hardware, not saving energy lol.

It's like starting a car.... turning a car on puts the most strain on the engine, and the same applies to a computer, so that really has nothing to do with what he's asking about.

I couldn't tell you how much turning off the computer saves, dude... the only way to find out would be a test: leave it on for an entire month, then turn it off nightly for an entire month and check your bill. Even that would be a rough estimate.

Or turn your computer off, have someone outside looking at the meter, and turn it on and see if they notice an increase in speed... that's the only thing I could think of trying.
 
Haha :) You will definitely save energy if you turn the comp off, man. But as Nubius said, it is bad for the h/w.

A common reason for this is the thermal expansion that occurs in your components. When you turn the machine on and off, things heat up and cool down in a cycle, and that leads to problems too.

I think computers are pretty good these days at keeping the wattage down when idle. The best way to find out is to check the power consumption in your place yourself.

1 week ON, 1 week OFF :)

Well, putting things on idle will slow them down a bit, but it's still better than turning everything completely off...
 
That's what I would do: leave it on for a month, then try turning it off when you're not using it for a month. That would be the best test. Or you could get all technical and calculate how much it consumes by how many hours it's on, then subtract the time you think you would turn it off and see if that makes a big difference in your total monthly usage.

And like the other two above me said, in a way you're better off just leaving it on all the time, because when it gets turned off and on all the time the parts go through a lot of temperature change, which causes stuff to expand and contract and ends up causing more harm than just leaving it on. Well, I can't think of the right words right now, so I'll leave it at that.
 
Yeah. You could give it a shot on your calculator. You can guess how much your PC consumes.. ~300 W? You know the $$ per kilowatt-hour, so do the simple math :)
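If you want to plug in numbers, here's a minimal Python sketch of that math. The ~300 W draw is just the guess above and the $0.10/kWh rate is a placeholder, so swap in the figures from your own bill:

```python
# Rough monthly-cost estimate. 300 W and $0.10/kWh are assumed
# placeholder values -- replace them with your own numbers.
watts = 300            # guessed PC power draw
hours_per_day = 24     # left on around the clock
rate_per_kwh = 0.10    # assumed electricity rate in $/kWh

kwh_per_month = watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.1f} kWh/month, about ${cost_per_month:.2f}")
```

With those made-up numbers it comes out to about 216 kWh and roughly $21.60 a month, but the point is just the formula: watts / 1000 x hours x rate.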

But yeah, as I said, the contraction/expansion of electronics is a HUGE factor in the degradation of your parts. Materials expand at different rates with heat, and the connections/chips are very fragile, so when they contract/expand things do wear out.. :)
 
Although I could only guess, I'd say an average, newer computer uses about 150-200 W of power while at idle.

So, let's say you leave your computer on 24/7. That would pretty well mean its average power consumption is close to its idle draw, assuming you spend no more than a few hours on the computer per day. So we'll say 220 W average power consumption just to be safe.

Next, if there are about 720 hours in a month and the computer uses an average of 220 W, we can calculate that it uses 220 W x 720 h = 158.4 kWh per month. That's what your power bill is based on: kilowatt-hours.

I am not sure what kind of billing you get per kWh there. So find out whatever your price per kWh is, multiply that rate by 158.4 kWh, and that should tell you how much money your computer eats up in a month :D
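To put that next to the "turn it off when you're not using it" option people suggested, here's a rough Python sketch. The 220 W average is the estimate above; the 6 hours of actual use per day and the $0.10/kWh rate are assumptions you'd replace with your own:

```python
# Compare always-on vs. switched off when not in use.
# 220 W is the estimate from the post above; 6 h/day of use and
# the $0.10/kWh rate are assumptions -- use your own numbers.
avg_watts = 220
hours_in_month = 720
hours_used_per_day = 6
rate_per_kwh = 0.10  # placeholder rate

always_on_kwh = avg_watts / 1000 * hours_in_month               # 158.4 kWh
off_when_idle_kwh = avg_watts / 1000 * hours_used_per_day * 30  # 39.6 kWh

print(f"Always on:           {always_on_kwh:.1f} kWh -> ${always_on_kwh * rate_per_kwh:.2f}")
print(f"Off when not in use: {off_when_idle_kwh:.1f} kWh -> ${off_when_idle_kwh * rate_per_kwh:.2f}")
```

Under those assumptions the difference is roughly 158 kWh vs. 40 kWh a month, so the savings scale directly with your actual rate and how many hours the machine really sits idle.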
 
Hmm, don't know about 220 W... depends, though. I believe the 6800GTs and Ultras suck up like 150 W on their own, but if he's got something a tad bit older, then I'd suspect it wouldn't be much at all.
 
If you are using a CRT monitor, then make sure you flick that off before you go out. That'll use more power than an idle computer.
 