OK, I was curious about how much my computer costs to run too, so I searched Google and this site came up among others (which is how I found this forum). I also found a very informative site: http://michaelbluejay.com/electricity/computers.html

Bottom line: you're all wrong so far. The MAX the computer can use is 450W; just because the power supply has that capacity doesn't mean the machine actually draws that much. If you don't feel like reading the link, the end math is that your computer probably uses a max of about 150W while you're actively using it (the site says closer to 80W) and your monitor another 120W or so. In sleep mode the computer uses a max of about 50W and the monitor practically nothing (0-15W).

So don't worry about it. The site works out that if you used your computer and monitor intensely 8 hours a day, it would cost about $3.30 a month (I'm from the USA, so roughly $0.10/kWh). Tacking on 16 hours of sleep mode adds about another $2.40 a month. Logically you might spend about $10 a month on your computer. Hope I was helpful.
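For anyone who wants to plug in their own numbers, here's a minimal Python sketch of that math. The wattages and the $0.10/kWh rate are just the ballpark figures quoted above, not measurements, so treat the output as a rough estimate:

```python
# Rough monthly electricity cost estimate for a PC, using the
# ballpark figures quoted in the post above.

def monthly_cost(watts, hours_per_day, price_per_kwh, days=30):
    """Estimated cost in dollars for one month of use at a steady draw."""
    kwh = watts * hours_per_day * days / 1000.0
    return kwh * price_per_kwh

RATE = 0.10  # $/kWh, rough US figure from the post

# ~150W PC + ~120W monitor, used 8 hours a day
active = monthly_cost(150 + 120, 8, RATE)

# ~50W PC in sleep + ~15W monitor, the remaining 16 hours
sleeping = monthly_cost(50 + 15, 16, RATE)

print(f"Active use:  ${active:.2f}/month")
print(f"Sleep mode:  ${sleeping:.2f}/month")
print(f"Total:       ${active + sleeping:.2f}/month")
```

With those assumed figures the total comes out just under $10 a month, which lines up with the rough estimate above.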
It all makes sense to me. I found a few other sites with very similar information; this one just presented it best. It wouldn't make sense for your computer to use the maximum power all the time. If it did, there would be problems: for example, if your computer were already using all 450W and you installed a new CD drive, the drive would need some watts of its own, but the power supply wouldn't be able to provide them. So power supplies logically have to be built with headroom to support whatever upgrades you might want. Some activities also use much more power than others; I bet playing a new video game draws about 30W more (maybe more, it's just a random guess).
Back to my original statement: nice site, by the way. I was going to say, look at the power specs of laptops; they run on much lower-rated supplies, and the technology in them is no different from a regular desktop, except for the size of course.
I was giving the worst-case scenario. I can admit defeat, so I throw up my white flag. But really, even with that website, it's going to depend on what's inside your system and what you're doing with it. I know that on my current setup a 350W supply kept crashing; I put in a 450W and it hasn't crashed since. There's also something I keep hearing about that you can attach to the cord to get current readings. I haven't looked for this item, so I don't know if it exists, but you don't have to cut the cord anywhere, and there must be some truth to it if automotive timing guns work the same way. If I understood that website correctly, then I have another reason PC is better than Mac: they use less power. -Del
Even though the website stipulates lower wattage usages, the real measurement should be taken at the incoming AC mains. Here is a good write-up about it, 3 pages with references: http://www.targetpc.com/hardware/power_supplies/measure/
Yep, an ammeter is what you want. I cannot see your PC using that much; your monitor will use a bit. My 19" uses 2.5 amps (about 500W in English currency). One unit in the UK is about 5p, I think; 1 unit = 1 kWh. I can't see a problem here, except that someone has short arms, deep pockets, and is so tight he squeaks when he walks! Geez man, just pay your way!! Can't believe this generated so many sensible replies. I have been holding this back for ages!!! AAAAAAAAAAAAAAAAAAAAARRRRRRRRRRGGGGGGGGGGHHHHHHHHHHH!!!!!!!!!!!!!! pulsar LOL!
Thank you Ddp. To really find out how much power yours is using, you will need an amp meter. Plug all your components into one power strip or UPS, place the amp meter on the cord that actually goes to the wall, power it all up, and see what you get. In the USA, multiply whatever number you get by 120 and that's roughly how many watts you are using (this may not be exact, since you may not be drawing at exactly 120 volts, but it will give you a general idea). I would also suggest starting up a big 3D game, as your system uses more power for those. -Del
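A quick sketch of that conversion, assuming the nominal US line voltage of 120V (the example 1.5A reading is just made up). Note that a simple amps-times-volts figure is really apparent power, so the true wattage will usually be a bit lower depending on the power supply's power factor:

```python
# Amps-to-watts estimate described above.
# A basic amp/clamp meter reading times line voltage gives VA (apparent
# power); actual watts are somewhat lower due to power factor.

LINE_VOLTAGE = 120.0  # volts, nominal US mains; roughly 230 in the UK/EU

def estimated_watts(measured_amps, volts=LINE_VOLTAGE):
    """Rough power draw from a current reading taken on the mains cord."""
    return measured_amps * volts

# e.g. a hypothetical reading of 1.5 A on the power strip's cord
print(f"{estimated_watts(1.5):.0f} W (approximate)")
```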
mmcbean, no, I wasn't incorrect. READ MY MESSAGE AGAIN. I said I had a 400W PSU but estimated I was using around 250W. Read people's messages fully before playing the smart arse.