When I saw the thread title, I thought it was a query about companies going bankrupt. Ph4ZeD and others who point out the electricity costs have the right of it - a medium-spec rig can consume 300W or more at full blast, compared to (say) 100W showing a desktop or running demanding business applications (like Solitaire or Word...). 200W of extra usage means 4.8 kWh per PC per day if the PCs are left on 24/7. For a medium-sized business running 100 PCs and paying £0.12/kWh (about US$0.20), that comes to over £1,700/month (US$2,780). Add in the environmental angle (and the possible negative publicity of excess usage - as with Starbucks over their water usage) and you have a very tough argument to get this approved by most businesses. Any sysadmin who's gone out on a limb and done this on their company PCs without getting agreement in writing (since people can have alarmingly short-term memories about costly decisions) would be putting their job on the line.
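If anyone wants to plug in their own numbers, the sum above is simple enough to sketch - all the figures here are the assumptions from my post (200W extra draw, 100 PCs, £0.12/kWh, a 30-day month), not measured values:

```python
# Back-of-envelope cost of folding 24/7 on a fleet of office PCs.
extra_watts = 200        # folding load minus normal desktop load (assumed)
pcs = 100                # fleet size (assumed)
price_per_kwh = 0.12     # GBP per kWh (assumed retail rate)
days_per_month = 30

# Energy: 200 W * 24 h = 4.8 kWh per PC per day
kwh_per_pc_per_day = extra_watts * 24 / 1000

monthly_cost = kwh_per_pc_per_day * pcs * price_per_kwh * days_per_month
print(f"{kwh_per_pc_per_day} kWh per PC per day")
print(f"£{monthly_cost:.2f} per month for the fleet")
```

Swapping in your own wattage, fleet size, and tariff gives you the figure to take to whoever signs off the electricity bill.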
My Alienware laptop folded 24/7 during the Chimp Challenge with no ill effects; I am typing this on it now.
Electricity costs for large organisations with high electricity usage are actually lower - around £0.07 per unit (kWh) - as they can negotiate bulk discounts with their electricity providers. However, there is an attendant cost in air conditioning, which can almost double this, so the end effect is similar. In the large multinational company I used to work for, the desktop PCs ran 24/7 so updates could be pushed to them overnight, but most of the office buildings had the air conditioning turned off or down to a minimum overnight to save power costs. If you told the facilities manager you were going to run some software overnight that would double the power usage, he would say fine, but your room temperatures are going to go up and your cost centre will have to pay the extra electricity bill. It's not very eco-friendly, even if the CO2 is produced at a slower rate overnight.

We had about 100,000 PCs across the UK and over 200,000 worldwide. Our multiple data centres had many thousands of physical servers in total, and many more virtual ones than that. Because of virtualisation the physical servers were better utilised, but even so most would be ticking over during the night shift. We did save hundreds of thousands of kilowatt-hours by virtualising over half of our servers over a two- to three-year period and decommissioning the old hardware. For reliability and resilience reasons the servers were never turned off unless they needed servicing (power-cycling servers can stress components), and we were targeting 99.7% uptime for critical services.
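To show why the bulk discount doesn't really help, here is the same back-of-envelope sum at the negotiated rate with the cooling overhead folded in as a multiplier. The 1.9 factor is my own rough stand-in for the "almost double" effect of air conditioning, not a figure I ever saw on an invoice:

```python
# Same fleet-cost sketch as before, but at a bulk-discounted tariff
# and with air-conditioning overhead included as a multiplier.
extra_watts = 200            # folding load minus normal desktop load (assumed)
pcs = 100                    # fleet size (assumed)
bulk_price_per_kwh = 0.07    # GBP per kWh, negotiated bulk rate
cooling_factor = 1.9         # aircon "almost doubles" the cost (my assumption)
days_per_month = 30

daily_kwh = extra_watts * 24 / 1000 * pcs          # 480 kWh/day for the fleet
monthly_cost = daily_kwh * bulk_price_per_kwh * cooling_factor * days_per_month
print(f"£{monthly_cost:.0f} per month including cooling")
```

At those assumed figures you land back in the same ballpark as the retail-rate calculation, which is the point: the cheap unit price is largely eaten by the cooling bill.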