
Distributed Computer as variable heater to use excess solar power

Discussion in 'Software' started by wyx087, 26 Oct 2015.

  1. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    711
    What's the best way to use a computer as an automatically varying heater to utilise free electricity? Bitcoin? SETI? Folding? Prime?


    I have recently installed solar panels on my roof. On a sunny winter's day like today, they generate around 400 W of excess electricity throughout the day (because the panels are split across two roofs, east and west, there is never the midday peak you see on most south-facing installations). I thought the perfect heater would be my desktop PC.
    So far, I've managed to extract the excess-watts value (the solar export figure) from the smart energy meter into my C program, which can output it in any format (watts, percentage of effort, number of work units, etc.) depending on the program's input.
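
    Something along these lines (a rough, untested sketch; the 400 W full-load figure is just an assumed maximum draw for my PC):

    Code:
    /* Rough sketch: map the meter's excess-power reading (watts exported)
     * to an effort percentage for whatever work-burning program runs next.
     * PC_MAX_DRAW_W is an assumption about the PC's full-load draw. */
    #include <stdio.h>

    #define PC_MAX_DRAW_W 400.0

    /* Convert exported watts into a clamped 0-100% effort value. */
    static int effort_percent(double excess_watts)
    {
        double pct = excess_watts / PC_MAX_DRAW_W * 100.0;
        if (pct < 0.0)   pct = 0.0;
        if (pct > 100.0) pct = 100.0;
        return (int)(pct + 0.5);
    }

    int main(void)
    {
        double excess;
        /* One meter reading per line on stdin, e.g. piped from the reader. */
        while (scanf("%lf", &excess) == 1)
            printf("%d\n", effort_percent(excess));
        return 0;
    }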


    So which power-burning program is a good way to use this free electricity? Is Bitcoin a viable choice on an nVidia graphics card?

    For any given program, how do I vary its electricity consumption dynamically?



    Basically, I'm asking: are there any power-burner programs that have a way to change their workload (and thus power consumption) on the fly?
     
    Last edited: 26 Oct 2015
  2. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    711
    So I've found the "FAH_GPU_IDLE" environment variable for Folding@home. It appears to do what I want it to do: modulate power consumption.
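
    In theory my meter-reading program could just set that variable before launching the client. An untested sketch (I don't know what range of values the variable actually accepts, so "50" is only a placeholder):

    Code:
    /* Untested sketch: set FAH_GPU_IDLE from the excess-power figure, then
     * launch the Folding@home client. The value the variable accepts and
     * the client binary being on PATH are both assumptions. */
    #include <stdlib.h>

    int main(void)
    {
        setenv("FAH_GPU_IDLE", "50", 1);   /* placeholder value */
        return system("FAHClient");        /* assumes FAHClient is on PATH */
    }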

    I've also found this little tool for CPU applications. I'm thinking of using SETI@home; is it a good cause to dedicate my computer to?
     
  3. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Wouldn't using a PC as a heater be a monumental waste of energy?

    I thought that, as energy can only be converted into other types of energy, using something that wasn't designed to take advantage of that particular type of energy would mean some of it gets wasted. Like big speakers converting electrical energy into kinetic energy: that conversion process isn't 100% efficient, so some of the electrical energy gets converted to heat instead.
     
  4. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    711
    In a PC (inside the desktop case, at least), almost all of the energy is actually converted into heat. Doing calculations: heat. Gaming: heat. Inefficiencies in the PSU: heat.

    A PC drawing 400 W will produce 395 W of heat. Perhaps a few watts are converted into kinetic energy for spinning the disks and fans.


    So what's the popular distributed computing project at the moment? I'm told Bitcoin is pointless now.
     
  5. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I thought most of the energy was converted to kinetic energy in the form of switching transistors, and the heat was a byproduct. That's not disagreeing or saying either of us is right or wrong, BTW; I'm just genuinely interested in what all that electrical energy we pump into our PCs gets converted to.
     
  6. wolfticket

    wolfticket Downwind from the bloodhounds

    Joined:
    19 Apr 2008
    Posts:
    3,556
    Likes Received:
    646
    Almost all (in fact arguably all) the waste energy from computing is in the form of heat. If this heat is useful then there is no waste.

    Even waste energy that starts out as kinetic ultimately ends up as heat.

    I've often wondered whether, in the winter, especially if you have thermostatically controlled electric heating on anyway, there is such a thing as an electrical device that wastes electricity.
    The waste energy is essentially in the form of heat, so that inefficient mains power block or TV on standby is just acting as a small space heater. Same with non-energy-saving lightbulbs.

    As far as I can gather, electrical devices are essentially 100% efficient; they just produce waste heat (first law and all that), which isn't waste if you are cold.

    It feels like spending money on insulation and using waste heat from electrical devices doing other useful jobs would be sensible.

    Oh, and BES does seem to be the way to go for moderating CPU utilisation (although I haven't used it myself). Unless, that is, there is a function within the app you use.
    And, as you imply, any of the apps you mention will utilise up to 100% of your CPU. You're basically donating processing power to them, so you should probably pick the goal you like most.
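
    For what it's worth, the throttling trick is just duty-cycling: run flat out for part of each short period and idle for the rest (BES achieves much the same by repeatedly suspending and resuming the target process). A rough, POSIX-assuming sketch, not BES itself:

    Code:
    /* Rough, POSIX-assuming sketch of duty-cycling: burn the CPU for
     * duty_percent of each 100 ms period and sleep for the rest. */
    #include <time.h>
    #include <unistd.h>

    /* Busy-loop until `us` microseconds have elapsed. */
    static void burn_for_us(long us)
    {
        struct timespec start, now;
        clock_gettime(CLOCK_MONOTONIC, &start);
        do {
            clock_gettime(CLOCK_MONOTONIC, &now);
        } while ((now.tv_sec - start.tv_sec) * 1000000L
                 + (now.tv_nsec - start.tv_nsec) / 1000L < us);
    }

    /* Never returns; meant to run on one worker thread per core. */
    void throttled_burn(int duty_percent)
    {
        const long period_us = 100000;               /* 100 ms control period */
        long work_us  = period_us * duty_percent / 100;
        long sleep_us = period_us - work_us;

        for (;;) {
            burn_for_us(work_us);                     /* active slice */
            if (sleep_us > 0)
                usleep(sleep_us);                     /* idle slice */
        }
    }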
     
    Last edited: 28 Oct 2015
  7. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    After having done some reading, I came up with what seems like a reasonable explanation (last comment), at least to me. As wolfticket said, almost all of the energy ends up as heat one way or the other.

    I love physics. :geek:
     
    Last edited: 28 Oct 2015
  8. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    711
    Exactly. It comes down to this: use excess solar power && heat home => have the computer compute for heat. Instead of pointless programs like Intel Burn Test, I might as well donate some processing power while I'm doing it. :rock: It won't cost me anything with the help of my solar panels :hehe:


    The initial point of this thread is to find distributed-computing programs that DO provide the ability to moderate power usage. So far, there seem to be none :wallbash:
     
  9. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    I've definitely noticed the difference in temperature when leaving my computer doing BOINC calculations overnight in a room with no heating; IIRC 350 W makes getting out of bed much easier in the winter.
     
  10. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,129
    Likes Received:
    6,717
    BOINC lets you set an upper limit for CPU usage - I have mine set at 80%, which results in a sine-wave-looking load graph where BOINC loads the CPU cores at 100% for a bit, then stops, then loads 'em at 100% for a bit, then stops... It's not a direct "I want you to use X watts," which would be pretty much impossible to do, but combined with something that lets you track your power usage (e.g. a plug-in energy meter, or whatever you're tracking your solar consumption with now) you could experiment with numbers until you get the power draw you're after.
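
    If you wanted to drive that limit from your meter program automatically, something like this might work (a sketch only: the override file path is the Debian/Ubuntu default, so check where your own install keeps it and which XML your BOINC version expects):

    Code:
    /* Sketch: push an effort percentage into BOINC by rewriting the
     * global_prefs_override.xml file and asking the client to re-read it.
     * PREFS_PATH is the Debian/Ubuntu default data directory (an assumption;
     * adjust for your own install), and boinccmd must be on PATH. */
    #include <stdio.h>
    #include <stdlib.h>

    #define PREFS_PATH "/var/lib/boinc-client/global_prefs_override.xml"

    int set_boinc_cpu_limit(int percent)
    {
        FILE *f = fopen(PREFS_PATH, "w");
        if (!f)
            return -1;
        fprintf(f,
            "<global_preferences>\n"
            "  <cpu_usage_limit>%d</cpu_usage_limit>\n"
            "</global_preferences>\n", percent);
        fclose(f);
        /* Tell the running client to pick up the new override. */
        return system("boinccmd --read_global_prefs_override");
    }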
     
  11. Mr Evil

    Mr Evil What's a Dremel?

    Joined:
    17 Apr 2015
    Posts:
    36
    Likes Received:
    3
    Yes. If you consider a device like a PC in isolation you might think that the ~100% efficiency is perfect, but the PC does not operate in a vacuum; it is part of an entire electrical grid, which is much less efficient. The biggest losses will be in the power station, then some more will be lost in transmission. It's more efficient to burn fuel directly for heat.

    It's different if you are getting your electricity locally from solar panels, but even then you can do a lot better than 100% electrical efficiency* by using a heat pump.

    *Of course the thermodynamic efficiency remains <100%, but you get more heat out than electrical energy in.
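
    (As a rough illustrative figure: an air-source heat pump with a coefficient of performance of around 3 would turn 400 W of electrical input into roughly 3 × 400 W ≈ 1200 W of delivered heat.)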
     
  12. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    711
    Are you saying I should get an air-con? :hehe:

    Yesterday, while it was sunny, I put my PC to work encoding videos for free :D


    Any recommendations for a GPU client (nVidia) that allows you to set effort level? Still looking for one.
     
