
Battle of the GPUs: Is power efficiency the new must-have?

Discussion in 'Article Discussion' started by Dogbert666, 17 Nov 2014.

  1. Dogbert666

    Dogbert666 *Fewer Lover of bit-tech Administrator

    Joined:
    17 Jan 2010
    Posts:
    1,678
    Likes Received:
    181
  2. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    Can't wait to see the potential of full-fat Maxwell...
     
  3. Hustler

    Hustler Minimodder

    Joined:
    8 Aug 2005
    Posts:
    1,039
    Likes Received:
    41
    It's the main reason I won't be the least bit impressed by any performance rumours regarding AMD's new 390 cards until I see they are within 10% of Maxwell's power figures.

    I'd only consider moving to AMD if I could comfortably power a Crossfire rig with their second-tier high-end card on my 650W PSU, as I can with a couple of Nvidia 970s.
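
    For what it's worth, a rough power-budget sketch in Python (145W is Nvidia's quoted TDP per GTX 970; the 100W figure for the rest of the system is just my guess):

        # Rough headroom check: two GTX 970s in SLI on a 650W PSU.
        GTX_970_TDP = 145      # watts per card (Nvidia's quoted TDP)
        REST_OF_SYSTEM = 100   # watts for CPU/board/drives (assumption)
        PSU = 650              # watts

        load = 2 * GTX_970_TDP + REST_OF_SYSTEM
        print(f"Estimated load: {load}W, headroom: {PSU - load}W")
        # Estimated load: 390W, headroom: 260W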
     
  4. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    Power efficiency now goes hand in hand with the power of the GPU. On the one hand we want 4K, 5K and more, but the scalable architecture also has to be energy friendly at the top end (energy prices are rising faster than seems sustainable, so we want to use less to keep our energy spending the same or lower - much more important for multi-GPU-driven supercomputers) and on mobile platforms, where performance with the longest battery life is the holy grail of GPU design.

    Maxwell arrived first as the mid-range GM204 GPU, bringing a performance increase versus the Kepler 780 with excellent power-saving characteristics on the same 28nm process - GM200 will unleash its full power (and price), but will chomp up that 100W advantage over the 290X unless it's released on 20nm.

    Interesting times indeed and I can't wait to see AMD's new GPU.
     
  5. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    My SFX PSU is a 450W model, so low wattage cards are the way forward for me.
     
  6. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    As per the title, no is my answer. Efficiency is important for sure but it is not the 'must-have' feature. The reason I say this is that efficiency alone is not enough for people to upgrade and I honestly do not believe anyone would pay out for a brand new GPU if it only gave the same performance but was more efficient.
    I also believe that it is hard to dissect the topic this way as AMD and nVidia will never release a card that is only more efficient. When they develop a super efficient GPU they will also use that efficiency to raise the performance.
     
  7. Dave Lister

    Dave Lister Minimodder

    Joined:
    1 Sep 2009
    Posts:
    880
    Likes Received:
    12
    Low power and performance were both priorities for me, being a tight-fisted gamer, so the MSI GTX 970 4G was the winner that outdid my AMD 5870 on both counts.
    And EK have just announced a waterblock for my 970! So now I have to work out whether the card will still use the same power if I hook up an AIO water cooler and the EK water block :)

    source: http://www.techpowerup.com/207249/ek-releases-msi-gtx-970-gaming-water-block.html
     
  8. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,284
    Likes Received:
    891
    My 2p: I run a 1200p monitor so I don't need more GPU grunt, but if I can achieve the same level of performance in a smaller form factor, with less heat and noise, then I'll be a happy camper. That's the way forward for me, rather than more raw rendering power.

    Goes for CPUs as well, tbf.
     
  9. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Personally I'm more looking forward to seeing how AMD's Fiji XT pans out. Rumours are suggesting it's going to come with 3D-stacked memory (higher bandwidth, lower power), a 4096-bit memory bus (more bandwidth for 4K), and the rumoured 4096 stream processors would be a 45% increase over Hawaii.

    Going on the rumours, it seems AMD are aiming squarely at a single-card solution for gaming at 4K, something that GM200 (full-fat Maxwell) may struggle with, and people wanting to game at 4K probably won't care much about heat, power draw, or noise.
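
    Quick sanity check on that 45% figure, assuming the rumoured 4096-SP count holds (Hawaii has 2816 stream processors):

        # Sanity-check the rumoured Fiji XT stream-processor increase over Hawaii.
        hawaii_sps = 2816  # R9 290X (Hawaii)
        fiji_sps = 4096    # rumoured Fiji XT count
        print(f"Increase: {(fiji_sps / hawaii_sps - 1) * 100:.0f}%")  # Increase: 45%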
     
  10. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    Please don't get me started on these water blocks. Why is it that unless you have the top-end card, it is pretty much impossible to get a block that is actually FULL cover and matches the size of the PCB? I hate the exposed-PCB look.
     
  11. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    Would you really spend money on brand new hardware just to match the performance of what you already had, albeit with better efficiency?
    How long would it take for that better efficiency to pay for the purchase of a GTX980 if you were replacing a GTX780 Ti, for example?

    I can understand it if a GPU needs replacing due to a failure or something, and you go for similar power and end up with a cheaper and more efficient GPU - like anyone who had a GTX 480 burn out and bought a 750 Ti instead - but to replace perfectly working hardware?
     
  12. DbD

    DbD Minimodder

    Joined:
    13 Dec 2007
    Posts:
    519
    Likes Received:
    14
    Total performance of a graphics card = performance-per-watt * watts.

    Hence even on desktop AMD are screwed, because they are so far (two generations) behind on perf/watt - Nvidia can always beat AMD by bringing out a card with more watts, as their performance-per-watt is much better, so if both end up with cards at the same power usage Nvidia wins easily. AMD, on the other hand, has to stop increasing power at some point - the current crop of 300W cards is already pretty silly and very hard to cool.

    That's not even looking at the very large gaming-laptop market, which is very focused on perf/watt. AMD hasn't really had many wins there since Kepler came out (which is still more efficient than GCN, hence the near-lockout Nvidia has on discrete gaming laptops).
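
    To illustrate with made-up numbers (the perf/watt figures below are purely hypothetical, not benchmarks):

        # Total performance = performance-per-watt * watts.
        def performance(perf_per_watt, watts):
            return perf_per_watt * watts

        # Hypothetical figures for illustration only:
        nvidia = performance(perf_per_watt=1.5, watts=250)  # 375 units
        amd = performance(perf_per_watt=1.0, watts=250)     # 250 units
        # To match 375 units, AMD would need a 375W card - well past the
        # ~300W practical cooling limit mentioned above.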
     
    Last edited: 17 Nov 2014
  13. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Because bottom tier cards have a tiny power draw and don't need to be watercooled.
     
  14. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    Parge, you have totally missed my point, not to mention that water cooling is not only about performance, but about looks too.

    Compare these two images that are for water blocks for a 290X:

    Image 1:
    http://www.techpowerup.com/img/13-11-05/FC-R9-290X_NP_full_1200.jpg

    Image 2:
    http://www.ekwb.com/shop/media/cata...d/f/c/fc-r9-290x-original-csq_np_full_800.jpg

    Same card, but one block covers the full PCB while the other stops short. At least with high-end cards you have the option of one or the other, but as far as I have seen, as soon as you drop down a level (GTX 770 and now GTX 970) you no longer have the option of the extended version.
     
  15. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    I personally think it started with the GTX 670/680; then NVIDIA jumped ahead at the ultra-high end (while still keeping the cards below the GTX 780 in check) and then improved everything in the GTX 9xx series. And yes, I like the fact that they're going lower in power figures. It means quieter cooling, lower power consumption, less heat to dissipate.
     
  16. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,132
    Likes Received:
    6,728
    I have *loads* of work to do today, so naturally I'm taking this opportunity to procrastinate and answer this 'ere question. Fun!

    So, the GTX 980 has a TDP of 165W. The GTX 780 Ti has a TDP of 250W. For the purposes of this calculation, the following assumptions will be made: the subject bought the GTX 780 Ti at full launch price; the subject bought the GTX 980 at full launch price; the subject sells the GTX 780 Ti at current eBay prices, using those funds to offset the cost of the GTX 980 purchase; the subject runs the card at full load on average four hours per day, every day; the subject has not changed any other aspect of the PC; electricity costs 13.52p per kWh as per the Energy Saving Trust; power draw at idle is identical (to make things easier).

    The GTX 780 Ti cost £559 at launch according to Bit-Tech's review, while the GTX 980 cost £429 from the same source. Current eBay prices put a second-hand GTX 780 Ti at between £280 and £340 depending on model; let's say £300 to make it easy. So, our subject goes into the equation with a £129 hole in his pocket for the upgrade (£429 minus the £300 he made back on the GTX 780 Ti; the £559 original purchase price of the Ti being considered a sunk cost, and the question under examination being "how long would it take to pay for the purchase of the GTX 980").

    There's an 85W difference between the TDPs, and therefore power draws, of the two cards. 85W over an hour is 0.085kWh, or 1.15p (rounded). There are 11,217 1.15ps (rounded) in £129. Therefore it would take 11,217 hours to pay back the £129 upgrade cost. At four hours a day, that's 2,804 (rounded) days; 400 (rounded) weeks; 7.7 (rounded) years. Given that no manufacturer is offering more than, what, a five-year warranty, that sounds to me like a losing proposition.

    Obviously, the figures are different if you're running the card full-pelt 24/7 (Litecoin mining, for example). Then you'd reach break-even in a year and a third. But for a gamer? Yeah, financially it doesn't make sense.
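
    The same sums in a few lines of Python, for anyone who wants to plug in their own figures (the tiny difference from the numbers above comes from not rounding the per-hour saving to 1.15p):

        # Break-even on a GTX 780 Ti -> GTX 980 swap, power savings only.
        tdp_diff_w = 250 - 165        # TDP difference in watts
        price_per_kwh = 0.1352        # GBP per kWh (Energy Saving Trust)
        upgrade_cost = 429 - 300      # GTX 980 price minus 780 Ti resale, GBP

        saving_per_hour = tdp_diff_w / 1000 * price_per_kwh  # ~GBP 0.0115/hour
        hours = upgrade_cost / saving_per_hour               # ~11,226 hours
        print(f"{hours / 4 / 365:.1f} years at 4h/day")      # 7.7 years at 4h/day
        print(f"{hours / 24 / 365:.1f} years at 24/7")       # 1.3 years at 24/7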
     
  17. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    Um... I am very tired, having not slept at all the last two nights, so maybe I am missing something - and thanks for doing that, Gareth - but can you explain where £170 came from and not £130? (£429 - £300.)

    Ha ha, I see the edit. :)
     
  18. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,284
    Likes Received:
    891
    I think you have misunderstood my point. I'm not going to replace a perfectly functional GPU just for the sake of reducing power consumption. On the other hand, should something terminal happen to my current GPU, I would look to replace it with a card of similar horsepower but much lower power consumption, rather than a card with considerably more grunt but the same power consumption. Does that make sense?
     
  19. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    Anyway, that all backs up my original comments: while efficiency is becoming more of a key factor, it is still only one of many things to consider and is not all-defining by itself.

    It certainly doesn't lend itself to cool bragging rights... That's for sure.
     
  20. SchizoFrog

    SchizoFrog What's a Dremel?

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    Yes, of course - as I already said, I would understand it under those circumstances. People do upgrade for every little bit of extra performance they can get, whereas they are unlikely to follow a similar trend regarding efficiency. So, as the title asks, 'Is power efficiency the new must-have?', my conclusion is ultimately no.
     