Discussion in 'Article Discussion' started by Dogbert666, 17 Nov 2014.
Can't wait to see the potential of full-fat Maxwell...
It's the main reason I won't be a bit impressed with any performance rumours regarding AMD's new 390 cards unless and until I see they are within 10% of Maxwell power figures.
I'd only consider moving to AMD if I could comfortably power a crossfire rig with their second tier high end card on my 650w PSU, as I can do with a couple of Nvidia 970's.
Power efficiency now goes hand in hand with the raw power of the GPU. On one hand we want 4K, 5K and beyond, but a scalable architecture also has to be energy-friendly at the top end (energy prices are rising faster than seems sustainable, so we want to use less power just to keep our spending level, which matters even more for multi-GPU supercomputers) and on mobile platforms, where performance with the longest battery life is the holy grail of GPU design.
Maxwell has used its mid-range GPU, GM204, to bring a performance increase over the Kepler-based 780 with excellent power-saving characteristics on the same 28nm process. GM200 will unleash its full power (and price), but it will chomp up the 100W advantage it has over the 290X until it's released on 20nm.
Interesting times indeed and I can't wait to see AMD's new GPU.
My SFX PSU is a 450W model, so low wattage cards are the way forward for me.
As per the title, no is my answer. Efficiency is important for sure but it is not the 'must-have' feature. The reason I say this is that efficiency alone is not enough for people to upgrade and I honestly do not believe anyone would pay out for a brand new GPU if it only gave the same performance but was more efficient.
I also believe that it is hard to dissect the topic this way as AMD and nVidia will never release a card that is only more efficient. When they develop a super efficient GPU they will also use that efficiency to raise the performance.
Low power and performance were both priorities for me, being a gamer who is tight-fisted, so the MSI GTX 970 4G was the winner that outdid my AMD 5870 on both counts.
And EK have just announced a waterblock for my 970! So now I have to work out whether the card will still draw the same power once I hook the EK water block up to a water-cooling loop.
My 2p: I run a 1200p monitor so I don't need more GPU grunt, but if I can achieve the same level of performance in a smaller form factor, with less heat and noise, then I'll be a happy camper. That's the way forward for me, rather than more raw rendering power.
Goes for CPUs as well, tbf.
Personally I'm more looking forward to seeing how AMD's Fiji XT pans out. Rumours suggest it's going to come with 3D-stacked memory (higher bandwidth, lower power), a 4096-bit memory bus (more addressable memory for 4K), and the rumoured 4096 stream processors would be a 45% increase over Hawaii.
Going on rumours, it seems AMD are aiming squarely at a single-card solution for gaming at 4K, something that GM200 (full-fat Maxwell) may struggle with. People wanting to game at 4K probably won't care much about heat, power draw, or noise.
Please don't get me started on these water blocks. Why is it that unless you have the top-end card, it is pretty much impossible to get a block that is actually FULL cover and matches the size of the PCB? I hate the exposed-PCB look.
Would you really spend money on brand new hardware just to match the performance of what you already had, albeit with better efficiency?
How long would it take for that better efficiency to pay for the purchase of a GTX980 if you were replacing a GTX780 Ti, for example?
I can understand it if a GPU needs replacing due to a failure or similar, in which case you might go for similar power and end up with a cheaper and more efficient GPU (like anyone whose GTX480 burned out buying a 750 Ti instead), but to replace perfectly working hardware?
Total performance of graphics card = performance/watt * watts.
Hence even on desktop AMD are screwed because they are so far (two generations) behind on perf/watt. Nvidia can always beat AMD by bringing out a card with more watts, since their performance/watt is much better; if both end up with cards at the same power usage, Nvidia wins easily. AMD, on the other hand, has to stop increasing power at some point: the current crop of 300W cards is already pretty silly and very hard to cool.
That's not even looking at the very large gaming laptop market, which is heavily focused on perf/watt. AMD hasn't had many wins there since Kepler came out (which is still more efficient than GCN, hence the near-lockout Nvidia has on discrete gaming laptops).
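The identity a couple of posts up can be sketched in a few lines. The perf-per-watt numbers below are made up purely to illustrate the shape of the argument; they are not measured figures for any real card:

```python
# Toy illustration of: total performance = perf/watt * watts.
# The perf-per-watt values are hypothetical, chosen only to show why the
# less efficient vendor runs out of road at the ~300W cooling ceiling.

def total_perf(perf_per_watt: float, watts: float) -> float:
    return perf_per_watt * watts

efficient = total_perf(perf_per_watt=1.5, watts=165)    # efficient design
inefficient = total_perf(perf_per_watt=1.0, watts=250)  # keeps up on raw watts

# The efficient vendor can always answer by spending the same watts...
scaled_up = total_perf(perf_per_watt=1.5, watts=250)

# ...while the inefficient one is already near the practical power ceiling.
ceiling = total_perf(perf_per_watt=1.0, watts=300)

print(efficient, inefficient, scaled_up, ceiling)
```

With these toy numbers the 250W card only just edges out the 165W one, and the efficient design wins comfortably the moment it is allowed the same power budget.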
Because bottom tier cards have a tiny power draw and don't need to be watercooled.
Parge, you have totally missed my point, not to mention that water cooling is not only about performance, but about looks too.
Compare these two images that are for water blocks for a 290X:
Same card but one block covers the full PCB while the other stops short. At least with high end cards you have the option of one or the other but as far as I have seen, as soon as you drop down a level (GTX770 and now GTX970) you no longer have the option of the extended version.
I personally think it started with the GTX670/680; then NVIDIA jumped ahead with the ultra-high-end (while still keeping the cards below the GTX780 in check) and then improved everything in the GTX9xx series. And yes, I like the fact that they are going lower in power figures. It means quieter cooling, lower power consumption and less heat to dissipate.
I have *loads* of work to do today, so naturally I'm taking this opportunity to procrastinate and answer this 'ere question. Fun!
So, the GTX 980 has a TDP of 165W. The GTX 780 Ti has a TDP of 250W. For the purposes of this calculation, the following assumptions will be made: the subject bought the GTX 780 Ti at full launch price; the subject bought the GTX 980 at full launch price; the subject sells the GTX 780 Ti at current eBay prices, using those funds to offset the cost of the GTX 980 purchase; the subject runs the card at full load on average four hours per day every day; the subject has not changed any other aspect of the PC; electricity costs 13.52p per kWh as per the Energy Saving Trust; power draw at idle is identical (to make things easier).
The GTX 780 Ti cost £559 at launch according to Bit-Tech's review, while the GTX 980 cost £429 from the same source. Current eBay prices put a second-hand GTX 780 Ti at between £280 and £340 depending on model; let's say £300 to make it easy. So, our subject goes into the equation with a £129 hole in his pocket for the upgrade (£429 minus the £300 he made back on the GTX 780 Ti; the £559 original purchase price of the Ti being considered a sunk cost, and the question under examination being "how long would it take to pay for the purchase of the GTX 980").
There's an 85W difference between the TDP, and therefore power draw, of the two cards. 85W over an hour is 0.085kWh, or 1.15p (rounded). £129 divided by 1.15p gives 11,217 (rounded) hours to pay back the £129 upgrade cost. At four hours a day, that's 2,804 (rounded) days; 400 (rounded) weeks; 7.7 (rounded) years. Given that no manufacturer is offering more than, what, a five-year warranty, that sounds to me like a losing proposition.
Obviously, the figures are different if you're running the card full-pelt 24/7 (Litecoin mining, for example). Then you'd reach break-even in a year and a third. But for a gamer? Yeah, financially it doesn't make sense.
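The sum above is easy to reproduce. The figures are the ones quoted in the post (TDPs, launch and resale prices, Energy Saving Trust tariff), and the four-hours-a-day load figure is the poster's assumption; keeping the per-hour saving unrounded gives ~11,225 hours rather than 11,217, the same conclusion either way:

```python
# Payback time for replacing a GTX 780 Ti with a GTX 980, using the
# thread's figures. Only the constants below go into the calculation.

TDP_OLD_W = 250                 # GTX 780 Ti TDP
TDP_NEW_W = 165                 # GTX 980 TDP
UPGRADE_COST_GBP = 429 - 300    # 980 launch price minus 780 Ti resale = 129
TARIFF_GBP_PER_KWH = 0.1352     # 13.52p per kWh

saving_kw = (TDP_OLD_W - TDP_NEW_W) / 1000           # 0.085 kW saved at load
saving_per_hour_gbp = saving_kw * TARIFF_GBP_PER_KWH

hours = UPGRADE_COST_GBP / saving_per_hour_gbp       # full-load hours to break even
years_gaming = hours / 4 / 365                       # at 4 h/day
years_mining = hours / 24 / 365                      # flat out, 24/7

print(f"{hours:,.0f} h to break even; {years_gaming:.1f} y at 4 h/day; "
      f"{years_mining:.2f} y at 24/7")
```

Swap in your own tariff and hours-per-day to see how (in)sensitive the conclusion is; at realistic gaming loads it stays a losing proposition.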
Um... I am very tired having not slept at all the last two nights, so maybe I am missing something, and thanks for doing that Gareth, but can you explain where £170 came from rather than £129 (£429 minus £300)?
Ha ha, I see the edit.
I think you have misunderstood my point. I'm not going to replace a perfectly functional GPU just for the sake of reducing power consumption. On the other hand, should something terminal happen to my current GPU, I would look to replace it with a card of similar horsepower but much lower power consumption, rather than a card with considerably more grunt but the same power consumption. Does that make sense?
Anyway, that all backs up my original comments: while efficiency is becoming more of a key factor, it is still only one of many things to consider and is not all-defining by itself.
It certainly doesn't lend itself to cool bragging rights... That's for sure.
Yes, of course, as I already said I would understand it under those circumstances. People do upgrade for every little bit of extra performance they can get, whereas people are unlikely to follow a similar trend regarding efficiency. So as the title asks 'Is power efficiency the new must-have?', my conclusion is ultimately, No.