
News Nvidia announces world's 'most complex' GPU

Discussion in 'Article Discussion' started by brumgrunt, 18 May 2012.

  1. Ayrto

    Ayrto What's a Dremel?

    Joined:
    20 Jun 2007
    Posts:
    255
    Likes Received:
    3
    Would this have been the 680 had AMD released a beast?
     
  2. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Someone who has to reassemble the test card in the next five minutes, finds out the bolts have gone walkabout and the nearest supplier who might have some is probably half an hour away, and doesn't care if the holes get threaded to hell as long as it holds together for an hour or two of photos.
     
  3. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    So we all know how well Nvidia does with complexity. My guess is yields for this chip will be somewhere in the low 40% range, with the rest being failed chips, lol.
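    A rough way to sanity-check a figure like that is the classic Poisson die-yield model, yield ≈ exp(-defect density × die area). A minimal sketch in Python, assuming the reported ~550mm² GK110 die size and a guessed defect density for an early 28nm process (neither number comes from Nvidia):

    ```python
    import math

    # Poisson die-yield model: yield ~= exp(-D0 * A)
    # D0 = defect density (defects per cm^2), A = die area (cm^2).
    # Both inputs below are assumptions for illustration only.
    die_area_cm2 = 5.5       # ~550 mm^2, the reported GK110 die size
    defect_density = 0.15    # guessed defect density for an early 28 nm process

    yield_estimate = math.exp(-defect_density * die_area_cm2)
    print(f"Estimated yield: {yield_estimate:.0%}")   # roughly 44%
    ```

    With those assumptions you do land in the low-40% ballpark; a worse defect density drops it fast, since yield falls exponentially with die area.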
     
  4. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    It's always good to remind ourselves that the current GTX680 was actually made from the replacement GPU for the GTX560, and for every GK104 GPU sold, Nvidia's margins go through the roof.

    I am surprised, though; this GPU sounds far too big for home computing, doesn't it? Well, some would argue no, but I really fear for the cost of this, and the cost to us, when it finally materialises into the real GTX580 replacement...

    Maybe that's another reason for having an artificially high price for the GK104 derivatives - best-case scenario, I don't expect to see much change from £550 when this is released in its gaming GPU form.

    Performance should be more than the GTX690, though. I just hope, sincerely hope, that AMD has a strong enough replacement for the 7970 ready to help push prices into the realm of the affordable, or provide a viable lower-cost alternative.
     
  5. K404

    K404 It IS cold and it IS fast

    Joined:
    11 Sep 2006
    Posts:
    408
    Likes Received:
    20
    Won't be out until the end of the year... I figure there will be something else for the desktop around Christmas, though.
     
  6. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    huh? I am not following you, sorry. :/


    Double huh?
    Sorry for my following remark if it is inappropriate, as I am not sure I follow you, but: Tesla GPUs aren't designed for video output. They are designed to be used as processors, to... well, process stuff. It's as if you added another GPU for PhysX, but instead of PhysX, it's for CUDA or any GPU-based processing. And it's specially designed for that, with no focus on gaming at all, nor even CAD software. It's really a GPU for research and simulations above anything else. It's like the "supercomputer" of GPUs, if you will.
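    For illustration, this is roughly what "using the GPU as a processor" looks like in practice: a kernel that never touches the display and just crunches an array. A minimal sketch using numba.cuda (any CUDA-capable card runs it; a Tesla simply does this kind of job without any display hardware at all - the kernel and array names here are made up for the example):

    ```python
    import numpy as np
    from numba import cuda

    # A compute-only kernel: no rendering, no video output, just number crunching.
    @cuda.jit
    def scale_and_add(a, b, out):
        i = cuda.grid(1)              # global thread index
        if i < out.size:
            out[i] = 2.0 * a[i] + b[i]

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)
    out = np.zeros_like(a)

    threads = 256
    blocks = (a.size + threads - 1) // threads
    scale_and_add[blocks, threads](a, b, out)   # runs on the GPU, result copied back
    print(out[:5])
    ```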

    Nvidia builds them themselves, as the market is very, very small. The reason Nvidia showcases it to a wider public is to show what Nvidia can do: that Nvidia does more than just GPUs, and that they are actively working on graphical and technological improvements.

    As disgust, the GK110 is the "true" Kepler. The GK104 is the cut-down model. A game that BOTH AMD and Nvidia have played since day 1 is not to release their best stuff, else it's easy to beat, and then you come to a point where everyone loses, and you have 3-4 or even 5 years with no new GPUs while the companies are hard at work on a new architecture, as they are out of things to release in the meantime. 3dfx did that... it ended up with cancelled GPU after cancelled GPU, as they were unable to release something more powerful quickly enough to attract sales, especially since their stuff was expensive. And now they are gone.

    An example of this with Nvidia is when Nvidia released the GeForce 8800GT. The rest of the series was a magma-spilling GPU, and then you have, out of nowhere, the 8800GT, which is significantly cooler, cheaper and performs better. Then you have the 9000 series out, where the 9800 is practically the same performance as the 8800GT. Did Nvidia re-brand the 8800GT, or did they release it in advance under the 8000-series name to compete against AMD's offering, as Nvidia's sales were hurting at the time? Clearly you can see that the 9000 series was done shortly after the 8000 was out. It was quick like this because the architecture is based on the 8000 series. They were tweaking it (changing part of the GPU architecture) in a way to solve the 8000 series' problem (heat).

    In this case, Maxwell is far from done (as we can see), so Nvidia needs a model in between to counter AMD's reply.

    If AMD covered the cost of engineering a new GPU, and had the budget to work on the next one, and no one (AMD and their card builders) wanted to make money, then yes, they could sell their highest-end GPU at $30, if not less.
     
  7. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Fascinating, 7 billion transistors? I wonder how they'll cool these monsters, because that's quite a bit of silicon and heat output, especially in terms of density.

    I doubt they'll ever actually release a version of this, unless of course they release it GTX285-style or 7900/7950-style (G71), where they shrink and tweak the die and release it as the next gen.

    Actually, I'd say it's pretty brilliant of them to make a derivative that's more shader-focused than compute-focused for the gamers. It makes for a cheaper card (the cost of the GTX 480 comes to mind; it was both, and it brute-forced its way through) and a cooler one, since there's less compute but more shader power.

    Speaking of which, I need to try out a Green Team card; I haven't done so in ages. The last card I had from the Green Team was a 7600GT.
     
  8. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    @goodbytes :read: - you're clearly looking for a reason to get your own back after your Vista-isms, right? Let me help you in your time of "huh" and "double huh".

    Disgust? You meant discussed right?

    I will respond in reverse order, because one is easier to answer than the other, incredibly OT, dribble.

    To the Double HUH: GK110 is not only the base GPGPU for the parallel-compute K20, but the GK110 will also be the Kepler high-performance desktop chip - I accept I did not make it clear that they won't just take the K20 chip and put "TWIMTBP" on the cover, but a GK110 derivative will be used for the high-performance desktop GPU... the mistake I made was assuming you would already know this.


    In response to the 'huh':

    Any GPU with GK in front of it means it's Kepler, right (GF = GeForce Fermi, GK = GeForce Kepler)? So I trust that when you say 'true' Kepler, you actually refer to the high-performance model based on the Kepler architecture? Which is what I posted and what others had also inferred in previous posts.

    What does "magma spilling" mean when you refer to other GPU's after the 8800GT - The 9800GT is identical to an 8800GT, although some were manufactured using a 55 nm technology instead of the 65 nm first debuted on the 8800GT. The 55 nm version supports HybridPower while 65 nm doesn't. The G92 design has been later re-badged for a second time and sold as GTS 250 and later, a third time only by OEMs as the GT 330.... as for the heat, most GPU's on 65nm we're pretty toasty, but most came with good enough cooling to overcome that, although I am sure you are right, some people upgraded from 8800GT to 9800GT because of it (although a change from the cooler may of sufficed)....

    It's fair to say, though, that each revision was optimised from an existing architecture which in itself was revolutionary at its time; the G80 GPGPU has its place in the 'greatest moments of GPU evolution'.

    Back from your history lesson and into May 2012 and the original post:

    The GK104 was originally meant to be the GTX660, but was re-badged because they found it outperformed the 7970 (that, and the GK110 was nowhere near ready). The GK110 is (from an architecture PoV) a replacement for the GF110, which was the GPU used for the GTX580 - the desktop derivative will be called something like GK111 or GK110.X. Hope this helps clear up your confusion.

    Summary:
    I see your point about the 8800GT - but it has no relevance to what I was saying - that version of the G80 was always going to be the 8800GT, and subsequent revisions were always destined to be that model and that reference's successor. The GK104 was designated as such to be the replacement for the GF104 GPU which was used in the GTX560, so the GK104 was to be the GTX660 - the natural successor by designation and chip design (mid-range Fermi to mid-range Kepler) - and that is not the case: Nvidia upgraded its name for all the reasons stated in my posts.

    PS: In late 2000, not long after Voodoo 4's launch, several of 3dfx's creditors decided to initiate bankruptcy proceedings. 3dfx, as a whole, would have had virtually no chance of successfully contesting these proceedings, and instead opted to be bought by Nvidia - they're not gone, they are like ATI - absorbed into larger entities.
     
    Last edited: 21 May 2012
  9. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    Aye Kenny mate, probably the desktop version of the GK110 (which the media suggests will probably be called the GTX780) and something from ATI perhaps (I really hope so)?

    In terms of value, anything released in Q4 is such bad value; prices drop dramatically halfway through the following quarter. As exciting as the products may be, patience really does pay off.

    PS: Come back we miss you :)
     
  10. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Well then, Maverik, I'd say they did a damned good job, if their midrange chip propelled itself that high.

    Mind you, the GK110 at this point should be released either as an improvement (not likely) or improved and then released as the G?1xx. In other words, the next gen is going to be a GK110 on crack, refined a tad.
     
  11. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    I agree the GK104 is mighty impressive, but the 20% performance increase for the high-performance chipset over the previous generation is, by what we have been used to in recent generations (at least a 50% increase over the outgoing model), a tad underwhelming. In today's games there's not much reason for gamers with a GTX 580 running at 1920x1200 to upgrade.

    I am sure the GK11X gamers' GPU will be exactly as you described, but also, because of that, it's unlikely to have a 600-series badge on it.
     
  12. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    I am just wondering why you guys think the GTX780 will be a downsized version of GK110? It will more likely be an upgraded version of GK104 instead.
    Which is easier to do:
    a) add two or four more blocks of what you can find in the GTX680 - the GTX680 has 8 of them, the GTX670 has 7 (see the rough arithmetic sketch below):
    http://www.geforce.com/Active/en_US.../introducing-the-geforce-gtx-680-gpu/Die2.png
    b) take the GK110, which is not designed for gaming and carries features useless for gamers, cut down the features you don't need for gaming (but which are important for CUDA), and add the features you do need for gaming (video outputs?).

    In the end, the GTX780 will more likely be based on GK104 than on GK110. But I won't stop you dreaming about a GK110 gaming card; it just probably won't happen.
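    To put rough numbers on option a): each of those blocks is an SMX carrying 192 shader cores on GK104, so adding blocks scales the shader count roughly linearly. A back-of-the-envelope sketch (the 10- and 12-SMX parts are hypothetical, just to show the scaling):

    ```python
    # Shader count as a function of SMX block count on GK104-style chips.
    CORES_PER_SMX = 192   # GK104 SMX size

    configs = {
        "GTX 670 (7 SMX)": 7,
        "GTX 680 (8 SMX)": 8,
        "hypothetical 10-SMX part": 10,
        "hypothetical 12-SMX part": 12,
    }

    for name, smx in configs.items():
        print(f"{name}: {smx * CORES_PER_SMX} CUDA cores")
    # GTX 680: 8 * 192 = 1536 cores; a 12-SMX part would come to 2304.
    ```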
     
  13. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    I bet you 20 'Scott Mills Points' that the next flagship performance gaming GPU will be based on a GK110 - let's wait and see :)
     
  14. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    I don't think NVIDIA will have a hard time choosing between a limited number of GeForce GK110-based cards sold at $500 and a limited number of Tesla GK110-based cards/systems sold for thousands of dollars (just check out the prices of the Fermi-based Tesla cards). And considering GK110-based Tesla cards are already ordered in quantities which will probably only be fulfilled in 2013, the chance of a GK110-based GeForce is very slim. First because of the limited number of chips available at all, then because the chip has features useless for a GeForce card that would have to be disabled anyway to stop it competing with the Tesla cards...

    The only real option for GK110-based GeForce cards is if NVIDIA decides to use not-fully-functional GK110 chips for them, let's say with 1 of the 5 blocks disabled because of damage. That would still give them 4x3=12 working blocks and a 256-bit memory interface. But again, you hit the issue of competing with the Tesla cards, because why would they use those damaged cores for GeForce when they could just use them for the lower versions of the Tesla cards (which usually have fewer cores than the high-end model)?

    So, if GK110 is not going to make it into a GeForce, what options do we have? Take the huge GK110 and cut out the compute features and cores which would make it too powerful; or take the optimized GK104, increase the block count from 8 to 12, increase the memory interface from 256 to 384 bit, optimize it a bit more, and end up with a ~400mm² chip?
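    As a very rough sanity check on that ~400mm² figure: GK104 is about 294mm², and if the SMX count and memory interface both grow by 1.5x while the fixed-function parts (display, PCIe, etc.) stay put, you land between the unscaled and fully scaled extremes. The split between scalable and fixed area below is a pure guess, for illustration only:

    ```python
    # Rough die-area estimate for a scaled-up GK104 (the split is a guess, for illustration only).
    gk104_area = 294.0        # mm^2, approximate GK104 die size
    scalable_fraction = 0.8   # assumed share of the die that grows with SMX / memory-controller count
    scale = 12 / 8            # 8 SMX blocks -> 12 SMX blocks (and 256-bit -> 384-bit memory)

    scaled_area = gk104_area * (scalable_fraction * scale + (1 - scalable_fraction))
    print(f"~{scaled_area:.0f} mm^2")   # ~412 mm^2, in the ballpark of the ~400mm² above
    ```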
     
  15. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    Some interesting points there for sure, and it's a compelling argument.

    Although Quadro and FirePro GPUs were (I think) based on the same tech as the gaming GPUs, and they were considerably more expensive too - so we won't know until Q4 this year, and both camps have made compelling arguments for and against a gaming GeForce GPU based on GK110... my money is on it being so.
     
  16. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    The way it usually works is that Nvidia/AMD makes the big chip, with everything in it, and they make the circuit board super fancy, super high-quality, with all the fancy features, like ECC memory. Once that's done, they remove everything not needed for gaming, cut the component quality, simplify the circuitry to make it less reliable but much cheaper to produce, and so on, and call it their high-end Radeon/GeForce. That is why Quadro/Tesla/FirePro cards are more expensive (of course, due to reduced demand, they also cost more to make, and Nvidia/AMD charges a lot more, to maximize revenue and try to pay back a large part of the R&D; the GeForce/Radeon pays back the rest... and hopefully some profit at the end of the cycle).
     