
We've just witnessed the last days of large, single chip GPUs

Discussion in 'Article Discussion' started by Sifter3000, 31 Mar 2010.

  1. Sifter3000

    Sifter3000 I used to be somebody

    Joined:
    11 Jul 2006
    Posts:
    1,766
    Likes Received:
    26
  2. Fizzban

    Fizzban Man of Many Typos

    Joined:
    10 Mar 2010
    Posts:
    3,691
    Likes Received:
    275
    Cooler-running, less power-hungry GPUs are the way forward, at least as far as the mainstream consumer is concerned. The less power we use, the more money we save. Unless Nvidia can make a massive jump in performance, their cards are just pointless in most day-to-day rigs. Unless you fold or use CUDA-optimised programs, it's just a waste of money twice over.

    Makes me chuckle... only Nvidia could make a 40nm-based card more power-hungry and hotter than its predecessor.
     
  3. _Metal_Guitar_

    _Metal_Guitar_ What's a Dremel?

    Joined:
    16 Jun 2009
    Posts:
    129
    Likes Received:
    1
    "Like so many of us, I never want to see PC gaming die, but in my opinion the days of multi-billion transistor single chip graphics cards are practically over"

    What has a change to multi-chip GPUs got to do with PC gaming dying? If all they make are multi-chip GPUs, wouldn't support for them just get better?
     
  4. Cyberpower-UK

    Cyberpower-UK Professional Overclocker

    Joined:
    6 May 2009
    Posts:
    211
    Likes Received:
    0
    As CrossFire and SLI have matured, the multi-GPU argument that ATI put forward with the launch of the 3870 X2 is making more sense. High-end cards rarely come close to 100% scaling due to CPU and memory limitations, especially in non-overclocked systems, but lower-end cards in CF and SLI systems with an overclocked CPU can often challenge the top-end single GPU and cost less: for example, a pair of 5770s can compete with a 5870, and three nip at the heels of the 5970. Cards like the 4870 X2 and GTX 295 have forced games developers to ensure their products make good use of multi-GPU systems, which is beginning to erode outdated prejudices against SLI and CF, in the same way that the prevalence of multi-threading in games slowly eroded the fast-dual-core vs quad-core argument.
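    A back-of-the-envelope way to sanity-check that value argument, as a minimal sketch in Python. Every number below (the normalised performance figures, the prices, the 0.85 per-extra-card scaling factor) is an illustrative assumption, not benchmark data:

    Code:
    # Rough CF/SLI value comparison. All figures are illustrative
    # assumptions (normalised performance, prices, scaling), not benchmarks.
    SINGLE = {"HD 5770": (100, 130), "HD 5870": (190, 320), "HD 5970": (290, 500)}

    def multi_gpu_perf(base: float, n: int, scaling: float = 0.85) -> float:
        """Effective performance of n cards: each extra card adds a scaled share."""
        return base * (1 + (n - 1) * scaling)

    perf_5770, price_5770 = SINGLE["HD 5770"]
    for n in (2, 3):
        print(f"{n} x HD 5770: ~{multi_gpu_perf(perf_5770, n):.0f} perf "
              f"for about £{n * price_5770}")
    for card, (perf, price) in SINGLE.items():
        print(f"1 x {card}: {perf} perf for about £{price}")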
     
  5. Guest-16

    Guest-16 Guest

    More money has to be put into driver development and more money has to be put into graphics card design. With the cards stacked against PC gaming already in some respects, and each generation of graphics cards having less and less of a performance jump from the last - are we already hitting a wall?
     
  6. Tyrmot

    Tyrmot Minimodder

    Joined:
    12 Mar 2008
    Posts:
    309
    Likes Received:
    1
    Pretty sure I remember people saying the same thing about the big G80 core when that came out too...
     
  7. Xir

    Xir Modder

    Joined:
    26 Apr 2006
    Posts:
    5,412
    Likes Received:
    133
    Haven't CPUs come back from putting two chips in one package, and aren't they producing large, single chips again?
    And why is it Global Foundries to Nvidia's rescue? (ATI would rejoice, I guess; GloFo is still full of "old" AMD people.)
    What node is GloFo at? 45nm? (Opterons, mostly, I guess.)
    Even Intel is only at 32nm for something as complex as a processor (which a GPU is, if not more so).
     
  8. Blademrk

    Blademrk Why so serious?

    Joined:
    21 Nov 2003
    Posts:
    3,988
    Likes Received:
    86
    Really? That's a shame :( They were always the cards I looked for first.
     
  9. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,996
    Likes Received:
    714
    'tis a shame, multi-GPU is way too dependent on drivers.

    How about a bus designed specifically for multi-GPU? I like the 4870 X2's sideport bus, and I think that's the way forward for higher/better performance.
     
  10. fingerbob69

    fingerbob69 Minimodder

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    Where is the impetus to keep designing faster and faster GPUs? Unless and until developers bring us games and applications that stretch and even surpass current GPU capabilities, all we will see for quite a while are Fermi-small steps in graphics performance. There need to be at least a dozen, if not more, games like Crysis that make GPUs bleed, so that even a basic £100 card has to deliver 5870-level quality to keep games playable.

    It is unfortunate that we are unlikely to see any of this until after a new generation of consoles comes out and raises the floor under games graphics. PC gaming isn't dying; it just no longer leads.
     
  11. uz1_l0v3r

    uz1_l0v3r What's a Dremel?

    Joined:
    22 Sep 2009
    Posts:
    198
    Likes Received:
    2
    I don't understand how the disappointment of the Fermi core equates to the death of single-chip GPUs. Dual-GPU cards are total overkill in today's gaming market; the average gamer simply does not need one. Why would anyone in their right mind spend £400-500 on a dual-GPU graphics card when single-GPU cards are perfectly adequate? I'm playing on a year-old GTX 275 and still cranking every game to the max.
     
  12. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,956
    Likes Received:
    17
    I hope not. I'd much rather have a single GPU than need to rely on drivers to get good SLI/CF performance.
     
  13. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    The problem is that if Nvidia focuses on CUDA, as most expect (CUDA makes up 50-60% of its profits, according to reports), then ATI will have no competition, and without competition there's no real will to push the graphics boundary further.

    Yes, I prefer a single GPU and always will, but like the blog's author I think we have seen the last of them.
     
  14. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,956
    Likes Received:
    17
    Question: could Microsoft build general multi-GPU support into the next API (DirectX 12)? That would be much better than needing specific game profiles in the driver.
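    To make the distinction concrete, here's a hypothetical sketch of what explicit, application-side multi-GPU might look like, as opposed to per-game profiles guessed at by the driver. Every class and function name here is invented for illustration; this is not any real graphics API:

    Code:
    # Hypothetical sketch: the application, not the driver, performs
    # alternate-frame rendering (AFR) across whatever GPUs the API exposes.
    # All names below are invented for illustration.
    class Device:
        """Stand-in for one GPU exposed by a (hypothetical) vendor-neutral API."""
        def __init__(self, index: int):
            self.index = index

        def render(self, frame: int) -> str:
            return f"frame {frame} rendered on GPU {self.index}"

    def enumerate_devices() -> list[Device]:
        # A real API would query the adapters present; assume two here.
        return [Device(0), Device(1)]

    devices = enumerate_devices()
    for frame in range(4):
        # The app decides which GPU gets each frame - no driver profile needed.
        gpu = devices[frame % len(devices)]
        print(gpu.render(frame))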
     
  15. D-Cyph3r

    D-Cyph3r Gay for Yunosuke

    Joined:
    31 Aug 2008
    Posts:
    925
    Likes Received:
    41
    Or just use a single 5870 and get 85% of the performance...

    Anyway, no, Nvidia won't learn from this because they still think Fermi is the best thing since sliced bread. Jen-Hsun Huang is borderline delusional in his own-brand fanboyism; hell, he still thinks Nvidia "makes the best chipsets in the world"...
     
  16. technogiant

    technogiant What's a Dremel?

    Joined:
    2 May 2009
    Posts:
    323
    Likes Received:
    17
    I think there is a change of focus coming: perhaps Nvidia is going to concentrate more on the HPC market, and this will drive development, with a derivative of the HPC product used for gaming... very much as Fermi is.

    TBH, perhaps that's the way it should be... it has always struck me as a little frivolous that "gaming" should be a major driving force in computer hardware development.

    In fact, it may even be advantageous: regardless of the ebb and flow of demand for PC gaming hardware, there will always be demand from the HPC sector, so provided there is sufficient demand for a gaming derivative of an HPC product, PC gaming hardware will continue to develop.
     
  17. Sloth

    Sloth #yolo #swag

    Joined:
    29 Nov 2006
    Posts:
    5,634
    Likes Received:
    208
    That would certainly be nice. When/if multi-GPU moves forward and becomes more popular, it would seem there would be more support for it and therefore fewer issues. Look at 64-bit operating systems: running one used to cause some pretty big issues, but now that modern PCs are hitting RAM capacity limits there is a huge push towards 64-bit, and they are quite standard now, so people make new products to work with them. In much the same way, I assume game developers would start making games better suited to dual GPUs, APIs would be changed, and drivers could be developed with the sole intent of supporting dual-GPU cards.
     
  18. azrael-

    azrael- I'm special...

    Joined:
    18 May 2008
    Posts:
    3,852
    Likes Received:
    124
    I'm pretty sure Bindi, and others, are onto something. The days of huge monolithic GPUs are numbered. The graphics card industry is about to learn the same lesson the processor industry learned a few years back (mostly Intel, with the P4): you can only go so far with (huge) single-core designs.

    The future of GPUs clearly lies in smaller, more efficient multi-core designs, just as it does for CPUs. Yes, it'll take a paradigm shift, and the learning curve for optimising code for multi-core solutions might be a bit steep, but it's clearly the way ahead.
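    To illustrate the kind of paradigm shift meant here, a minimal Python sketch of the multi-core mindset: splitting an embarrassingly parallel job across worker processes. The image size and the toy per-pixel function are arbitrary assumptions:

    Code:
    # Minimal sketch: spread an embarrassingly parallel workload (shading
    # rows of pixels) across CPU cores. Sizes and the toy "shader" are
    # arbitrary assumptions, not a real renderer.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 640, 480

    def shade_row(y: int) -> list[int]:
        """Toy per-pixel work: a cheap stand-in for a real shader."""
        return [(x * y) % 256 for x in range(WIDTH)]

    if __name__ == "__main__":
        with Pool() as pool:  # one worker per core by default
            image = pool.map(shade_row, range(HEIGHT))
        print(f"shaded {len(image)} rows of {WIDTH} pixels")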
     
    Guest-16 likes this.
  19. fingerbob69

    fingerbob69 Minimodder

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    Seriously, folks... you gotta ask: why?

    Graphics cards with gaming as their main raison d'être? Games are (with the honourable exception of Crysis) eaten alive by most mid-range-and-up cards of the last two years (think 4890/GTX 275 or higher). It is game developers that have to develop games that DRIVE users to want to upgrade... ATI/Nvidia are on a lost cause if they think people will continually upgrade just to have the latest card while their existing card remains more than adequate. Nvidia have bought games to show off PhysX. Wasted effort! ATI (and Nvidia) should be paying developers to make games only properly playable on the best of the last-gen cards, so people buy the next gen and the next gen is worth developing.
     
  20. tad2008

    tad2008 What's a Dremel?

    Joined:
    6 Nov 2008
    Posts:
    332
    Likes Received:
    3
    Some of the issues with games and drivers come down to the way the drivers are coded; the rest lies in the hands of the developers and the code they write. Far too many people are too keen to point to bad drivers when a lot of the time the problem lies within the software itself, which is why on the PC platform we need patches, something consoles don't get because more time is spent testing for problems.

    It is a misconception that the sheer variety of hardware on the PC platform is to blame; in essence it simply comes down to drivers and the software that uses them. For a simplified illustration, consider Adobe's Flash, which is capable of running on any PC regardless of hardware.

    I do agree that multi-core GPUs are going to be the way forward, though ATI have shown that there is still some life left in single-core GPUs for a little while longer yet.
     