
Blogs: We've just witnessed the last days of large, single-chip GPUs

Discussion in 'Article Discussion' started by Sifter3000, 31 Mar 2010.

  1. metarinka

    metarinka New Member

    9 Feb 2003
    By multi-GPU do we mean multiple discrete GPUs, as on any X2-type card, or multiple discrete cores on a single die?

    Graphics processing is already an extremely parallelisable task. That's why GPUs have hundreds of stream processors and shader units. Those channels already act as a "multi-GPU" solution with shared resources like memory links, hence any modern GPU is already the equivalent of a 100+ core processor (mind you, with a specific instruction set and shared memory).

    Correct me if I'm wrong, but what is the benefit of going to multiple discrete dies instead of using one die with twice the transistors, unless we're talking about heat and yields?
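The parallelism point above can be sketched in a few lines: each pixel can be shaded independently, so workers need no coordination. This is an illustrative sketch only; the `shade` function and the thread pool are stand-ins, not how real shader hardware is programmed.

```python
# Graphics work is "embarrassingly parallel": every pixel can be processed
# independently. A thread pool stands in here for an array of shader units.
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    # stand-in for a per-pixel shader program (hypothetical)
    return pixel * 2

pixels = list(range(1024))
with ThreadPoolExecutor(max_workers=8) as pool:
    # each "shader unit" takes a slice of pixels with no coordination needed
    shaded = list(pool.map(shade, pixels))
print(shaded[:4])  # [0, 2, 4, 6]
```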
  2. dec

    dec [blank space]

    10 Jan 2009
    Multi-GPU = 4870 X2, 3870 X2, etc.

    Multiple discrete dies let pretty much everyone in the manufacturing process save a bit of time and money, as it's easier to design one GPU and slap two of them onto a PCB than to do two totally different GPUs. Take Cypress, for example.

    Currently it's this:
    5970 = two 5870s on the same PCB
    5870 = 20 SIMD clusters enabled
    5850 = 18 clusters enabled
    5830 = 14 clusters enabled

    And so on. They're all the same GPU with different parts disabled (except the 5970). If they wanted to, they could call a 5870 a 5970, a 5850 a 5870, and so on, and achieve a similar result to what you were talking about. But the reason they don't take the transistor count and shader count of the 5970 and force it into a 5870 die is that the stupid thing would pull a GTX 480/470 (cook itself and black out the whole of New York).

    It seems like ATI waits for a die shrink before doing that, since the 5870 matches the 4870 X2 for shader count but runs cooler and draws less power.

    Now on to the topic of the blog.

    Big GPUs won't be going anywhere for a while. As long as there are people, stuff like Fermi will keep happening, simply because everyone will be saying "I hope that new process can keep this thing from flopping" when the process is a few years off to begin with. There will still be big single GPUs to power the PlayStations and Xbox 360s of the future. However, demand for cool, electricity-bill-friendly GPUs will grow, especially once Flash and web browsing become GPU accelerated.

    Fermi + 28nm > 5970?
  3. Elton

    Elton Officially a Whisky Nerd

    23 Jan 2009
    I think the large GPU already died with the 8800 Ultra, and even more so with the GTX 280.

    Not only are the processes getting increasingly smaller, but the industry is stagnating in terms of graphical prowess in games. I think many already foresaw that GPU size could only grow until there was no more room, and at this point there isn't any.

    Think of CPUs now: two cores on one die...
  4. iwod

    iwod New Member

    22 Jul 2007
    Fermi has other problems as well. Between now and 28nm, a few Fermi respins will fix the yield and power/heat issues, or at least improvements will be made. Fermi may have to move to a 256-bit memory controller, because most games don't cope well with non-standard memory size configurations; 384-bit only gets you higher bandwidth plus wasted memory (unless Nvidia can push the industry to support those memory sizes). While it would be great to see a 512-bit GDDR5 controller, there seem to be yield and die-space issues with it. 256-bit GDDR5 at 5GHz would not be enough for Fermi, and 7GHz or higher parts are not widely available yet. And if Nvidia could make Fermi run at higher frequencies, it would definitely hit the bandwidth wall.
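As a back-of-the-envelope check on these figures, peak GDDR bandwidth is just bus width (in bytes) times the effective data rate. A minimal sketch, plugging in the rates mentioned above:

```python
# Peak theoretical GDDR bandwidth: bus width (bits) / 8 * effective rate (GT/s).
def bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_width_bits / 8 * data_rate_gtps

print(bandwidth_gbps(256, 5.0))  # 160.0 GB/s -- 256-bit GDDR5 at 5 GT/s
print(bandwidth_gbps(384, 5.0))  # 240.0 GB/s -- same memory on a 384-bit bus
```

This is why a narrower bus needs much faster memory chips to reach the same bandwidth: 256-bit would need 7.5 GT/s parts to match 384-bit at 5 GT/s.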
  5. Xir

    Xir Well-Known Member

    26 Apr 2006
    All current processors are multi-core... but on one die ;-)
  6. wyx087

    wyx087 Homeworld 3 is happening!!

    15 Aug 2007
    People have got to stop comparing multi-GPU to multi-core CPUs. GPUs are already multi-core; multi-CPU server platforms, such as a four-processor system, are what's comparable to multi-GPU.

    Multi-core is where a number of cores share the same cache and the same memory controller.

    Multi-processor is where there are a number of memory controllers, each with their own memory; nothing is shared except the IO of the system.

    Due to the shared nature of multi-core, current single GPUs can have very efficient scheduling, and data can be exchanged via the very fast cache. But with multi-GPU, data must travel through some form of bus connecting the chips, creating a bottleneck.
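The bottleneck argument can be illustrated with a toy latency model: moving data costs a fixed latency plus bytes over bandwidth, and the off-chip path is worse on both counts. The bandwidth and latency figures below are rough illustrative assumptions, not measurements:

```python
# Toy model for moving data between two compute units: fixed link latency
# plus bytes / bandwidth. All figures are illustrative assumptions only.
def exchange_time(bytes_moved, bandwidth_bytes_per_s, latency_s):
    """Seconds to move a chunk of data over a link."""
    return latency_s + bytes_moved / bandwidth_bytes_per_s

MiB = 1 << 20
on_die   = exchange_time(MiB, 500e9, 100e-9)  # shared on-die cache (assumed figures)
off_chip = exchange_time(MiB, 8e9, 10e-6)     # external PCIe-class bus (assumed figures)
print(off_chip / on_die)  # the off-chip path comes out dozens of times slower
```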
  7. rollo

    rollo Well-Known Member

    16 May 2008
    This assumes the console war continues; Sony and Microsoft have both not committed to next-gen consoles.

    Natal and the PlayStation Eye, I think it's called, are their next big ideas.

    Sony said they have a 10-year product life cycle. Does that mean it's 10 years before the PS4? Nvidia and Sony have a decent relationship, so you would assume they'll stick together.

    Graphics can't go that much higher before they're lifelike; there's only so much detail you can add before a game hits reality.

    Crysis is pretty close to it already. Characters can already show emotion.

    3D is the next big thing, but I don't think 3D is for the PC market. Most people are still on a 21-inch screen, if not below. How much further can graphics truly be pushed? Wipeout HD is still the best-looking game out there, with God of War 3 pretty close behind,

    both running on a 4-5 year old GPU.

    No game bar Crysis really requires a 5850 or above unless you go to crazy resolutions, which a bit-tech review showed very few people use.

    Most people are still at the 1680 mark, or even 1280x1024. Till everyone is at 1080p, I doubt graphics hardware will really be pushed.
  8. dec

    dec [blank space]

    10 Jan 2009
    The PS2 wasn't 10 years before the PS3.

    +1 To natal.

    If OLEDs ever get around to being sold like LCDs, it just may be possible to have a whole wall as your monitor. That oughta keep GPUs busy. Who needs Eyefinity? I've got a wall!
  9. dogknees

    dogknees New Member

    27 Dec 2008
    Many of you seem to be saying that games have reached their ultimate potential: that there is no possible room for improvement and no new paradigms to be explored.

    We haven't even scratched the surface of the possibilities! This is some of what I see in the future of gaming.

    Game physics is still incredibly immature. Simulation of liquids and gases and their interaction with the rest of the world is immature or non-existent. Where it exists, the methods are not really "physics" based, but done with a few formulae that capture the general behaviour without much of the subtlety of the real world. Particle systems still use fairly coarse approximations to model gases, and the simulation of chemical behaviour is only just appearing. Think about simulating game physics at the atomic level...
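A minimal example of the "few formulae that capture the general behaviour" approach described above: a per-frame particle update with crude gravity and ad-hoc damping. This is an illustrative sketch, not real fluid dynamics.

```python
# Particle update of the "coarse approximation" kind: constant gravity plus
# damping, applied per particle each frame. No pressure, viscosity or
# particle-particle interaction -- the subtleties real physics would need.
def step(particles, dt=0.016, gravity=-9.81, damping=0.99):
    for p in particles:
        p["vy"] += gravity * dt   # constant downward force, not a real force model
        p["vx"] *= damping        # ad-hoc damping stands in for drag
        p["vy"] *= damping
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt
    return particles

ps = step([{"x": 0.0, "y": 1.0, "vx": 1.0, "vy": 0.0}])
print(ps[0]["y"])  # slightly below 1.0 after one 16 ms frame
```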

    Modelling is still at the level feature films were a decade or more ago. They are using models with billions of polygons. Gaming models are a joke in comparison. The use of tessellation in the new generation chips is a great step, but only one more step along a very long path. The sort of model detail seen in Avatar is where gaming will hopefully be in the medium term.

    Game AI is also in its infancy, even compared to the current state of the art, but research in AI in general is accelerating at least as fast as the rest of the information industry, and game designers will keep racing to use whatever the researchers come up with. We will see smarter, more subtle opponents and allies. They'll have more detailed memories and intelligent behaviours, both individually and cooperatively, and the ability to come up with novel solutions to problems.

    I can picture something like a simple field in spring in a future game. You'd be seeing literally billions of individually animated blades of grass, leaves, flowers and so on. All of them would move independently in the breeze as they do in the real world. There would be dew on the grass, and if you looked closely, all of it would be refractive. The soil the plants are growing in would consist of separate grains of sand, bits of organic matter and small stones. Your footprints in the soil and the crushed bits of grass would be the same, with the blades of grass slowly straightening up after you pass. The dew would be making your trouser legs damp, causing them to cling to your legs a bit as you walked. The field would also be populated with the variety of insects and other small animals you would see in a real field. These would all act like their real cousins, eating the grass or each other.

    That's just what I came up with in 5 minutes, without even considering actors, their behaviour and their interactions with the environment, each other and the player(s). I'm sure the designers and other creative types that build our games could add far more.

    We are at the dawn of gaming and virtual environments. Compare Crysis with Wolfenstein and extrapolate a couple of decades with the exponential increases we've seen so far in all aspects of gaming. It's also likely that entirely new kinds of game will be developed, given the possibilities that vast increases in processing power will make available. After all, before Wolfenstein there were essentially no FPSs as we know them today. The power of the '386 generation made something entirely new possible.

    1080 HD is the standard right now, but future standards are already being developed in the labs with 5-10 times the resolution we have now. Think about a wrap around screen about twice the size of a 30", but with resolution like a photo, maybe a 0.025 mm dot pitch or less.
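A rough estimate of the pixel count such a display implies. The 30-inch panel dimensions and the reading of "twice the size" as doubling each side are assumptions for illustration:

```python
# Pixel count for the hypothetical wrap-around display: 0.025 mm dot pitch
# over a panel with double the linear dimensions of a 30" 16:10 monitor.
# Assumption: a 30" 16:10 panel is roughly 646 x 404 mm of visible area.
def pixel_count(width_mm, height_mm, dot_pitch_mm):
    return round(width_mm / dot_pitch_mm) * round(height_mm / dot_pitch_mm)

total = pixel_count(646 * 2, 404 * 2, 0.025)
print(total)  # about 1.67 billion pixels, versus ~2 million for 1080p
```

Even with generous rounding, that is nearly a thousand times the pixels of a 1080p frame, which gives a sense of how much GPU headroom such displays would demand.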

    We'll also need two frames for whatever type of 3D is used! That was the future only a couple of years ago, and now lots of new products are 3D, both media and hardware. It'll be standard in another couple of years.

    Big hi-res screens and 3D displays, models with billions or trillions of polygons, and accurate simulation of reflection, refraction, diffraction, diffusion, radiosity, and scattering of light will need enormous amounts of processing power from future generations of GPUs. Other advances will need physics processors, perhaps AI processors, and more.

    Future developments in fabrication and design will allow more gates to be built on the same area of a chip. Moving to new materials and technologies will provide more grunt still.

    And, still on the far horizon, but definitely out there, is quantum computing of various sorts. Some of the concepts are pretty esoteric, but there are others, based on single-electron gates and circuits (which use single electrons to represent a bit) and other physical phenomena like spin, that are much closer to being realised.

    Will this ultimate processor be a single chip/object? Perhaps not, and certainly there will be generations where multiple chips are used, probably LOTS of chips. But, I think there will be many single chip solutions in our future.

    One reason is that it's always going to be faster to send a signal across a single chip than off one, across a wire/PCB trace/optical fibre, and back into another chip. As speeds increase, distance becomes a serious issue. The smaller your processor, the faster it can work.
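A quick sanity check on the distance argument: even at the vacuum speed of light, a signal covers only about 10 cm per cycle at 3 GHz, and real on-chip and PCB interconnects manage considerably less. Illustrative numbers only:

```python
# Distance a signal can cover in one clock cycle, using the vacuum speed of
# light as a generous upper bound (real interconnects propagate slower).
c_mm_per_ns = 299.79           # speed of light in vacuum, mm per nanosecond
clock_ghz = 3.0                # assumed clock frequency for illustration
cycle_ns = 1.0 / clock_ghz
print(c_mm_per_ns * cycle_ns)  # ~100 mm per cycle at absolute best
```

Since a round trip to a neighbouring chip can easily exceed that budget, keeping communicating logic on one die pays off directly in cycle time.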

    Scientists and engineers have for years been developing the idea of truly 3D processor structures, where many layers of circuitry are stacked vertically, and some chips now use multi-level structures where several transistors are stacked on top of one another. Potentially there could be as many layers as there are gates across the width of a chip, so tens of thousands of layers might be possible. I'd call this a single GPU.

    We will never stop inventing and developing newer smarter technology and applying it to entertainment. Gaming is, and I believe will remain, one of the primary motivating factors of this progress. Maybe just behind sex!

    People love to play!
    SazBard likes this.
  10. azrael-

    azrael- I'm special...

    18 May 2008
    No, that's not what we're saying at all. Quite the contrary. What we're (or at least I'm) saying is that game development has shifted away from the PC to console development.

    Consoles have fixed graphical capabilities and usually have a longer life span than your average PC hardware. This is a problem because, when developing for consoles with their limited capabilities, there's no incentive to invest time and effort in pushing the envelope for PC hardware.

    Ergo, as things stand, you won't need state-of-the-art graphics hardware on the PC to play a simple console port. Which is what most games these days are.
  11. kornedbeefy

    kornedbeefy New Member

    1 Sep 2009
    Maybe ATI and/or Nvidia need to promote PC gaming? How about some commercials on TV? PC gaming could use a champion; right now it only has its fans.

    Eventually some devs are going to wake up and tap into top-of-the-line PC hardware. The console market is getting oversaturated, IMHO. They will have to advance game physics and graphics if they want to survive, and the PC will be their vehicle.
  12. Farfalho

    Farfalho New Member

    27 Nov 2009
    As a matter of fact, 5970 = two 5850s, not two 5870s. A true 2x 5870 will probably be a 5980 or 5990, if they keep their dual-GPU cards under the 5900 label. We're yet to see an ATI 2x 5870.
  13. shaffaaf27

    shaffaaf27 The budget builder

    3 Jun 2008
    No, the performance is that of two 5850s, but the GPUs are two 5870s, both with 1600 SPs, whereas the 5850 has 1440 SPs.
  14. LightningPete

    LightningPete Diagnosis: ARMAII-Holic

    2 Jul 2009
    I think the software and hardware developers need to work more closely together. I mean, how can a massively specced 5000-series ATI or 400-series Nvidia card not play Crysis and ARMA 2 without critical frame-rate drops in detailed and action-packed areas respectively? I think there is something wrong with how GPU drivers work with the software developers' code and methods... Time for improved communication; perhaps install a fibre-optic phone line between Nvidia and EA, for example?