News GDDR5 for AMD’s next-gen graphics chip is shipping

Discussion in 'Article Discussion' started by CardJoe, 21 May 2008.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,343
    Likes Received:
    292
  2. Krikkit

    Krikkit All glory to the hypnotoad! Super Moderator

    Joined:
    21 Jan 2003
    Posts:
    23,410
    Likes Received:
    363
Whatever happened to GDDR4? Too expensive?

I think AMD needs to rework its lineup before it can compete with Nvidia at the very top end on single-card solutions again, although the 3870 X2 was pretty interesting.
     
  3. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
Will be interested to see what (if any) improvements are shown in multi-GPU setups, as Crossfire has potential but is frustratingly inconsistent at present.
     
  4. WildThing

    WildThing Member

    Joined:
    26 Jul 2007
    Posts:
    816
    Likes Received:
    19
Cool, I plan on getting one of AMD's new graphics cards, though I'm always a bit sceptical when it's just higher clock frequencies and not much change to the architecture of the GPU.
     
  5. chrisb2e9

    chrisb2e9 Dont do that...

    Joined:
    18 Jun 2007
    Posts:
    4,049
    Likes Received:
    41
That's what they're doing, though,
except it looks like they won't bother fighting for the top-card spot anymore.
     
  6. Panos

    Panos Member

    Joined:
    18 Oct 2006
    Posts:
    276
    Likes Received:
    2
The future is RED!!
Crossfire is apparently supported by both AMD and Intel. And given that Nvidia still hasn't obtained a licence for Intel's next series of processors, it doesn't matter how powerful a single GPU ATI can make, as long as it's good enough and cheap to produce.

As long as they can put 3-4 GPUs on one card and wire more than one of them onto motherboards (all Intel chipsets support Crossfire, along with ATI's own), the competition has no reply other than to follow (difficult, since SLI is tricky, especially when it's not supported by the motherboard) or to spend all their money developing single-chip solutions.

Don't forget: Crossfire technology supported 32-way connectivity from day one. SLI is still struggling to make three cards work.

I would like to see those 4x PCI-E motherboards with four 3870 X2s on them.
     
  7. Hamish

    Hamish New Member

    Joined:
    25 Nov 2002
    Posts:
    3,649
    Likes Received:
    4
Didn't I hear somewhere that they were going to let both GPUs access the same memory pool?
I.e. instead of having 512MB for each GPU, it would be 1GB spread across both.
     
  8. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    Well I am certainly looking forward to this. ^_^
     
  9. MrMonroe

    MrMonroe New Member

    Joined:
    27 Dec 2007
    Posts:
    195
    Likes Received:
    0
    Fixed. Brute force computing is a bad way of solving problems.

    Too bad it doesn't provide many tangible benefits, and it certainly isn't justifiable considering price. (Neither is 3-way SLI, of course)
     
  10. rhuitron

    rhuitron Bump? What Bump?

    Joined:
    15 Aug 2006
    Posts:
    125
    Likes Received:
    0
Seriously, what the flood ever happened to GDDR4?
     
  11. sagittary

    sagittary New Member

    Joined:
    2 Feb 2008
    Posts:
    16
    Likes Received:
    0
It will be interesting to see how things develop. The change in ATI's approach mirrors what happened in the CPU arena: a shift from chasing ever-higher megahertz to focusing on efficiency across many cores with more moderate brute-force ability. Since current graphical technology (as far as regular people go) handles multiple GPUs differently from how programs use multiple CPUs, I can see the change in approach working only if they succeed at more than just the hardware level: not just good drivers but -great- drivers, and the ability to use many GPUs as seamlessly as one uses many CPUs, for both developers and consumers.
     