
News New dual-GPU ATI Radeon 5970 spotted

Discussion in 'Article Discussion' started by Sifter3000, 2 Nov 2009.

  1. Star*Dagger

    Star*Dagger New Member

    Joined:
    30 Nov 2007
    Posts:
    882
    Likes Received:
    11
    I'm buying one once they're released; I'll let you know how it runs.

    Yours in Eyefinity Plasma,
    Star*Dagger
     
  2. ChaosDefinesOrder

    ChaosDefinesOrder Vapourmodder

    Joined:
    6 Feb 2008
    Posts:
    707
    Likes Received:
    7
    lol @ CPU fan held on with a cable tie...
     
  3. alwayssts

    alwayssts New Member

    Joined:
    20 Feb 2009
    Posts:
    31
    Likes Received:
    2
    First of all, apologies for speculation and being long-winded, but hear me out.

    More than likely what's up their sleeve is the 5890, which will be clocked/priced to out-'compete', if not outperform, the GTX260 448sp. 975MHz-1GHz seems reasonable.

    The way I figure it is this:

    nVIDIA salvage parts start with two clusters disabled, then move to one, i.e. the 8800GTS (96, then 112 SPs) and GTX260 (192, then 216). Two clusters disabled this time around would equal 448sp, later moving to 480sp for a rev2/'375'-type part. Full Fermi is 16x32sp, or 512sp (1024 flops per clock).

    Also, Fermi likely keeps the same TMU/ROP/FLOP ratio as GT200, if you figure 48 ROPs, 128 TMUs and a 2.5:1 shader/core clock ratio - all of which seem likely for obvious reasons. From this you can roughly extrapolate performance. I figure 700c/1750s (or close) for the GTX380 and 640/1600 (or close) for the GTS360, as this mimics the disparity from last gen, putting them at 1.35x and 1.7x the GTX285 spec, with the GTS360 being 80% (more or less) of the GTX380. Yes, the architecture is different and that's a variable. That said, nVIDIA's architecture is likely to scale more linearly in performance than ATi's, because most games are geared toward its higher pixel and texture to flop ratios, which, like I said, will likely remain consistent, if not be optimised by the new arch.
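    The extrapolation above can be sanity-checked with some quick arithmetic. A minimal sketch - the Fermi SP counts and clocks are this post's speculation, not confirmed specs, and the GT200 flops-per-clock figure assumes the usual MAD+MUL counting:

```python
# Theoretical shader throughput in GFLOPS from SP count, shader clock (GHz)
# and flops issued per SP per clock. All Fermi numbers are speculative.
def gflops(sps, shader_ghz, flops_per_sp):
    return sps * shader_ghz * flops_per_sp

gtx285 = gflops(240, 1.476, 3)   # GT200: MAD + MUL counted as 3 flops/clock
gtx380 = gflops(512, 1.75, 2)    # speculated full Fermi: MAD = 2 flops/clock
gts360 = gflops(448, 1.60, 2)    # speculated two-clusters-disabled salvage part

print(round(gtx285))             # ~1063 GFLOPS
print(round(gtx380 / gtx285, 2)) # ~1.69, i.e. the "1.7x GTX285" above
print(round(gts360 / gtx285, 2)) # ~1.35, i.e. the "1.35x GTX285" above
print(round(gts360 / gtx380, 2)) # ~0.80, i.e. GTS360 at "80%" of GTX380
```

    So the 1.35x / 1.7x / 80% figures quoted above do follow directly from those speculated clocks and SP counts.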

    So, anyways... considering where the 5870 lies compared to the GTX285, I think an overclocked 5870 (5890) seems like the secret weapon. Think of it as GTX260 v. 4870, GTX260-216 v. 4890, GTX275 v. 4850x2... part 6. If there's a GTS360 480sp to take on the 5890, consider that part 7. The 5950 is essentially a preemptive part 8, and I'll explain in a second.

    The point of it all is, whatever nVIDIA wanted to charge for desktop Fermi, they won't be able to. Say it was $500 and $400. The 5970 will actually likely surpass Fermi at double-precision FP at the same price - granted, at a higher TDP - but gaming won't even be close. The salvage part would be attacked from above by two highly salvaged chips on one PCB and from below by one highly binned one. Between the rumoured ~40% smaller die (plus lower RAM costs because of the bus size), ATi will be able to be price-competitive with Fermi at every point - 5890 (vs GTS360 rev 1), 5950 (vs GTS360 rev2/GTX380) and 5970 (vs a Fermi GX2? If it exists, it will likely be a cut-down GTS360x2) - if they need to be.

    In other words, ATi's waiting until Nvidia's specs and prices are out to complete their lineup and price them out of the equation.

    Consider this possible realistic price structure after the 5900, 5890, and Fermi launches:

    5970: $500 (725-750mhz, full part)
    5950: $400 (625-650mhz, 90% part)
    5890: $350 (~1ghz)
    5870: $300
    5850: $230

    *Worth noting: these specs allow for fine binning of Cypress chips, as no two cards would use the chip in the same configuration. The 5950 would actually use the most heavily salvaged chips, which is why it could afford the small price gap between it and the highest-binned single-chip card.

    Those speculative prices are pretty much exactly set up to reflect the difference in theoretical performance. While performance obviously doesn't scale linearly (it's actually closer to half most of the time with ATi's architecture) and Crossfire certainly doesn't scale perfectly in all scenarios, it goes to show two things. One is that it correlates well with the law of diminishing returns on investment towards the high end. The other is that when all is said and done (when the whole Evergreen family is launched and prices stabilise), ATi wants to be able to say they offer you 1TF per $100, and that will likely be the case.
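    As a rough check on that 1TF-per-$100 claim, here's a sketch using the theoretical single-precision throughput of each card. The prices, plus the SP counts and clocks for the unreleased 5970/5950/5890, are this post's guesses (the 5950 is assumed to be a 90%-shader part):

```python
# Theoretical TFLOPS per $100 for the speculated Evergreen lineup.
# Cypress-family chips issue 2 flops (one MAD) per SP per clock.
cards = {
    # name: (stream processors, core clock GHz, speculated price USD)
    "5970": (3200, 0.725, 500),
    "5950": (2880, 0.650, 400),   # assumed 90% part at 650MHz
    "5890": (1600, 1.000, 350),   # assumed ~1GHz Cypress bin
    "5870": (1600, 0.850, 300),
    "5850": (1440, 0.725, 230),
}

for name, (sps, ghz, usd) in cards.items():
    tflops = sps * ghz * 2 / 1000
    print(f"{name}: {tflops:.2f} TFLOPS, {tflops / (usd / 100):.2f} TF per $100")
```

    Under these assumptions every card lands at roughly 0.9 TF per $100, so the lineup really would sit just under the 1TF-per-$100 mark across the board.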

    So, I ask, where does Fermi fit in to this equation? Surely not where nVIDIA wants it to...Unless they REALLY JUST "DON'T CARE ABOUT GAMERS." I jest.

    Apologies again for writing a whole article in the comments; I just thought my speculation might be worth an explanation. While surely everything won't line up exactly how I forecast, I think it'll end up close enough that the points are valid.

    Feel free to pick me apart and call me crazy! :)
     
    FuzzyOne likes this.
  4. p3n

    p3n New Member

    Joined:
    31 Jan 2002
    Posts:
    778
    Likes Received:
    1
    ZZZZ no way two cores will perform as well as a single GPU in performance/watt/$$$
     
  5. Goty

    Goty Member

    Joined:
    13 Dec 2005
    Posts:
    411
    Likes Received:
    4
    Well, considering the fact that a single GPU also won't be as fast (in most cases), I don't think that's much of a concern.
     
  6. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    How is the memory cooled? It looks like all the cooler comes into contact with is the GPUs - or is it just using thermal pads that touch the bottom of it?
     
  7. Tulatin

    Tulatin The Froggy Poster

    Joined:
    16 Oct 2003
    Posts:
    3,161
    Likes Received:
    7
    I really hope that one of the accessories included with this by the OEM is either one of those metal wings that branches back to a support bracket, or an adjustable stand to support it from the case floor.
     
  8. johnnyboy700

    johnnyboy700 Active Member

    Joined:
    27 May 2007
    Posts:
    1,554
    Likes Received:
    18
    Just remember how much of a dual card's performance depends on its drivers.
     
  9. liratheal

    liratheal Sharing is Caring

    Joined:
    20 Nov 2005
    Posts:
    11,881
    Likes Received:
    1,351
    ...gimme.
     
  10. 500mph

    500mph The Right man in the Wrong place

    Joined:
    22 Jun 2007
    Posts:
    2,129
    Likes Received:
    32
    The naming scheme leaves room for a 5990 :D
     
  11. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    10,945
    Likes Received:
    311
    the naming scheme is a direct copy of the very badly named GTX295...

    why not call this 5870x2?
     
  12. Jetfire

    Jetfire Member

    Joined:
    28 Sep 2009
    Posts:
    105
    Likes Received:
    0
    Jeez, we'll be buying external cases soon to house our graphics cards.

    A 4-way SLI box sitting on top of your PC case anyone? :D
     
  13. Fozzy

    Fozzy New Member

    Joined:
    25 Jan 2005
    Posts:
    1,413
    Likes Received:
    2
    I do find it funny that you are so quick to point out the original poster's faults in writing or "journalism" when almost every article I've ever read from bit-tech seems to skip any sort of revision and contains numerous mistakes that could be considered those of a novice... just thought I'd point that out.

    As for the GPU: I love the matte industrial effect as opposed to the glossy black that has been the norm. I'm very excited to see a performance review soon.

    Back to the bashing. I love bit-tech... just watch the flaming.
     
  14. Muunsyr

    Muunsyr New Member

    Joined:
    22 May 2005
    Posts:
    53
    Likes Received:
    1
    That was a great post, alwayssts. I read a while ago (and don't know how accurate it is) that nVidia will also be using GDDR5 chips this time around? If so, could they use lower-speed GDDR5 (compared to ATi's parts) to help keep costs down?
     
  15. Mentai

    Mentai New Member

    Joined:
    11 Nov 2007
    Posts:
    758
    Likes Received:
    1
    +1
     
  16. SMIFFYDUDE

    SMIFFYDUDE Supermodders on my D

    Joined:
    22 Apr 2009
    Posts:
    2,897
    Likes Received:
    103
    13.5"? I'd have to do without hard drives if I'm to get one in my case. Why don't they make them wider instead? My case is nothing special, but it still has a 3" gap between the side window and the gfx card.

    I think 5870 X2 makes more sense. If you're in the market for this card you'll be educated; nobody will be going to PC World and asking if the card will let them surf the internet.
     
    Last edited: 3 Nov 2009
  17. Slizza

    Slizza beautiful to demons

    Joined:
    23 Apr 2009
    Posts:
    1,738
    Likes Received:
    120
    Going to be an interesting year in the graphics department!
    Think I'll ride it out and upgrade my GTX 280 as late as possible, which could be a while yet seeing as it still handles all games.
     
  18. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. New Member

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    What happened to the 4890 X2? And am I the only one thinking we're seeing a second generation of DirectX 11 cards from ATI/AMD without so much as a whisper from NVIDIA? ATI is going ballistic right now with the video cards (wish they'd apply this to the AMD side). 5970! This card is going to be a beast and a half. Can anybody tell me how NVIDIA is going to counter this onslaught of GFX card supremacy?

    And what trick do they have up their sleeve waiting in cloak mode? (A 4890 X3 on a 45mm board?)
     
  19. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,955
    Likes Received:
    17
    Am I the only person here who refuses to even think about getting a dual-GPU card? I just want a powerful GPU that doesn't need driver updates every other day to get the most out of it.
     
  20. feedayeen

    feedayeen New Member

    Joined:
    18 Jun 2008
    Posts:
    204
    Likes Received:
    21
    You're not the only one. It'd be nice if bit-tech tested their X2 cards with a few more games that tend to be gimped by their drivers, rather than games that are next to guaranteed to have frame rates in the 70s and 80s.
     