
Graphics ATI 5970 vs nVidia GT300 (Fermi) [Not Rants-Only Speculation/Intelligent Discussion]

Discussion in 'Hardware' started by Jasio, 18 Nov 2009.

  1. j_jay4

    j_jay4 Minimodder

    Joined:
    23 Apr 2009
    Posts:
    518
    Likes Received:
    14
    There is a limit: an atom of silicon is about 222 pm in diameter, for example, so feature sizes will never reach the picometre scale. 11 nm should be reached around 2022, but below that quantum tunnelling will become a major problem, and it is unlikely that conventional lithography, etch or even chemical-mechanical polishing processes can continue to be used, because the materials involved contain a high density of voids or gaps. Even at 11 nm the gate length would be around 6 nm, which leaves only about 27 silicon atoms across the gate, and it would also have to be a monolayer of atoms thick, which is crazy. Completely new technologies will have to be developed: non-silicon extensions of CMOS using group III-V elements or nanotubes/nanowires have been proposed, as well as non-CMOS platforms including molecular electronics, spin-based computing and single-electron devices. So there is going to have to be a move into nanoelectronics within the next decade or so.
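    (A minimal back-of-the-envelope check of that atom count, using only the figures quoted above - the 222 pm silicon atom diameter and the 6 nm gate length:)

        # Rough check of the "~27 atoms across the gate" figure
        si_atom_diameter_nm = 0.222   # ~222 pm per silicon atom, as quoted above
        gate_length_nm = 6.0          # projected gate length at the 11 nm node

        atoms_across_gate = gate_length_nm / si_atom_diameter_nm
        print(f"~{atoms_across_gate:.0f} silicon atoms across a {gate_length_nm:g} nm gate")   # -> ~27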
     
    Last edited: 19 Nov 2009
    pimonserry and tonpal like this.
  2. j_jay4

    j_jay4 Minimodder

    Joined:
    23 Apr 2009
    Posts:
    518
    Likes Received:
    14
    Back on topic
    Not to mention hot. Sounds like a total repeat of the GTX 280, then? I think AMD will be #1 for a while yet unless Nvidia go stupid with performance, but then current games wouldn't utilise it, so most people will probably get a much cheaper ATI card unless they're folding.
     
  3. Jasio

    Jasio Made in Canada

    Joined:
    27 Jun 2008
    Posts:
    810
    Likes Received:
    13
    As I mentioned earlier, "stupid" will be limited to a 300 watt TDP as per the ATX standard, something the 5970 is already hitting. So the question is how stupid you can get within the standard, or how far you can over-engineer the product to exceed the spec once someone gets home and flips the overvolt switch.
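    (For context, the ~300 W ceiling usually quoted for a single card comes from the PCI Express power-delivery budget; a minimal sketch of the arithmetic, using the standard connector allowances:)

        # Standard PCIe power-delivery allowances for an add-in card, in watts
        slot_power      = 75    # PCIe x16 slot
        six_pin_power   = 75    # one 6-pin PEG connector
        eight_pin_power = 150   # one 8-pin PEG connector

        print(slot_power + six_pin_power + eight_pin_power)   # 300 W - the ceiling the 5970 is already brushing against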
     
  4. bigsharn

    bigsharn Officially demotivated

    Joined:
    9 May 2008
    Posts:
    2,605
    Likes Received:
    83
    Well, like it's been said already, it wouldn't be too hard to make a card that performs up to 500 W and clock it right down so it works at 300 W. OK, maybe not THAT extreme, but you get the idea.
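    (Rough illustration of why down-clocking buys back so much power: to first order, dynamic power scales with voltage squared times frequency, so a modest clock and voltage trim cuts the wattage sharply. The figures below are invented purely for illustration, not real board numbers.)

        # Simplified dynamic-power model: P proportional to V^2 * f (illustrative numbers only)
        p_full  = 500.0    # hypothetical "full fat" board power, in watts
        v_scale = 0.85     # run the core at 85% voltage
        f_scale = 0.82     # and 82% clock

        p_scaled = p_full * v_scale**2 * f_scale
        print(f"~{p_scaled:.0f} W")   # ~296 W from a relatively gentle clock/voltage trim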

    I for one am looking forward to Fermi. Nvidia's coolers have always seemed the most efficient at dissipating heat, and we could easily get to the stage where 700 W coolers are fitted to 300 W cards just to give that overclocking potential.

    On a side note, I reckon we'll hit a point where silicon does become unusable, and companies are realising this and experimenting with other methods: diamond-based chips and holographic systems are being designed and theorised over all the time (I can't be bothered finding the source, it's far too early in the morning).
     
  5. Jasio

    Jasio Made in Canada

    Joined:
    27 Jun 2008
    Posts:
    810
    Likes Received:
    13
    An added note: those who hated/disliked the old ATI coolers might not be aware that the new ATI cards (5970 included) use a vapour chamber instead of a standard heatsink/fan cooler. Owners of Vapor-X video cards will be familiar with this, and it's something ATI is adopting on all their reference cards, which is why the 5970's cooler can dissipate 400 watts without sounding like a 747 taking off.
     
  6. pimonserry

    pimonserry sounds like a party.

    Joined:
    20 Dec 2008
    Posts:
    2,113
    Likes Received:
    75
    Mmm, it'll be nice for a reference cooler to not be louder than the music from my speakers >.> (not that I'll be buying a 5970 until I win the lottery).
    Well done ATi for putting vapour chambers in :thumb:

    Slightly off-topic, is the 5870/5850 cooler a vapour chamber model? Or only the 5970?
     
  7. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    What I really want to know is how efficient and how effective the AA/AF algorithms are on the GT300. If they are better than ATi's in both performance and quality, that just might win me over.
     
  8. Horizon

    Horizon Dremel Worthy

    Joined:
    30 May 2008
    Posts:
    765
    Likes Received:
    10
    Only the 5970; the rest of the 5-series use a traditional fan/heatsink.
     
    pimonserry likes this.
  9. Moyo2k

    Moyo2k AMD Fanboy

    Joined:
    11 May 2009
    Posts:
    1,482
    Likes Received:
    52
    Mmm, a full tower case with four 5970s... fap fap fap... (and yes, I know about the limitations in using the GPUs, but hey, it's an 8-GPU computer...)
     
  10. Aracos

    Aracos What's a Dremel?

    Joined:
    11 Feb 2009
    Posts:
    1,338
    Likes Received:
    47
    That's the one reason why I want to go for an ATI 5-series card: the pixel-perfect anisotropic filtering sounds heavenly to me. I may not be able to notice it, but I'm very anal about audio/visual quality. I hope the GT300 series can match the 5-series for AF awesomeness. Does anyone think they'll just release the top-end part, or will we see a GTX 260/GTX 280 case again where they end up revising both cards? I just want something in the £150-£200 range that's Nvidia and competes against the 5850, but a card that doesn't get a revision >.<
     
  11. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    First you'll get the high-end one... at a premium price, then it will drop a bit after a few months, and then, like all hardware manufacturers, they'll provide a cheaper model which is in reality a broken high-end one, reworked so that it works perfectly (a great way for them to save money, which translates into savings for you). Nvidia is never worried about ATI. They know that Fermi will be an ATI killer. If it wasn't, you'd see Nvidia take the most powerful GeForce, overclock it, do a quick DirectX 11 drop without much optimisation (as it's an optimised DirectX 10 card), and ship it with a GTX 300 model number on it. So far that doesn't seem to be the case. Nvidia seems to be taking its time, and I think they know they've got a powerful and exciting product.

    (Also, I am sure the card will be smaller... because I've looked at cases around, and in the picture of Fermi shown at bit-tech it looks like it's the length of an ATX board. If that's the case, very few cases will allow it to fit. My guess is that it's just an engineering sample... maybe put on a large board on purpose to fit (I am guessing here) diagnostic circuits and such... I mean, they need to test the board internals correctly somehow, right?)
     
  12. Guest-16

    Guest-16 Guest

    We've done that with four GTX 295s, actually, for Folding. It cooked the lab - the air con couldn't keep up: it was like a sauna. The "housing" dept called our boss asking why there was a sudden spike in power usage :D

    It took 950 W at the wall: the Enermax Rev 1050W handled it fantastically. In fact, it's still folding to this day with six 8800 GTs in an Asus X58 WS motherboard and a £13 Antec "big boy" fan cooling them all. The Enermax Revs (and the Seasonic M12Ds we've also tried) are fantastic power supplies - it's literally been at full load for months without a hiccup, on an electrical ring that has all sorts plugged into it, in the middle of London!

    As for the thread discussion: until we see final-product Fermi, it's about as PowerPoint as Intel's Larrabee. They can make the best 3bn-transistor MIMO GPU in the world, but if the drivers are too complex/poor-performing/unstable, then it's not worth the sand it's made of. Or, if they can't fabricate more than 5 at a time because the wafer has fewer squares than a chessboard (not to mention the TSMC yield effect), then it's equally ****ed.
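    (A very rough sketch of that wafer/yield point. The die area and defect density below are assumptions picked for illustration, not confirmed Fermi figures; the dies-per-wafer and yield formulas are the standard first-order approximations.)

        import math

        # Illustrative assumptions - not confirmed Fermi numbers
        wafer_diameter_mm = 300.0   # standard TSMC wafer
        die_area_mm2      = 530.0   # assumed large (~23 mm x 23 mm) die
        defect_density    = 0.004   # defects per mm^2, assumed pessimistic 40 nm figure

        wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
        # first-order gross-die count with an edge-loss correction
        gross_dies = wafer_area / die_area_mm2 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
        yield_frac = math.exp(-defect_density * die_area_mm2)   # simple Poisson yield model

        print(f"gross dies ~{gross_dies:.0f}, yield ~{yield_frac:.0%}, good dies ~{gross_dies * yield_frac:.0f}")
        # -> roughly 104 gross dies, ~12% yield, only about a dozen good dies per wafer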

    Honestly, I don't care when it arrives either. It's not like the DX11 games are crying out for it, AMD has more pre-orders than sales, and a week after its release everyone will have forgotten about the delay anyway.
     
  13. tumblebomb

    tumblebomb What's a Dremel?

    Joined:
    8 Sep 2009
    Posts:
    10
    Likes Received:
    0
    From what I've been reading about Fermi and DX11, the idea is for it to be so freely programmable that it doesn't limit what developers want to do. I think the main problem might just be getting to grips with everything they can do on it, so it'll take even longer to see anything that utilises the hardware properly.
     
  14. Aracos

    Aracos What's a Dremel?

    Joined:
    11 Feb 2009
    Posts:
    1,338
    Likes Received:
    47
    That's the thing: do you think the move to MIMO would have any effect on compatibility with past games designed for the normal SIMO architecture? Especially very old games from pre-2000? I personally know crap all about the differences in architecture between GPUs, so forgive me if that's an incredibly n00b question.
     
  15. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Still, right now is a terrible time to buy any GPU, as the GTX 2xx series has soared $30 and the HD 4xxx series is old (speaking as an RV770 PRO owner).

    I do hope it delivers as I really do need a good GPU to run STALKER!
     
  16. srgtherasta

    srgtherasta Minimodder

    Joined:
    6 Sep 2009
    Posts:
    275
    Likes Received:
    13
    Hi all, I'm new, but here's my two cents: I was waiting to go from my 8800 GTX to a GT300 card, but I've lost hope and picked up what might be the last GTX 275 in Northern Ireland for £156. As it is, the first I think we'll see of the GT300s is around Feb/March time. This is not a good time, as ATI have near total control of the top-end stuff and prices are going the wrong way. PS: very happy with my 275 anyway.
     
  17. PizaDeOveja

    PizaDeOveja death to the waterparty's!

    Joined:
    24 Nov 2009
    Posts:
    9
    Likes Received:
    0
    Hi there. I really enjoyed this thread and I learned quite a bit of useful info about the future of graphics cards.
    That being said, I'm currently running WinXP with an AMD64 4600+ and an 8800 GTX, and I can play at, or very close to, the frame cap (60 fps) most of the time on the brand new CoD Modern Warfare 2 at full 1920x1200. I can get low frame rates, but that's the exception rather than the rule.

    I just read the CrossFire article on this site, and apparently the tech guys had to un-cap the games so they could tell which card was more powerful, because with the sensible limit of 60 fps (which is what most people's monitors run at :worried:) you won't get more performance from a quad-GPU monster SLI setup than from a previous-gen card alone (or even mine, which produces 60 fps in every game I've tested so far).
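    (A trivial way to see that point: under a 60 fps cap the reported frame rate is just clamped, so any card that already renders a frame in under ~16.7 ms looks identical. The frame times below are invented for illustration.)

        # Effective frame rate under a 60 fps cap - illustrative frame times only
        cap_fps = 60.0
        for card, frame_time_ms in [("older single GPU", 15.0), ("quad-GPU monster", 4.0)]:
            uncapped = 1000.0 / frame_time_ms
            print(f"{card}: {uncapped:.0f} fps uncapped -> {min(uncapped, cap_fps):.0f} fps capped")
        # both show 60 fps once the cap kicks in, which is why the reviewers had to un-cap the games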

    Apparently some setups ended up making the CPU the bottleneck, meaning those setups are doubly useless: they are only good for offering more fps than your monitor can show, and then even a monster CPU will bottleneck them. Double useless. I guess anyone will concede that the only real use for that kind of horsepower is to do one's lion's share of contribution to the ozone layer.

    I noticed that pic of the guy who apparently has his rig next to him on the table. It must be a turn-on to feel the wave of heat on your face and think about all that horsepower that we don't see but certainly feel :rock:. Well, I have a buddy who has an M3, and he spends ¾ of his time in the car in traffic jams :duh:.
     
  18. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    I'd say that with a 30" monitor, multi-GPU setups are a must. 2560x1600 is quite a high resolution.
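    (For scale, the jump from a typical 24" panel to 2560x1600 is a straightforward pixel-count calculation:)

        # Pixels per frame: 30" 2560x1600 panel vs a common 1920x1200 panel
        px_30 = 2560 * 1600    # 4,096,000 pixels
        px_24 = 1920 * 1200    # 2,304,000 pixels
        print(f"{px_30 / px_24:.2f}x the pixels to shade every frame")   # ~1.78x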
     
