
News G80 launch specs released

Discussion in 'Article Discussion' started by Da Dego, 5 Oct 2006.

  1. Kipman725

    Kipman725 When did I get a custom title!?!

    Joined:
    1 Nov 2004
    Posts:
    1,753
    Likes Received:
    0
    The Inquirer was floating around 250W just for one card with an external PSU a while back, so the 450W PSU for the entire system isn't as bad as it could be. I would seriously recommend not getting an ATX-standard PSU with these cards though, as they will overload the 18A-per-rail maximum that an ATX 2.0 PSU is meant to cut off at. PPC do a PSU with a single 12V rail that can run stably at 60A, which should cover these cards (instead of two 18A 12V rails).
     
  2. EQC

    EQC New Member

    Joined:
    23 Mar 2006
    Posts:
    220
    Likes Received:
    0
    If I remember correctly, each of the PCI-express power connectors can supply a maximum of 75 Watts. Add that to a bit of power coming through the motherboard to the GPU, and the total is probably around 200 Watts absolute maximum power draw from the G80 board.
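
    EQC's ceiling can be sanity-checked with quick arithmetic. This is a rough sketch: it assumes two 6-pin PCI-Express connectors per card (the connector count is an assumption, not stated in the post) at 75 W apiece, plus the 75 W the slot itself can deliver.

```python
# Back-of-envelope ceiling for one G80 board's power draw.
# Assumed: two 6-pin PCI-Express plugs at 75 W each (connector count
# is an assumption), plus up to 75 W through the motherboard slot.
PLUG_W = 75    # max per 6-pin PCIe power connector
SLOT_W = 75    # max through the PCIe x16 slot
plugs = 2      # assumed number of 6-pin plugs on the card

ceiling = plugs * PLUG_W + SLOT_W
print(ceiling)  # -> 225, a hard ceiling; ~200 W realistic, as estimated above
```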


    Personally, I don't game much and I insist on a silent (i.e. fanless) GPU, so I see where you're coming from. On the other hand, an 8800GTX will probably run, what, $600? If somebody's going to spend $1200 on graphics cards alone for an SLI system, I don't think they ought to be complaining about spending $500 on a power supply.

    Average Joe-Gamer spends what, around $300 for his graphics card? (that's Avg. Joe Gamer...not high-end-gotta-have-the-best gamer, and not regular old average joe me who spends <$100 on a graphics card...and not average joe regular person who has integrated graphics from Dell). Maybe the power supply for Average Joe-Gamer costs around $100 - $125?

    So Mr. 8800GTX SLI spends 4x as much on his graphics card, and 4x as much on his power supply. Seems reasonable to me? If he could taper it back to just one 8800GTX (something most of us probably still can't afford...), he can probably get away with that $125 PSU.
     
  3. BFGunrunner

    BFGunrunner New Member

    Joined:
    13 Mar 2006
    Posts:
    123
    Likes Received:
    0
    What about ATI's new card? I wanna see what's up with that beast; it could probably beat Nvidia's card.
    I hope it will be less power-demanding....
    I'm going to be building a new Conroe system, but I'm not sure if I'm gonna go DX10 just yet. If they don't have mid-range cards I'll be getting myself an X1900XT; hopefully by then they'll have come down in price by a lot.
     
  4. -EVRE-

    -EVRE- New Member

    Joined:
    12 May 2004
    Posts:
    372
    Likes Received:
    1
    *sigh*
    when I put $110 into my Enermax 500W and $98 into my sister's 550W Antec NeoHE, I thought I was future-proofing our computers... at least we have enough power to run one of these (that should be enough for a while :b)

    I really want a Dell 30" screen. My 7800GTs are doing very well for me right now on a 19", but 3.2x as many pixels on the 30" are going to require one of those bad boys!
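
    That "3.2x as many pixels" figure roughly checks out if we assume a 1280x1024 19" panel against the Dell 30"'s 2560x1600 (the panel resolutions are assumptions here):

```python
# Pixel-count ratio: assumed 19" panel at 1280x1024 vs. Dell 30" at 2560x1600.
px_19 = 1280 * 1024    # 1,310,720 pixels
px_30 = 2560 * 1600    # 4,096,000 pixels
print(px_30 / px_19)   # -> 3.125, close to the 3.2x quoted
```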

    what does this mean?
    (128 unified shaders clocked at 1350MHz)

    is that like 128 pixel-rendering pipelines?
    i.e. my 7800s have a grand total of 40 clocked at 445MHz, and this thing has 128 (doing the same thing as a pixel-rendering pipeline, just with a different name?) at 1350MHz?
     
  5. riggs

    riggs ^_^

    Joined:
    22 Jul 2002
    Posts:
    1,724
    Likes Received:
    3
    1.21 Gigawatts! Er, I mean 800 watts! That's a hell of a lot. No doubt they'll be damn fine cards though...
     
  6. Sea Shadow

    Sea Shadow aka "Panda"

    Joined:
    15 Jan 2004
    Posts:
    614
    Likes Received:
    13
    All I have to say is: EVGA step up program, HERE I COME :rock:

    On a more serious note, I hope they manage to launch the lower-end models soon, as I really find it hard justifying such a large expenditure of money for a single computer component.
     
    Last edited: 5 Oct 2006
  7. Drexial

    Drexial New Member

    Joined:
    28 Jul 2005
    Posts:
    307
    Likes Received:
    0
    ATi basically said that cooling and power efficiency aren't even going to be on the minds of the developers of their DX10 cards, so the requirements for theirs will most likely be significantly higher. It would seem that ATi is going to brute-force its way to performance, which usually isn't the way ATi handles things.

    It would also seem that there is something in DX10 that forces power draw up. When you consider the advances in CPUs, they are starting to drop in consumption while increasing in performance.
     
  8. BlueDemon

    BlueDemon New Member

    Joined:
    17 Mar 2005
    Posts:
    58
    Likes Received:
    0
    As GPU power consumption rapidly departs the "irrelevant" range, and utility bills ramp up accordingly, retail price will lose some of its significance. We should consider the Total Cost of Ownership (TCO) of those power-hungry mofos, not just the initial cash outlay.
    I'm not kidding either; those cards will bleed you dry while they're heating up your place ;)
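
    To put a rough number on that TCO point, here is a sketch of the extra yearly electricity cost; every figure (extra draw, hours per day, tariff) is an illustrative assumption, not a measured value:

```python
# Illustrative running-cost estimate for extra GPU power draw.
# All inputs are assumptions chosen only to show the arithmetic.
extra_watts = 300        # assumed extra draw of a high-end setup
hours_per_day = 4        # assumed daily gaming time
price_per_kwh = 0.15     # assumed electricity tariff, $/kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(kwh_per_year, round(cost_per_year, 2))  # -> 438.0 65.7
```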
     
  9. Redbeaver

    Redbeaver The Other Red Meat

    Joined:
    15 Feb 2006
    Posts:
    2,057
    Likes Received:
    34
    If the GTS performs at least 40% above the 7900GT, and it's below $499, it will find a place under my xmas tree........

    I just want to enjoy Crysis with all the eye candy...........
     
  10. Skill3d

    Skill3d New Member

    Joined:
    29 Sep 2005
    Posts:
    205
    Likes Received:
    1
    Looks like it's going to be old-school dual PSUs :) But seriously, two 8800s in SLI in a system along with those specs, and of course two large TFT screens..... hmmmm, global warming anyone?
    Although you won't need any other heaters for your home :) so perhaps it's better this way :p
     
  11. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    I'll sell you an 8500, it's almost as good :worried: :naughty:

    Intel got owned on that until the Core 2 Duo, so it swung way towards efficiency :thumb: Now that AMD owns ATI, it might get interesting.

    :clap: that was great, Thanks.
     
  12. TomH

    TomH And like that... he was gone.

    Joined:
    28 Nov 2002
    Posts:
    756
    Likes Received:
    6
    From the article:

    I'm pretty sure the dual card model is the 7950GX2 ?

    Tired? Maybe.. Bored? Definitely!

    Reading back through the comments again though, this is really reminding me of the 5800 Ultra launch :rolleyes:
     
    Last edited: 5 Oct 2006
  13. JADS

    JADS Et arma et verba vulnerant

    Joined:
    27 Mar 2001
    Posts:
    2,918
    Likes Received:
    1
    The difference between the Prescott and these next generation graphics cards is the dreaded performance per watt. Prescott offered relatively low performance whilst putting out a lot of heat and demanding a high power draw.

    At least with these next gen cards you know you are going to get something absolutely mind blowing so I'm not overly fussed about the heat being put out by them nor their power requirements.
     
  14. Grinch123456

    Grinch123456 New Member

    Joined:
    9 Aug 2006
    Posts:
    99
    Likes Received:
    1
    Maybe the next generation of graphics cards can have foot warmers attached, for that 800-watt cooling sensation, aka: "Holy Jesus! My feet are burning! Oh good God! I'm an atheist, but I'm still saying God! Why am I doing this, and why am I still 'speaking' in not-all-caps if my feet are burning!!!?!"
     
  15. Fozzy

    Fozzy New Member

    Joined:
    25 Jan 2005
    Posts:
    1,413
    Likes Received:
    2
    Hmm. Not too concerned, TBH. I have to wait for Windows to come out before I buy one anyway. I probably won't go DX10 until this summer. And not all is lost: you can always get a dedicated CD-drive-bay PSU for your graphics card. I've got a 600W PSU already, and I'm sure I could hook up my 320W Mean Well if needed.
     
  16. BlackMage23

    BlackMage23 RPG Loving Freak

    Joined:
    4 Aug 2006
    Posts:
    259
    Likes Received:
    1
    I think I will skip this generation and stick with my two 7900s. I expect the 9 series will be a really big step up, as they'll have had more time to play with DX10.
     
  17. speedfreek

    speedfreek New Member

    Joined:
    9 Nov 2005
    Posts:
    1,453
    Likes Received:
    1
    :hehe:

    I'm kind of an Nvidia fanboy, but I'm waiting to see what ATI has coming up.
     
  18. Firehed

    Firehed Why not? I own a domain to match.

    Joined:
    15 Feb 2004
    Posts:
    12,574
    Likes Received:
    16
    I'll cancel that order on the wintertime space heater then. Holy crap.
     
  19. Warrior_Rocker

    Warrior_Rocker Holder of the sacred iron

    Joined:
    26 Jun 2005
    Posts:
    938
    Likes Received:
    1
    *scratches head* a graphics card taking up twice as much power as my entire desktop does? :shiftyeyes:
    -Shuttle sb61g2v3
    -Pentium 4 3.0E prescott
    -2x 512mb Corsair XMS
    -Radeon 9600SE OC'd
    -2x 250gb Hitachi Sata-I
    -1x 500gb Hitachi Sata-II

    Somehow I just don't get how Nvidia plans to make this a reality.

    The next news article will probably focus on the new heat-return lines Nvidia will soon start installing in people's homes.... :grr:
     
  20. kye

    kye New Member

    Joined:
    1 Dec 2004
    Posts:
    269
    Likes Received:
    0
    No matter how good the card is, I'll probably be choosing a DX10 card mostly based on its power consumption/performance ratio. If ATI released a slower card that used much less power, I'd buy the ATI card.

    Kye.
     