
News G80 launch specs released

Discussion in 'Article Discussion' started by Da Dego, 5 Oct 2006.

  1. Da Dego

    Da Dego Brett Thomas

    Joined:
    17 Aug 2004
    Posts:
    3,913
    Likes Received:
    1
  2. rupbert

    rupbert What's a Dremel?

    Joined:
    28 Jul 2002
    Posts:
    800
    Likes Received:
    0
    As you say, it looks like the GTX could be overclocked beyond 1000MHz...

    Now all we need is pricing!
     
  3. Lazlow

    Lazlow I have a dremel.

    Joined:
    8 Aug 2003
    Posts:
    1,464
    Likes Received:
    0
    Ouch, 800W would hurt your pocket, but would they actually pull that much?

    Specs look very nice, though 640MB sounds a bit of an odd configuration.
     
  4. M4RTIN

    M4RTIN What's a Dremel?

    Joined:
    11 Sep 2006
    Posts:
    1,259
    Likes Received:
    3
    That isn't all that power hungry after all. A few points, though: firstly, why the odd RAM amounts on them, and why GDDR3? The shader count looks very impressive, though.

    I'll wait till the 8600s come out anyway.
     
  5. Sim0n

    Sim0n rm -rf /

    Joined:
    14 Dec 2001
    Posts:
    1,154
    Likes Received:
    0
    Are they really gonna need 2 PCIe Molexes?
     
  6. mclean007

    mclean007 Officious Bystander

    Joined:
    22 May 2003
    Posts:
    2,035
    Likes Received:
    15
    You reckon? That's more than most full systems draw! As the article says, the 8800GTX needs more power than the 7950 GX2, which is TWO cards in one!

    Hopefully the power management will be pretty efficient so it will throttle right back when not loaded - last thing you want is to be burning 450W on your GPU just to use Windows.

    640MB and 768MB aren't that odd - 640 = 512 + 128; 768 = 512 + 256. I note that in each case the bus is 1 bit wide for every 2MB of RAM (GTX = 768MB/384-bit; GTS = 640MB/320-bit).
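    As a quick sketch of that ratio (the per-channel layout below is an assumption for illustration, not something from the article), one 64MB chip per 32-bit channel gives exactly the 2MB-per-bus-bit figure:

        # Hypothetical layout: one 64MB memory chip on each 32-bit channel.
        # The chip size and channel width are assumptions for illustration.
        CHANNEL_WIDTH_BITS = 32
        CHIP_SIZE_MB = 64

        for name, bus_bits in [("GTX", 384), ("GTS", 320)]:
            channels = bus_bits // CHANNEL_WIDTH_BITS   # 12 and 10 channels
            total_mb = channels * CHIP_SIZE_MB          # 768MB and 640MB
            print(f"{name}: {channels} channels, {total_mb}MB,"
                  f" {total_mb / bus_bits:.1f}MB per bus bit")
        # GTX: 12 channels, 768MB, 2.0MB per bus bit
        # GTS: 10 channels, 640MB, 2.0MB per bus bit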

    Looking forward to some benchies but, like M4RTIN, I won't be investing in the first wave. Will wait for the midrange parts, and to see how ATi's competing product line compares.
     
  7. M4RTIN

    M4RTIN What's a Dremel?

    Joined:
    11 Sep 2006
    Posts:
    1,259
    Likes Received:
    3
    The GTS only has a recommended 50W more than the X1800 XT that I had. I wouldn't call it awful. Anyway, I'm sure everyone has a reasonable PSU if they are going to splash out on these.
     
  8. BlackMage23

    BlackMage23 RPG Loving Freak

    Joined:
    4 Aug 2006
    Posts:
    259
    Likes Received:
    1
    I love the way that the CPU makers are doing their best to make processors run more efficiently and use less power, yet the graphics card makers just keep drawing more and more power. I hope this trend does not go on much longer.
     
  9. rupbert

    rupbert What's a Dremel?

    Joined:
    28 Jul 2002
    Posts:
    800
    Likes Received:
    0
    Welcome to the forums :)

    I think this trend will end when the integration of the CPU and GPU begins...
     
  10. aggies11

    aggies11 What's a Dremel?

    Joined:
    4 Jul 2006
    Posts:
    180
    Likes Received:
    1
    A priceless user comment from one of the DailyTech articles:


    My room is already waaay too hot in the summer. This ain't cool. Literally. ;)

    Aggies
     
  11. Grinch123456

    Grinch123456 What's a Dremel?

    Joined:
    9 Aug 2006
    Posts:
    99
    Likes Received:
    1
    Just in time for the Christmas cold.
    Honestly, I'll just wait for ATI's card and see which one gives me enough overhead to overclock the bejeesus out of them. Also, $350 is way too much for a card (I don't have disposable income, sadly). Maybe $300 tops, and hopefully my 500W SLI-not-certified-but-technically-can-SLI-with-2-PCIe-plugs power supply can handle this card. Still, the key here is to wait for ATI to come out with an offering and decide based on price/performance a bit down the road.
     
  12. Guest-23315

    Guest-23315 Guest

    Do we know if there is a WCing block for the 8800 GTX...?

    I'm still probably going to get one for my b-day...
     
  13. r4tch3t

    r4tch3t hmmmm....

    Joined:
    17 Aug 2005
    Posts:
    3,166
    Likes Received:
    48
    By the time I get one, I might as well get an 8850GX2 or something along those lines.
    So they have a core clocked at 575MHz and unified shaders at 1350MHz.
    So what is the core doing? If the shaders are unified, fill in my ignorance: what's left for the core to do?
     
  14. DXR_13KE

    DXR_13KE BananaModder

    Joined:
    14 Sep 2005
    Posts:
    9,139
    Likes Received:
    382
    450W????? WTF ARE THEY THINKING????? I want one of these, but I won't buy them because of this stupid who-has-the-bigger-dick war for power... Come on!!!!! Someday I will need a nuclear reactor inside my PC just to feed one of these cards. This is just dumb. Why won't they follow Intel's path and create cool, efficient and powerful graphics cards instead of room heaters?
     
  15. gvblake22

    gvblake22 mmm, yep

    Joined:
    7 Dec 2005
    Posts:
    413
    Likes Received:
    1
    Well, they actually ARE kind of following Intel's path, mistakes and all! :wallbash: :sigh: (speaking of Prescott)

    But I have a question and would like some clarification. Those 400W and 450W ratings for the G80s: are they the power consumption per card, or the recommended PSU rating? Yes, this is a serious question! :p
     
  16. DarkReaper

    DarkReaper Alignment: Sarcastic Good

    Joined:
    9 Jan 2004
    Posts:
    1,751
    Likes Received:
    0
    You should be relieved to hear that it's the recommended PSU rating, according to DailyTech :)
     
  17. EQC

    EQC What's a Dremel?

    Joined:
    23 Mar 2006
    Posts:
    220
    Likes Received:
    0

    I'm not certain, but I think that is indeed the recommended PSU rating.

    Most of the people above you in the forum are flipping out, acting as though the GPU itself draws that much power... but I think it's a 'whole system, including G80' type of thing. Manufacturers don't generally say "the minimum power requirement of my component is ____", since power supplies aren't labelled that way; they usually list the power draw of a typical (or high-end?) system that includes their part.

    *edit* Yarg! DarkReaper beat me to it. So I'll add one more idea: since Intel's move from the super-power-hungry P4 to the less-power-hungry Core 2, I wonder if the power numbers reflect both a drop in CPU requirements and an increase in GPU requirements -- i.e. is the GPU using EVEN MORE power than it seems, since the CPU may now be a smaller share of the overall system draw?

    So maybe it used to be (I'm just guessing at numbers here):
    125W GPU + 125W CPU + 150W other stuff = 400W system,
    and now it's 210W GPU + 90W CPU + 150W other stuff = 450W system:

    an increase of more than just 50W for the GPU over the 7950GTX...
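    Spelled out as a quick sketch (every wattage is EQC's guess from above, not a measured or published figure), the subtraction looks like this:

        # EQC's guess, spelled out: subtract the CPU and "other stuff" budgets
        # from the recommended system totals to see the implied GPU share.
        # All numbers are guesses from the post above, not specs.
        OTHER_W = 150                    # drives, board, fans, etc. (guess)
        old_system, old_cpu = 400, 125   # P4-era recommendation (guess)
        new_system, new_cpu = 450, 90    # Core 2-era recommendation (guess)

        old_gpu = old_system - old_cpu - OTHER_W   # 125W
        new_gpu = new_system - new_cpu - OTHER_W   # 210W
        print(f"implied GPU draw: {old_gpu}W -> {new_gpu}W (+{new_gpu - old_gpu}W)")
        # implied GPU draw: 125W -> 210W (+85W)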
     
    Last edited: 5 Oct 2006
  18. mclean007

    mclean007 Officious Bystander

    Joined:
    22 May 2003
    Posts:
    2,035
    Likes Received:
    15
    In which case, it's a totally meaningless number. 450W might be more than enough for an E6300, 8800GTX, one HDD and nothing else, but whack in a Kentsfield, multiple RAIDed Raptor Xs, a couple of opticals, a PhysX and an X-Fi and you've got a problem! Realistically, early adopters of 8800 cards are going to have meaty systems to start with, and are likely to be into OCing their rigs, so I doubt 450W will suffice. That said, those people probably already have 600W+ PSUs so power won't be a problem.
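    To illustrate that point (every wattage below is a made-up placeholder, not a measured or published figure), the non-GPU budget swings enough between builds that one blanket PSU recommendation can't cover both:

        # Hypothetical component budgets showing why a single "450W recommended"
        # figure is hard to act on. All numbers are illustrative assumptions.
        builds = {
            "modest (one CPU, one HDD)":         {"cpu": 65,  "gpu": 200, "drives": 10, "other": 60},
            "loaded (quad, RAID, add-in cards)": {"cpu": 130, "gpu": 200, "drives": 60, "other": 120},
        }

        for name, parts in builds.items():
            print(f"{name}: ~{sum(parts.values())}W before overclocking headroom")
        # modest: ~335W; loaded: ~510W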
     
  19. BlackMage23

    BlackMage23 RPG Loving Freak

    Joined:
    4 Aug 2006
    Posts:
    259
    Likes Received:
    1
    Well, from the early pics of the 8800 that show 2 power connectors on the board, I think it is safe to say that it draws a lot of power.

    Oh, and thanks for the welcome, rupbert.
     
  20. Guest-23315

    Guest-23315 Guest

    Welcome, Mr. Mage...

    So now, if we want 8800 GTX/GTS SLI, we need to get a quad-SLI PSU...

    Crikey... I think nVidia is barking up completely the wrong tree with the power consumption on these GPUs.
     