Discussion in 'Article Discussion' started by Da Dego, 5 Oct 2006.
As you say, it looks like the GTX could be overclocked beyond 1000MHz...
Now all we need is pricing!
Ouch, 800W would hurt your pocket, but would they actually pull that much?
Specs look very nice, though 640MB sounds a bit of an odd configuration.
That isn't all that power hungry after all. A few points, though: firstly, why the odd RAM amounts on them, and why GDDR3? The shader count looks very impressive, though.
I'll wait till the 8600s come out anyway.
Are they really gonna need two PCIe power connectors?
You reckon? That's more than most full systems draw! As the article says, the 8800GTX needs more power than the 7950GX2, which is TWO cards in one!
Hopefully the power management will be pretty efficient so it will throttle right back when not loaded - last thing you want is to be burning 450W on your GPU just to use Windows.
640MB and 768MB aren't that odd - 640 = 512 + 128; 768 = 512 + 256. I note that in each case the bus is 1 bit wide for every 2MB of RAM (GTX = 768MB/384 bit; GTS = 640MB/320 bit)
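That ratio works out neatly if you assume the boards are built from 64MB GDDR3 chips with 32-bit interfaces each, which was a common layout for that generation (an assumption on my part, not something stated in the article). A quick sketch:

```python
# Sketch of the capacity/bus-width ratio noted above.
# ASSUMPTION: each GDDR3 chip is 64MB with its own 32-bit channel
# (typical for G80-era boards, but not confirmed in the thread).
CHIP_MB = 64
CHIP_BUS_BITS = 32

def bus_width_bits(total_mb):
    """Bus width in bits when every 64MB chip adds a 32-bit channel."""
    chips = total_mb // CHIP_MB
    return chips * CHIP_BUS_BITS

for mb in (768, 640):
    bits = bus_width_bits(mb)
    print(f"{mb}MB -> {bits}-bit bus, {mb // bits}MB per bus bit")
```

Both configurations come out to 2MB per bus bit (GTX: 12 chips, 384-bit; GTS: 10 chips, 320-bit), which would explain the "odd" totals.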
Looking forward to some benchies but, like M4RTIN, I won't be investing in the first wave. Will wait for the midrange parts, and to see how ATi's competing product line compares.
The GTS only has a recommended 50W more than the X1800 XT that I had. I wouldn't call it awful. Anyway, I'm sure everyone has a reasonable PSU if they're going to splash out on these.
I love the way the CPU makers are doing their best to make processors run more efficiently and use less power, yet the graphics card makers just keep drawing more and more power. I hope this trend does not go on much longer.
Welcome to the forums
I think this trend will end when the integration of the cpu and gpu begins...
A priceless user comment from one of the DailyTech articles:
My room is already waaay too hot in the summer. This ain't cool. Literally.
Just in time for Christmas cold.
Honestly, I'll just wait for ATI's card and see which one gives me enough overhead to overclock the bejeesus out of them. Also, $350 is way too much for a card (I don't have disposable income, sadly). Maybe $300 tops, and hopefully my 500 watt SLI-not-certified-but-technically-can-SLI-with-2-PCIe-plugs power supply can handle this card. Still, the key here is to wait for ATI to come out with an offering and decide based on price/performance a bit down the road.
Do we know if there is a watercooling block for the 8800 GTX...?
I'm still probably going to get one for my b-day...
By the time I get one I might as well get an 8850GX2 or something along those lines.
So they have a core clocked at 575MHz and unified shaders at 1350MHz.
So, what is the core doing? If the shaders are unified, fill in my ignorance on what's left for the core to do?
450W????? WTF are they thinking????? I want one of these, but I won't buy them because of this stupid who-has-the-bigger-dick war for power... come on!!!!! Someday I will need a nuclear reactor inside my PC just to feed one of these cards. This is just dumb. Why won't they follow Intel's path and create cool, efficient and powerful graphics cards instead of room heaters?
Well, they actually ARE kind of following Intel's path, mistakes and all! (speaking of Prescott)
But I have a question and would like some clarification. That 400W and 450W rating for the G80's, is that the power consumption per card or the recommended PSU rating? Yes, this is a serious question!
You should be relieved to hear that it's the recommended PSU rating, according to DailyTech
I'm not certain, but I think that is indeed the recommended PSU rating.
Most of the people above you in the forum are flipping out and seem to be acting like it's the GPU itself drawing that much power...but I think it's a 'whole system, including G80' type of thing. Manufacturers don't generally say "minimum power requirement of my component is ____" since power supplies aren't labeled that way. Manufacturers usually list the power draw of a typical (or high end?) system that includes their part.
*edit* yarg! DarkReaper beat me to it. So I'll add one more idea: since Intel's move from the super-power-hungry P4 to the less-power-hungry Core 2, I wonder if the power numbers reflect both a drop in CPU requirements and an increase in GPU requirements -- i.e. is the GPU using EVEN MORE power than it seems, since the CPU may now be a smaller number in the overall system power draw?
So maybe it used to be (I'm just guessing at numbers here):
125W GPU + 125W CPU + 150W other stuff = 400 W system,
and now it's 210W GPU + 90W CPU + 150W other stuff = 450 W system
an increase of more than just 50W for the GPU over the 7950GTX...
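Written out, the guessed breakdown above looks like this (all figures are the hypothetical numbers from the post, not measurements):

```python
# The guessed power budgets above, as a quick sanity check.
# ALL numbers are hypothetical guesses, not real measurements.
old_system = {"gpu": 125, "cpu": 125, "other": 150}  # 7950-era guess
new_system = {"gpu": 210, "cpu": 90, "other": 150}   # 8800-era guess

assert sum(old_system.values()) == 400  # matches the guessed 400W rec
assert sum(new_system.values()) == 450  # matches the guessed 450W rec

# The headline PSU delta is only 50W, but under these guesses the GPU
# alone grows by more, masked by the CPU becoming more frugal.
gpu_delta = new_system["gpu"] - old_system["gpu"]
psu_delta = sum(new_system.values()) - sum(old_system.values())
print(gpu_delta, psu_delta)  # 85 50
```

So under those guessed numbers, a 50W bump in the recommended PSU could hide an 85W jump in GPU draw.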
In which case, it's a totally meaningless number. 450W might be more than enough for an E6300, 8800GTX, one HDD and nothing else, but whack in a Kentsfield, multiple RAIDed Raptor Xs, a couple of opticals, a PhysX and an X-Fi and you've got a problem! Realistically, early adopters of 8800 cards are going to have meaty systems to start with, and are likely to be into OCing their rigs, so I doubt 450W will suffice. That said, those people probably already have 600W+ PSUs so power won't be a problem.
Well, from the early pics of the 8800 that show two power connectors on the board, I think it is safe to say that it is drawing a lot of power.
Oh, and thanks for the welcome rupbert.
Welcome, Mr. Mage...
So now, if we want 8800 GTX/GTS SLI, then we need to get a quad-SLI PSU...
Crikey... I think nVidia is barking completely up the wrong tree with the power consumption on these GPUs.