From beareyes.com via the Inq: Plus a picture of what's allegedly an 8600: Thoughts? I'm not sure of the reputation of the original source, but it all seems plausible. I'm sure Tim is NDA-ed up to the eyeballs, but surely you could say if it's wrong?
Anandtech confirmed this in their interview with Nvidia at CES. I guess the new cards are just variations on the existing ones with less memory.
Not completely true - the 8600 and 8400 are both new cores with different numbers of shaders. There are also some G80 models coming soon with half the RAM but the same number of shaders etc., apparently. They should rock.
I'll be on the lookout for an 8600, because it seems the GTSes are now down to the price they should have been at their launch...
I'm surprised that they're using the Ultra name... Probably because their last Ultra card was the 6800. My theory is that 86 and 68 are the same digits, so that's why they started using "Ultra" again.
In theory, the 48-unified-shader part should be something like an X1950 Pro, and the 64-shader one like an X1900 XT - not bad at all for mid-range, assuming they come in comfortably under £200 though.
I think that the 8600 Ultra should be better than the X1900 XT, since the X1900 XT has 48 shader processors and the 8600 Ultra has 64 unified ones, even if they're clocked a bit lower, with similar memory speeds. The 8600 GT will probably land between the X1900 XT and the X1950 Pro due to its low clocks, despite having more shaders than the X1950 Pro. This is just my logic on this; I have no idea if it's the right logic.
Idk, last generation and this generation are much different. The 7600 GT is this generation's mid-range card, and it performs a bit worse than an X850 XT in most things, so I would expect an 8600 Ultra to come in a little below an X1900/X1950 XTX. The 8600 GT is going to be interesting; depending on its price and performance, it seems to have the potential to be a good card, having as many unified shaders as an X1950 XTX has shader processors. It has lower clock speeds, but should hopefully get good performance and possibly some overclocking potential. Although this is all my random speculation based on previous generations.
The 80nm refresh of the 8800s (which could be 8900s or something) could potentially get GDDR4 if they fall slightly behind the X2800s. [/pure speculation]
Another thing is the power requirements... so what now? Do they suck down as much as the bigger beasts, or is there some relief?
I think they're going to save GDDR4 for their new series/improved release later this year, or so I've heard.
Be careful - things are not always what they seem. 8800 GTX = 128 processors, each working on one 32-bit float (scalar). X1900 XT = 48 processors, each working on 3×32-bit floats (vector) plus a 32-bit float (scalar) simultaneously (a total of four 32-bit floats). So when the R600 comes out with 64 processors, each of them will handle 4 floats like the X1900, as opposed to the GTX's 128 processors handling 1 each. So even though the GTX shader core runs at 1.35 GHz, the R600 will not have to match that speed in order to do more GFLOPS: 1.35 GHz × 128 processors × 1 float for the GTX vs. 800 MHz(?) × 64 processors × 4 floats for the R600, which works out to about 19% more throughput for the R600.
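The arithmetic above can be checked with a quick back-of-the-envelope script. Note this uses the poster's assumed 800 MHz R600 clock, which was not a confirmed spec at the time, and counts peak float operations only (ignoring MAD double-counting, clock domains, etc.):

```python
# Peak shader throughput sketch: clock (GHz) x processors x floats handled per cycle.
# The 800 MHz R600 figure is the poster's guess, not a confirmed specification.

def shader_throughput_gops(clock_ghz, processors, floats_per_cycle):
    """Peak 32-bit float operations per second, in billions (G-ops/s)."""
    return clock_ghz * processors * floats_per_cycle

gtx = shader_throughput_gops(1.35, 128, 1)   # 8800 GTX: 128 scalar units @ 1.35 GHz
r600 = shader_throughput_gops(0.8, 64, 4)    # R600: 64 four-wide (vec3 + scalar) units

print(f"8800 GTX: {gtx:.1f} G-ops/s")                      # 172.8
print(f"R600:     {r600:.1f} G-ops/s")                     # 204.8
print(f"R600 advantage: {(r600 / gtx - 1) * 100:.0f}%")    # ~19%
```

This matches the ~19% figure quoted in the post: 204.8 / 172.8 ≈ 1.185.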
Proof will be in the benchmarks. I will wait and see who provides the best performance for my dollar in a DX10 game (probably Crysis) and then choose. Competition between them only benefits us consumers, right?