Discussion in 'Article Discussion' started by Sifter3000, 20 May 2010.
I'm guessing there will be a lot of money in this for Nvidia, and thus more money for them to pump into the R&D department, so better graphics for us. It's like F1 and the space industry: the tech is developed there and fed into the consumer market.
Fermi had to be good for something, right?
Yikes - just don't run them 24x7 and then act surprised when they produce a load of memory errors and start dying after a few months.
nVidia has to earn money somehow, after all.
I thought the chubby bloke was going to get vapourised by 100 fermis upon opening the oven door.
Fermi was designed for GPGPU; I hope they at least do this right.
Sauna room is good for slimming down.
Are those new 470s and 480s really that hot? I thought my 275s were hot as ****!
For HPC and servers, they are usually run in air-conditioned rooms (think 5-10°C ambient).
The downside is the price. I want three or four in a workstation for some GPU stuff I'm doing, but that's going to cost roughly 10k.
Nvidia always seems to be making money: even when geeky forums are certain of its doom, the figures generally still show them turning a profit.
This is obviously promising, although it's pointless putting GPUs in your data centre if none of the applications actually use them. I don't know how long that will take; a few years, I expect?
I guess this is the new topic for the next five years: temps. It used to be all about clock speeds, but now all the comments are about temps. Temperature is not everything. We have got used to the idea that everything 'needs' to run at minimal temps just to perform, and this simply is not true. None of us needs a third-party cooler on our CPU unless we're overclocking or 'that' worried about a little noise, yet we all rush out to buy the latest coolers... and then replace them with the latest colour of the same thing!
Fermi runs hot; this is a matter of fact and design. The high temps these GPUs run at do not hinder their performance, and the heat generated does not really cause any issues for other components either... so get over it.
So let's focus on actual performance here and not silly numbers for clock speed, temperature, noise levels or any other insignificant data... these cards work, and they kick arse while doing it.
Something else I would just like to add: all this recent hype about Eyefinity is a waste of space, as it will never take off the way ATi hopes. But isn't it funny how the movie industry is all about 3D technology once again, and nVidia has been advancing 'that' tech for the last couple of years now?
If the components run very hot, there's a higher probability that the hardware will fail. (Simple.)
I could look at performance, but I don't like having a heater in my PC, nor a jet-engine fan... and I especially don't want to spend loads of money just on power consumption. Efficiency is the word right now, and comparing the GTX 480 and the HD 5870, I would choose the HD 5870 any day: it's quieter, consumes much less power and doesn't run at high temps (like the ones I had with my HD 4870), and the difference is only a few fps... AND the GTX 480 is more expensive!!
I agree with you, Eyefinity is really too expensive for the gamer; only the ultra-high end will buy one of those Samsung six-panel things...
But it's great for work: having a card that can output to up to six monitors with different info on each saves companies a lot of money...
We've hit the 300W PCIe ceiling; once that goes, it will get interesting. I think Fermi solo (and the dual cards in the past) has managed to max out the draw.
Now, with that out of the way, it's going to be about performance. Thankfully Fermi got panned for the simple fact that it was too hot and not fast enough to justify itself, so hopefully it will not be the prototype for cards in the future.
So IBM got the Tesla chips that Oak Ridge didn't want?
I bet IBM didn't pay full whack for them...