Discussion in 'Article Discussion' started by CardJoe, 3 Aug 2010.
Unlike Blizzard to miss something like that...
So the story is that a game is making graphics cards work hard and get hot? Hold the front page.
I suppose only Fermi owners will need to worry about this, though...
Hmm. I take it that my watercooled setup is doing its job, then. And just a thought: wouldn't enabling vsync help? Please correct me if I'm wrong.
It's an unusual thing to miss, but you can't blame Blizzard for your faulty cooling. Stress test, people!
Wait, what? If your system is set up correctly, it shouldn't overheat in the first place. A FurMark session is much more likely to expose cooling problems in graphics cards.
@d3m0n_edge Vsync should actually help, since it caps the frame rate at your refresh rate; the detail is that with double buffering a missed refresh drops you to divisors of 60fps (30, 20 and so on), not multiples of it.
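Failing that, the workaround doing the rounds is to cap the frame rate yourself by adding two lines to variables.txt in your Documents\StarCraft II folder (the values below are just the commonly suggested ones, so tweak to taste):

frameratecapglue=30
frameratecap=60

The first line caps the menus, the second caps the game itself.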
Never had an issue... My frame rate regularly soars into the hundreds/thousands in cutscenes in SC2 and multiple other games.
Shouldn't the GPU shut down or downclock itself to prevent it from getting cooked?
Exactly. Everyone's getting on the blame Blizzard bandwagon, but I'd be much more likely to blame dodgy hardware designs. Exceeding the TDP of the chip like this is crazy.
Still, as I understand it, most of the problems are occurring in laptops, probably used by non-hardcore gamers. Laptop manufacturers are pretty notorious for their dodgy, hot-running designs.
My GPU never goes above 70°C and can survive prolonged Futuremark runs; however, I got constant crashing in those cutscenes. I dunno if it has anything to do with overheating, but something ain't right there, that's for sure.
Then again, seeing FRAPS tell you that the current scene is rendered at 600 FPS is pretty damn cool.
It should. Don't think I ever saw more than 61 FPS in Fraps.
As any engineer/designer/developer will tell you: no matter how rigorously you test your product, no matter how thoroughly you try to envision the possible permutations of how it will be utilised... the minute you release it to the general public, some sod will find a way to break it in a way that you never thought plausible.
The good news is that Battle.net is there to enforce updates upon users, in order to protect them from their own dodgy hardware. What would you do without a mandatory Internet connection, eh?
If I were to point fingers at anyone, it would be the GPU designers and their partners for designing cards that CAN fail in this way, mainly because you would have hoped this would be a fundamental consideration in their design and testing. However, in some ways I'm sure you could just refer back to the first paragraph.
As a proud owner of an Antec 1200, if my air-cooled hardware started to overheat I really would start to worry.
From my experience, I know a lot of people who play games and watch movies on their laptops in their beds. No wonder the bloody things overheat in situations such as this!
It's not the GPU designers' fault. Explain why it's their fault that Blizzard told SC2 to render a basic menu like a timedemo...
Having the ability to push a card as hard as Blizzard did is paramount; it's just that Blizzard didn't mean to do it.
Rule #1 of hardware: Instructions sent to the hardware should not physically break it.
The fact that FurMark could destroy certain cards, and SC2 can destroy others, shows corners being cut. You can remove the heatsink from a modern CPU while it's running and it won't burn up. It's as if watching a certain TV show could break your TV. If the hardware is capable of something, then every part of it should be up to it. If the heatsink or VRMs aren't good enough, then it's simply corner-cutting to save cost.
I was just stress testing my rig using StarCraft 2, when my GPU blew.
Yeah, I was playing StarCraft 2 on max settings for about 3-4 hours nonstop when all of a sudden my computer shut down. I realized it had to be my graphics card, and opened nTune to see what temperatures I was getting. My GPU peaked at around 80 degrees Celsius, so I decided to set my GTX 260's fan to a fixed 70% while playing. This also seems to happen to me with Bad Company 2.
At least now I know that it wasn't my graphics card being unable to handle new games.
How much longer would this have taken Blizzard to notice if they'd released during the (northern hemisphere) winter?
(I was really happy with how quiet my new case was, until the ambient temperature got up a bit and I had to turn the graphics card fan right up to stop in-game crashes.)
A fixed fan speed is the issue here, not Blizzard, nVidia, ATI or anyone else.
Download MSI Afterburner and set a dynamic fan profile that ramps up to 100% at high temperatures. Then play any game to your heart's content; it will never overheat as long as the cooler is up to the task (and I'm sure it is, seeing as overheating mostly occurs because people haven't set their fan speeds correctly).
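To give a rough idea (these numbers are just an illustration, not gospel), a curve along the lines of:

40°C → 40% fan, 60°C → 70% fan, 80°C → 100% fan

keeps the card quiet at idle but ramps the fan up well before the GPU gets anywhere near dangerous temperatures.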
On a side note: what Blizzard is to blame for is the inexcusable omission of anti-aliasing in a designed-for-PC game.
Although you usually don't notice in normal gameplay, there are some cinematics and areas in-game where you just get distracted looking at the jaggies.
Applying AA in the driver control panel makes the in-game map blurry.