Discussion in 'Article Discussion' started by Tim S, 26 Mar 2009.
With the advancements of the last few years I've been completely left in the dust. I also believe what's been said is true and hope it is.
I personally think the 3D age is over; it's the AI and physics age now. Here we go, multicore!!
It's the "what can consoles do?" age.
As soon as the XboX 1080 and the Playstation 4 and the Weeeeeeeeeeee! come out, there will be another push for more advanced graphics to help sell those consoles.
I see your argument, but I think you'll go down in history with all the other people who thought technology had stopped.
I'd much rather devs concentrated on creating a game that's styled really well than one that just has technically good graphics. I loved looking at BioShock far more than Crysis, just because Art Deco covered in blood will always win over Jungle Island covered in blood.
Sorry, I think you must have commented on the wrong article there. Don't worry, it's easily done.
Sony and Nvidia seem to be pushing the 3D gaming angle; I reckon this, along with better AI/physics, is going to make gaming much better.
No. Graphics have not plateaued. If they have, how will nVidia, ATI and any other graphics card manufacturer that cares to join the fray continue to sell graphics cards? People will just save up for an nVidia 295 or an ATI 4870x2 and never buy another graphics card - ever. That would not be good news for the graphics card manufacturers - or us.
Graphics have a bit of a way to go before they are indistinguishable from real life. But it is possible that in a few years' time, we could be playing games that are that good. Then there's 3D graphics. Then there's holographics. Then there's...who knows?
I hope not. If you look at the best off-line rendered digital Hollywood productions, there are still many areas for improvement. I do believe that as graphics become increasingly realistic, they will become subservient to gameplay and level design. After all, as you flee a terrifying monster or drive Laguna Seca at 200mph, how much time do you have to admire incredibly intricate details?
As we enter the nano-technology era, it would be great to achieve truly portable gaming computers, visual output device included. Imagine how we will laugh in 10 years' time at the size and weight of our ATX cases and TFT monitors! From your comments, can we infer that you do not think DirectX 11 will bring a major improvement in graphics? And lastly, as you are the editor of Bit-Tech, suggest to NEC that you require a long-term test of their 30in panel - it's common practice in motoring journalism. You have to suffer for your art.
Compare the performance of the 8800GTX, 9800GTX, GTX280 and GTX285: sure, it does increase with each iteration, but the performance jump has become smaller each time.
In this respect, I do agree with the article: the technology behind graphics cards does seem to be slowing down somewhat. The question, however, is whether this is because game technology has slowed down, or whether nVidia's relative dominance of the graphics performance sector has meant that they haven't tried as hard between iterations any more.
It's also certainly true that recent games have deviated from the previous "pretty = better" and instead gone more towards "better = better", which, despite being a rather superfluous statement on its own, actually means that they're focussing on making the games more playable, more enjoyable, and simply focussing on areas of gameplay other than just graphics. This can only be a good thing, as a pretty game that needs nuclear power and a small loan to play properly (e.g. Crysis) doesn't necessarily have the same enjoyment value as something that can be run on a machine four years old (e.g. Source engine games).
So wait... your point, then, is that you'd like 2015's Atom platform to play games at full/full/full/max/max settings?
To me, it seems rather like you are hoping that no further advancements in software technology are made for games, so you can enjoy high-end gaming on mid-range hardware. It sure sounds nice, but unless I'm wrong *again* about what you are trying to say, I think you'll be disappointed.
If the software side of graphics stops, manufacturers will also stop going bigger. This means that that 2015 GPU (the ATI 9870, or the nVidia GTX1495+GSOX2-9600GT) will be no more powerful than today's generation of cards, just smaller and more economical. Plus, they will still be high-end. The only way for mid-range cards to get to that level of performance is to either play old games (Homeworld for the win!) or for hardware and software to get seriously de-synched. You seem to be hoping for that de-synch, but the problem with that is that nobody will buy high-end, and prices of mid-range will go up to the point that they are high-end again.
PC gaming graphics are still rasterisation-based. We will not truly come near a plateau until we're using ray tracing for everything you see on screen. On top of that, a great physics engine will be required. Just look at any body of moving water, or a flame, smoke or dust cloud, etc. in a current game - they're all still a long way from being as good as they could be, and won't achieve this until physics engines really start to handle it all.
All that said, though, maybe rasterised graphics are reaching their plateau - but not PC gaming graphics as a whole.
I'd have to re-read my blog again, but I'm pretty sure I didn't say any of the above.
Quote for truth
Well, it is unlikely that game AI is going to advance as fast as shaders and high-res textures do. The simple reason is that good graphics are easier to do and benefit hardware manufacturers more than better AI does.
I do hope that graphics progress does at least slow down and let the hardware take the lead again, but I definitely do not hope progress stops altogether any time soon.
Oi, please don't be such a git about it
If it's not your desire to sit in first class with 3rd class tickets, I'm seriously questioning what it is that you ARE trying to say here...
So maybe I'm just wrong about you wanting things to stop, while in fact you want software to slow down, but that doesn't negate my point: as soon as mid-range hardware is able to serve you all the possible candy, no card will ever be positioned above it, making that card high-end again, and I bet you can count on pricing to work accordingly.
I agree with the L4D + Crysis comment a little. They need to make the most of what they have, and not add and add and add to try and make a game look better. Making things "too" realistic is not the way to go at the moment; just try and make a game look nice.
Well, to be fair, if you're going to put words in people's mouths, then you've got to be prepared to take responsibility for doing so.
It depends on whether the exponential increase in the price/performance ratio of integrated circuits is enough to help GPU speed vastly outstrip gaming requirements. If game graphics plateaued roughly where they're at now, then a future chip capable of running games smoothly with full sauce at 2,560 x 1,920 with 16x AA would be very cheap to manufacture, and thus unlikely to cost large sums of money - especially considering how fierce competition is in the GPU market.
That's also before taking into consideration that rasterisation won't last forever. But that's a whole other ball park, as there are so many unknowns. I liked Sebbo's comment.
A few years of affordable 30in rasterised gaming prior to ray tracing, or whatever comes next, would be a blessing indeed. Maybe gamers would love it so much that they never bothered to spend the cash to be early adopters of ray tracing, and the technology would be a flop because we were all loving our 54in displays too much. Though that would depend on what the game devs were up to. The debate rages on.
But the problem is that we NEED software to push hardware developers to release newer products. Otherwise a hardware company will try to maximise shelf time by simply rebranding every few months (nVidia) with no real development.
Before Crysis came out, the 8800GTX sat at the top for a loooooong time, and people were saying it was enough. Only after the release of Crysis was more affordable hardware released.