Discussion in 'Article Discussion' started by CardJoe, 9 Jun 2011.
But...id haven't made a decent game in years.
If that's the case then the next round of consoles will need to be loaded with lots of RAM to help extend the lifespan, as the most common complaint from most devs seems to be the lack of RAM.
It's taken you 6 years to "focus on making" what looks to be a dumbed-down Borderlands with zombies, id.
I'm not sure the breakneck pace of console development (really? I mean, REALLY?) is what's stopping you being creative.
Let's see... Nintendo, Sony and Microsoft (the current console-brands) have and had lifecycles of some six years. If this isn't going to change, then I don't really see a problem there, as six years is a very long lifespan for electronics these days.
I see more of a problem on the developers' side, who simply aren't that good at developing games. It takes them way too long because they focus too much on graphics. Graphics aren't that important, though; what we want are fun-to-play, innovative and interesting games.
They apparently cut RAM out of the Vita to keep the price the same as the 3DS. Usually I would think that wouldn't matter, since handhelds never used to have direct competition with an upgradable system (e.g. PC) outdoing it, but now they have upgraded SKUs on iPhones most years...
I expect cell phone games to start surpassing some Vita games graphically in a few years. Man that feels weird to say.
On topic, I agree with Spuzzell. Rage should have been out 2 years ago when I might have cared. Now their tech isn't even impressive (see Battlefield 3).
Yep, would have to agree
Yeah... let's keep consoles for decades... freaking old hardware like now...
Stop bitching and start releasing games faster (or at least don't take a century to release one).
That's just a rumour that was debunked a while ago by Sony.
The hardware of current consoles is actually fast enough for games. They just need to release good games instead of only pushing graphics.
I can't say it often enough, but gameplay > graphics!
By that logic we should still have a PS1, or maybe an SNES. A 6 year cycle is a long time. I certainly don't run a video game company, but working in IT, if it took us more than 6 years to develop and release a decent product we'd be hammered. Really, unless you are attempting to recreate a unique sandbox the size of the moon (and all pre-coded), the thing really shouldn't take you more than 2 years tops once you have final dev hardware. If it does, either you've got a bad team dynamic, poor leadership, a lack of experience and skill, or bad upper management.
6 years is plenty long enough, thank you very much. If you want console games to even smell like the PC variant, let alone look remotely like it, anything longer than 6 years won't allow you to do that. Sure, you can get away with more on a console because you can code directly to the hardware, but 6 years of CPU and GPU advances is a hell of a lot, let alone longer than that.
But nobody is focusing on making better games.
I want to lose my memory and go back to November 1998; there were more classic games released in that one month than were released in the last 5 years.
They just want a longer lifetime so that they can actually RELEASE a single game in the lifespan of the hardware
This seems to be the sad truth; as rose-tinted as it risks being.
Courtesy of RPS.
Just thought the above quote might be of some interest for this topic. I get the impression that the creative side of the team (and probably the beancounters) might be content with aging console hardware for the sake of stability, but that the technical gurus like Carmack are tired of trying to cram their ideas into whatever an X360 can handle. Or maybe I'm reading his words out of context and with the bias of a PC gamer. It could be that Carmack enjoys the task of coding the engine efficiently to achieve multiplatform parity; I don't really know.
On the other hand, the last paragraph highlights the other big issue with multiplatform development, the same issue raised by AMD recently (and later retracted, if I recall correctly): that the overheads caused by DirectX and OpenGL bloat are holding back the performance of PC games compared to coding to the metal as on consoles.
As someone on RPS pointed out in the comments though:
There's been a lot of sound and fury over whether this is actually a worthy topic of discussion, or if it ever makes sense to think of a gaming-centric OS distribution to cut that bloat and overhead out; but it certainly makes me wonder.
I remember seeing launch titles on the PS2 and thinking "wtf. My PC is considerably more powerful than a PS2, with a faster processor, faster dedicated GPU and way more volatile storage. Why can't I get visuals like that in my PC games? Is there really so much processor time being used by Windows, an array of drivers and a bunch of background processes that do things I don't need to run my games?"
The answer is probably 'no'; though that has some impact, much of the performance difference likely comes down to the overhead caused by abstraction in DirectX/OpenGL and so on: a combination of the problems of hardware variety (something many of us here would hate to lose) and Windows-based PCs being 'everymachines'.
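As a toy illustration of that abstraction-overhead argument (the numbers below are made up for the sketch, not measurements of any real driver or API), the usual way it's framed is that every draw call carries a fixed CPU cost on top of the per-triangle work, and a thick validation/translation layer makes that fixed cost much larger than a to-the-metal console path:

```python
# Toy cost model for draw-call submission overhead.
# All constants are illustrative assumptions, not real profiling data.

def frame_cpu_cost_us(num_draw_calls, triangles, per_call_overhead_us, per_triangle_us):
    """Estimated CPU time to submit one frame's geometry, in microseconds."""
    return num_draw_calls * per_call_overhead_us + triangles * per_triangle_us

TRIANGLES = 1_000_000
DRAW_CALLS = 5_000

# Thick PC API layer (validation, state translation): high per-call cost.
pc_cost = frame_cpu_cost_us(DRAW_CALLS, TRIANGLES,
                            per_call_overhead_us=20.0, per_triangle_us=0.001)

# "To the metal" console path: same geometry, far cheaper call dispatch.
console_cost = frame_cpu_cost_us(DRAW_CALLS, TRIANGLES,
                                 per_call_overhead_us=1.0, per_triangle_us=0.001)

print(f"PC-style submission:      {pc_cost / 1000:.1f} ms of CPU per frame")
print(f"Console-style submission: {console_cost / 1000:.1f} ms of CPU per frame")
```

With these invented numbers the identical scene costs 101 ms versus 6 ms of CPU time per frame, which is the shape of the argument: the geometry is the same, only the fixed per-call tax differs.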
Additionally, we know that console games cut corners to achieve those visuals at solid framerates. When we were playing Morrowind at 1600x1200 on PC, the PS2 was rendering at 640x448; and the parity in resolution between console and PC games these days (pushed by the 'HD' fad) is showing up the inadequacies of the consoles.
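For a sense of scale, a quick back-of-the-envelope pixel count for the two resolutions mentioned above:

```python
# Pixels per frame at the two resolutions from the post.
pc_pixels = 1600 * 1200   # 1,920,000 pixels
ps2_pixels = 640 * 448    # 286,720 pixels

print(f"PC:    {pc_pixels:,} pixels per frame")
print(f"PS2:   {ps2_pixels:,} pixels per frame")
print(f"Ratio: {pc_pixels / ps2_pixels:.1f}x")  # roughly 6.7x as many pixels
```

So the PC was pushing roughly 6.7 times as many pixels per frame; once consoles target the same 'HD' resolutions, that corner is no longer available to cut.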
I've still always wished I could have an x86 gaming OS to get the best out of my PC for games without just turning it into a console with a keyboard and mouse, but we certainly won't see that from Microsoft any time soon, judging by their attitude to Windows gaming; and as much as I'd love to see Linux filling those shoes, I can't see it actually happening any time soon, if ever.
Somewhat amusing coming from the company that was all tech and no design for so many years.
I understand where he's coming from, but the truth is devs haven't been polygon-chasing for a few years. The best-looking game ever released (in terms of engine power, if not design) is arguably still Crysis, and that came out four years ago.
No doubt this is driven by the locked down consoles, but there is the fear that when a next-gen console comes out you'll get two years of devs simply getting to grips with the added power instead of making good games.
Wait, this is coming from id? A company whose every employee eats ones and zeros and shits machine code asking for easier dev conditions?
"Stability"?!?! Are you freaking kidding me? Xbox and PS3 are already what, like 5, 6 years old? That's pretty much 10 average PC upgrade cycles. How much longer does id need a console to stick around before they get off their asses and release something decent for it? Rage just looks like a complete ripoff of Borderlands anyway.
id are totally overestimated; one-trick ponies, imo.
Rage doesn't look like Borderlands to me... totally different art style. Remember, in Borderlands you ran around with a ***** shooting vaginas... yes, that's exactly what it was.
I actually liked the graphics of Borderlands, and fortunately Firefall will use the same art style. Gameplay needs to be fun; graphics are overrated.