I counter your BS with my own posted BS: http://www.pcgamer.com/2013/03/11/avalanche-playstation-4-pc/
Problem: you can only code for a specific feature set, so you would need to code directly for (say) an Intel i5 2500K on an Asus P8Z77-V LX board with an Asus HD 7870 and Corsair XMS3 1600 C9 RAM. Change the RAM and that's a different vendor ID; same with every other part.
Yup, that would be a problem. http://www.tomshardware.com/news/API-DirectX-11-Shader-Richard-Huddy-PC-gaming,12418.html I searched DX11 API performance and the very first link mentions Bit-Tech: "I certainly hear this in my conversations with games developers," he told Bit-Tech in an interview.
You change one part, you break the direct-to-metal coding. Nvidia have done direct to metal for some of their PC demos, and so has AMD, but actually coding a game that way would be a lot more difficult.
Is this equally deluded, or will it actually give 4x the power? http://www.tomshardware.com/news/Xb...ource=dlvr.it&utm_medium=twitter#xtor=RSS-181
It could play a role in multi-player games, something the current generation is hopelessly limited at!
So they aim for 1 billion lifetime sales and want to supply the server power for all those? Good luck, Microsoft! Offloading necessary computations to the cloud seems to me an absolutely terrible idea anyway. Once again, if you live in an area with a poor internet connection, you're ****ed. (but yeah, "why would you want to live there?", eh Microsoft?)
That's pretty cool. I guess you could have computations that aren't deterministic/real-time bounced away to a server farm and back, while the local machine takes care of stuff that needs to be quicker than the lag the internet would bring. Maybe it's possible to render an entire background landscape and stream that in, while the local machine handles user input and non-playable characters. There could be potential to really expand what's possible in games.
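The split described above can be sketched in a few lines. This is just an illustration, not any real console API: `remote_render` is a hypothetical stand-in for a slow cloud call, and a thread pool fakes the round trip, while the per-frame loop never blocks on the network.

```python
# Sketch: latency-sensitive work runs locally every frame, while deferrable
# work (e.g. background scenery) is bounced "away" asynchronously and picked
# up whenever it comes back.
from concurrent.futures import ThreadPoolExecutor

def remote_render(region):
    # Stand-in for a slow, non-latency-critical cloud computation.
    return f"scenery for {region}"

def local_frame(inputs):
    # Must stay fast: player input and NPCs are handled on the local machine.
    return f"frame reacting to {inputs}"

executor = ThreadPoolExecutor(max_workers=2)
pending = executor.submit(remote_render, "distant mountains")  # fire and forget

frames = []
for tick in range(3):
    frames.append(local_frame(f"input@{tick}"))  # never waits on the network

scenery = pending.result()  # stream the result in once the "cloud" answers
```

The point of the sketch is the one-way dependency: the frame loop can always run, and only the cosmetic scenery waits on the connection, which is also exactly why it degrades (scenery pops out) rather than breaks when the link drops.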
True, but then again, what developer is going to think about it that way, let alone Microsoft letting them!
Developers will use the technology (whatever it may turn out to be) if it adds something worthwhile to a game and if Microsoft give them the tools to utilise it in a straightforward manner.
I was playing Battlefield 5 when suddenly I lost internet connection and all the background scenery disappeared.
Was coming here to post this, basically. This whole cloud computing buzz is nothing but pie-in-the-sky ideology. No different from Sony pushing the Cell as being something that it most certainly wasn't.
The cloud has its time and place. But I don't want it to start brandishing other weather-related nonsense to describe it in the future... Can't log in today... totally overcast.
It seems to be how he worded it which makes me laugh. Saying 'most' clearly means all PCs out there, and that would include regular PC users who don't game (which most are). As for the EA dude, he clearly has a motive to say that. I find it funny that he says that after they said Ignite won't be on the new FIFA game, for example.
Via Linux isn't direct-to-metal coding. There are too many variables in PCs to do this; it's only possible if you've got a set spec that doesn't change.
So that's actually the one big advantage that consoles have: every eggs box is the same, every PlayStation is the same. If only they didn't use those numpty controllers. I read a bit about the new eggs box 1. They mentioned it having a gaming OS and a version of Windows, and that both could run at the same time.