Discussion in 'Article Discussion' started by Gareth Halfacree, 22 Aug 2014.
Up to 250ms of lag disappears.
sounds more like selectable parallel (or divergent) timelines than time travel...
I know, but c'mon: It's called DeLorean, Marty!
Do they do this using a flux capacitor running at 1.21 gigawatts?
Forgive my lack of understanding, but isn't the problem with streaming not the time it takes to render what is happening, but the time it takes to transmit what has happened? In other words, isn't the latency down to the network being used and not the hardware doing the rendering?
Read the white paper (or just re-read the article): DeLorean works by guessing what you're going to do. A crude example, which is a far bigger change than the team is working on: you're at the end of a corridor and can turn left or right. The cloud server renders frames for both possibilities and sends them to your client machine before you've pressed a key; when you make a decision and press left, your client software shows you the already-rendered 'he turned left' frame and discards the 'he turned right' frame. Et voilà: latency, she dun gone vanished.
In reality, the system is working on teeny-tiny per-frame differences, but that's the general principle as I understand it. There's a lot more to it, naturally - and it's all explained in the white paper. The result is that network latency of up to 250ms can be hidden entirely, making the game feel as responsive as if it were being played on a local machine, and latency higher than that - on a mobile network, for example - can be effectively reduced by 250ms (so a 500ms latency becomes a 250ms latency - still high, but half what it was).
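For anyone who finds code clearer than prose, here's a toy sketch of the general idea - my own illustration, not Microsoft's actual implementation, with made-up function names: the server renders a frame for every input it predicts the player might send, ships them all, and the client keeps only the one matching the input actually pressed.

```python
def render(state, action):
    """Stand-in for the server-side renderer: returns a 'frame' label."""
    return f"frame({state}, {action})"

def server_speculate(state, possible_actions):
    # Render one frame per predicted input, before the client has chosen.
    return {action: render(state, action) for action in possible_actions}

def client_select(speculative_frames, actual_input):
    # Show the pre-rendered frame matching the real input;
    # every other frame is simply discarded.
    return speculative_frames.get(actual_input)

# The round-trip already happened before the keypress, so the matching
# frame is available locally the instant the player decides.
frames = server_speculate("corridor_end", ["left", "right"])
print(client_select(frames, "left"))
```

The point is that the network round-trip is paid *before* the input happens, which is why it disappears from the player's perceived latency.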
Since all the possible outcomes exist until determined by user interaction, I would have called this technology Schrödinger rather than DeLorean.
Totally ingenious, but I assume the processing power and bandwidth required must be humongous? It's not just 'turn left' or 'turn right' - it's actually performing any minute amount of movement on any axis, plus any other action, then drawing and sending every possible frame (one will be used, the rest is waste, lol!). It's very much like a grand master in chess predicting all the possible moves for several turns - maybe they should call it Kasparov...
Multiply that by the 1,000s of people playing at the same time at different points in the game, and surely that would require a ridiculous amount of hardware. Maybe the likes of Nvidia could afford to build it and then lease bandwidth to others - probably not AMD, though.
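The chess analogy is apt: speculation explodes combinatorially. A rough back-of-envelope (my own numbers, not from the paper) - if the server speculates over b possible inputs per frame and looks d frames ahead, the naive branch count is b**d, which is exactly why a real system has to prune rather than render everything:

```python
def speculative_branches(inputs_per_frame: int, frames_ahead: int) -> int:
    # Naive count of rendered futures: every input choice at every
    # speculated frame multiplies the number of branches.
    return inputs_per_frame ** frames_ahead

# Even a modest 4 inputs per frame blows up fast with lookahead depth.
for depth in (1, 2, 4, 8):
    print(depth, speculative_branches(4, depth))
```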
Thanks, I was thinking the rendered frames would just sit on the server until it knew what to send; it didn't occur to me that it just sends all the possible outcomes for the next so many milliseconds to the client and lets it choose which frame to display out of all the possibilities.
Sorry, I didn't have time to read the white paper, and maybe didn't read the article thoroughly enough (or I'm just a bit dense).
I guess this would use more bandwidth if it was sending loads of frames to the client, even if only some of those frames are ever used.
There's bandwidth-management stuff in DeLorean too. Long story short: yes, it uses more bandwidth, but not *massively* more. Bear in mind that OnLive recommends a stable 5Mb/s connection for 720p, so you could triple that and still be in spitting distance of Ofcom's May 2013 national average of 14.7Mb/s.
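A quick sanity check on those figures (just the arithmetic from the numbers quoted above - OnLive's recommended 5Mb/s for 720p, tripled, against Ofcom's May 2013 UK average of 14.7Mb/s):

```python
onlive_720p = 5.0    # Mb/s, OnLive's recommended minimum for 720p
ofcom_avg = 14.7     # Mb/s, Ofcom UK average, May 2013

tripled = onlive_720p * 3
# 15.0 Mb/s - only fractionally above the national average,
# i.e. "within spitting distance" as claimed above.
print(tripled, ofcom_avg)
```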
Doubt every ISP will be happy to have you drain 5Mb/s or 15Mb/s from them constantly - more so after 6pm, when throttling kicks in across most UK ISPs.
Netflix uses 25Mb/s for its "Ultra HD" streams, so...
25megabit or megabyte?
Bit. Hence the lower case b.
In Britain, sure. There's still plenty of the US that's missing out. Actually, the US might be the odd man out on this one, I suppose. The rest of Europe and the parts of Asia that matter market-wise are likely just fine. Australia's probably pretty screwed too, though.
This suggests that the national average download speed for the US is 28.7Mb/s. Australia gets 16.2Mb/s. Granted, these are averages that will be skewed upwards by small-scale but extremely high-speed services like Google Fiber, but by those figures more homes in the US and Australia would be able to use a 15Mb/s streaming service than would not.
That said, I'd be tempted to halve the figures on offer from that site: it claims the average for the UK is nearly 30Mb/s, double that of Ofcom's average. I guess more people use Speedtest to verify that they have teh fastz0rz than to check if their connection is slow.
EDIT: Here, this might be a bit better: Akamai's State of the Internet report for Q4 2013. That says that average broadband speeds in the US broke 10Mb/s for the first time - meaning you could double the bandwidth required of a game-streaming service like OnLive and still be suitable for more homes than not. Still haven't been able to find any official figures, mind: does the FCC not produce a report like Ofcom's?