News Microsoft hides cloud latency with DeLorean

Discussion in 'Article Discussion' started by Gareth Halfacree, 22 Aug 2014.

  1. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,896
    Likes Received:
    979
  2. ChaosDefinesOrder

    ChaosDefinesOrder Vapourmodder

    Joined:
    6 Feb 2008
    Posts:
    706
    Likes Received:
    7
Sounds more like selectable parallel (or divergent) timelines than time travel...
     
  3. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,896
    Likes Received:
    979
    I know, but c'mon: It's called DeLorean, Marty!
     
  4. Big_malc

    Big_malc Well-Known Member

    Joined:
    7 Sep 2010
    Posts:
    1,598
    Likes Received:
    78
Do they do this using a flux capacitor running at 1.21 gigawatts?
     
  5. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,554
    Likes Received:
    203
Forgive my lack of understanding, but isn't the problem with streaming not the time it takes to render what is happening, but the time it takes to transmit what has happened? In other words, isn't the latency down to the network being used, and not the hardware doing the rendering?
     
  6. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,896
    Likes Received:
    979
    Read the white-paper (or just re-read the article): DeLorean works by guessing what you're going to do. A crude example, which is a far bigger change than the team is working on: you're at the end of the corridor and can turn left or right. The cloud server renders frames for both possibilities and sends them to your client machine before you've pressed a key; when you make a decision and press left, your client software shows you the already-rendered 'he turned left' frame and discards the 'he turned right' frame. Et voila: latency, she dun gone vanished.

    In reality, the system is working on teeny-tiny per-frame differences, but that's the general principle as I understand it. There's a lot more to it, naturally - and it's all explained in the white paper. The result is that network latency of up to 250ms can be hidden entirely to make the game feel as responsive as if it were being played on a local machine, and latency higher than that - on a mobile network, for example - can be effectively reduced by 250ms (so a 500ms latency becomes a 250ms latency - still high, but half what it was.)
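The corridor example above can be sketched in a few lines of code. This is purely illustrative of the general principle Gareth describes, not anything from the white paper itself, and all the function names are made up:

```python
# Sketch of DeLorean-style speculation: the server renders a frame for
# each plausible input before the client has chosen, and the client keeps
# only the frame matching the input actually made, discarding the rest.

def render(state, action):
    """Stand-in for the server-side renderer."""
    return f"frame[{state}+{action}]"

def speculate(state, possible_actions):
    # Server: render every candidate outcome ahead of the user's input
    # and ship them all to the client.
    return {action: render(state, action) for action in possible_actions}

def resolve(speculative_frames, chosen_action):
    # Client: display the pre-rendered frame for the chosen input.
    # The network round-trip already happened before the keypress,
    # which is why the latency appears to vanish.
    return speculative_frames[chosen_action]

frames = speculate("corridor_end", ["left", "right"])
print(resolve(frames, "left"))  # the already-rendered 'he turned left' frame
```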
     
  7. Bungletron

    Bungletron Well-Known Member

    Joined:
    25 May 2010
    Posts:
    1,164
    Likes Received:
    57
    Burn.

Since all the possible outcomes exist until determined by user interaction, I would have called this technology Schrödinger rather than DeLorean.

Totally ingenious, but I assume the processing power and bandwidth required must be humongous? It's not just turn left or turn right - it's actually perform any minute amount of movement in any axis, plus any other action, then draw and send every possible frame (one will be used, the rest are waste, lol!). It's very much like a grandmaster in chess predicting all the possible moves for several turns ahead; maybe they should call it Kasparov...
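The chess comparison can be made concrete with a back-of-the-envelope calculation: if the server speculated over every distinct input for several frames ahead, the number of rendered frames would grow exponentially. The numbers below are illustrative only, not from the white paper (which is presumably why the real system restricts itself to tiny per-frame differences and a short speculation window):

```python
# Back-of-the-envelope: one rendered frame per possible input sequence
# means exponential growth in the speculation depth. Illustrative numbers.

def speculative_frames(inputs_per_frame, frames_ahead):
    # Every combination of inputs over the window needs its own frame.
    return inputs_per_frame ** frames_ahead

for depth in (1, 2, 4, 8):
    print(depth, speculative_frames(8, depth))
# depth 8 with just 8 distinct inputs is already ~16.8 million frames
```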
     
  8. Umbra

    Umbra New Member

    Joined:
    18 Nov 2013
    Posts:
    636
    Likes Received:
    17
Multiply that by the thousands of people playing at the same time at different points in the game, and surely it would require a ridiculous amount of hardware. Maybe the likes of Nvidia could afford to build it and then lease capacity to others - probably not AMD, though :lol:
     
  9. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,554
    Likes Received:
    203
Thanks, I was thinking the rendered frames would just sit on the server until it knew what to send; it didn't occur to me that it sends all possible outcomes for the next so-many milliseconds to the client and lets it choose which frame to display out of all the possibilities.

Sorry, I didn't have time to read the white-paper, and maybe didn't read the article thoroughly enough (or I'm just a bit dense) :D

I guess this would use more bandwidth if it's sending loads of frames to the client, even though only some of those frames are ever used.
     
  10. ChaosDefinesOrder

    ChaosDefinesOrder Vapourmodder

    Joined:
    6 Feb 2008
    Posts:
    706
    Likes Received:
    7
    Heisenberg, surely?
     
  11. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,896
    Likes Received:
    979
    There's bandwidth-management stuff in DeLorean too. Long story short: yes, it uses more bandwidth, but not *massively* more. Bear in mind that OnLive recommends a stable 5Mb/s connection for 720p, so you could triple that and still be in spitting distance of Ofcom's May 2013 national average of 14.7Mb/s.
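The arithmetic behind that claim, using only the figures quoted above (OnLive's recommended 5Mb/s for 720p and Ofcom's May 2013 average of 14.7Mb/s):

```python
# Sanity check on the bandwidth claim: triple OnLive's recommended 720p
# stream and compare it against Ofcom's May 2013 UK national average.

onlive_720p = 5.0      # Mb/s, OnLive's recommended stable connection for 720p
ofcom_average = 14.7   # Mb/s, Ofcom UK national average, May 2013

tripled_stream = onlive_720p * 3
print(tripled_stream)                  # 15.0 Mb/s
print(tripled_stream - ofcom_average)  # 0.3 Mb/s over the average -
                                       # 'spitting distance', as above
```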
     
  12. rollo

    rollo Well-Known Member

    Joined:
    16 May 2008
    Posts:
    7,691
    Likes Received:
    98
Doubt every ISP will be happy to have you drain 5Mb/s or 15Mb/s from them continuously - more so after 6pm, when throttling kicks in across most UK ISPs.
     
  13. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,896
    Likes Received:
    979
    Netflix uses 25Mb/s for its "Ultra HD" streams, so...
     
  14. rollo

    rollo Well-Known Member

    Joined:
    16 May 2008
    Posts:
    7,691
    Likes Received:
    98
    25megabit or megabyte?
     
  15. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,896
    Likes Received:
    979
    Bit. Hence the lower case b.
     
  16. fluxtatic

    fluxtatic New Member

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
In Britain, sure. There's still plenty of the US that's missing out. Actually, the US might be the odd man out on this one, I suppose. The rest of Europe and the parts of Asia that matter market-wise are likely just fine. Australia's probably pretty screwed too, though, I suppose.
     
  17. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,896
    Likes Received:
    979
This suggests that the national average download speed for the US is 28.7Mb/s, and Australia gets 16.2Mb/s. Granted, these are averages that will be skewed upwards by small-scale but extremely high-speed services like Google Fiber, but by those figures more homes in the US and Australia would be able to use a 15Mb/s streaming service than would not.

    That said, I'd be tempted to halve the figures on offer from that site: it claims the average for the UK is nearly 30Mb/s, double that of Ofcom's average. I guess more people use Speedtest to verify that they have teh fastz0rz than to check if their connection is slow.

EDIT: Here, this might be a bit better: Akamai's State of the Internet report for Q4 2013. That says average broadband speeds in the US broke 10Mb/s for the first time - meaning you could double the bandwidth required by a game-streaming service like OnLive and still suit more homes than not. Still haven't been able to find any official figures, mind: does the FCC not produce a report like Ofcom's?
     
    Last edited: 24 Aug 2014
