
News VR bods partner for VirtualLink cable spec

Discussion in 'Article Discussion' started by bit-tech, 18 Jul 2018.

  1. bit-tech

    bit-tech Supreme Overlord Staff Administrator

    Joined:
    12 Mar 2001
    Posts:
    1,595
    Likes Received:
    28
     
  2. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,750
    Likes Received:
    173
    I'll have to twiddle my thumbs until access to the provisional spec is granted, but from the public description it sounds like a formal subset of Type C DisplayPort Alternate Mode with some additional requirements added (e.g. minimum power outputs, minimum DP bandwidths, almost certainly some host-side implementation requirements for latency). Nvidia were previously rumoured to be adding a 'VR specific' output for the Turing/Volta consumer cards, so chances are pretty high this is it (likely as part of the superset of DisplayPort Alternate Mode that's now proliferating to other monitors).
    What it doesn't look like, for good or ill, is an additional DP link clock above HBR3. The good news is that this keeps it compliant with other ratified standards (rather than pre-spec DP 1.5's not-yet-final HBR4); the bad news is that it's not as useful for pushing existing planar monitors like the new G-Sync HDR screens, which hit bandwidth starvation when you shove everything up to 11.
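    A rough back-of-the-envelope sketch of why HBR3 starves those panels (my own illustrative numbers, not from the post; HBR3 is 8.1 Gbit/s per lane with 8b/10b encoding, and the ~6% blanking figure assumes CVT-R2-style reduced blanking):

```python
def dp_payload_gbps(lanes=4, lane_rate_gbps=8.1, encoding_efficiency=0.8):
    """Usable DP bandwidth; HBR3 runs 8.1 Gbit/s/lane with 8b/10b encoding."""
    return lanes * lane_rate_gbps * encoding_efficiency

def video_gbps(w, h, hz, bits_per_pixel, blanking_overhead=1.06):
    """Approximate video stream rate; reduced blanking adds roughly 6%."""
    return w * h * hz * bits_per_pixel * blanking_overhead / 1e9

hbr3 = dp_payload_gbps()                # ~25.9 Gbit/s usable
need = video_gbps(3840, 2160, 144, 30)  # 4K 144 Hz, 10-bit RGB: ~38 Gbit/s
print(f"HBR3 payload {hbr3:.1f} Gbit/s vs {need:.1f} Gbit/s needed")
print("starved" if need > hbr3 else "fits")
```

    This is why those G-Sync HDR screens fall back to chroma subsampling at their top refresh rates.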
    A naive glance may make it look like it will bandwidth-starve VR too, but that would only happen if everyone sticks with 'dumb planar transport' rather than taking advantage of DP's Multi Stream Transport to do hardware-level fixed-foveated rendering: transport a full-density view for the centre of the panel, then either segment the surroundings into multiple sub-resolution (and probably non-square-pixel) streams or send a single secondary stream using a subdivision method like chequerboarding, interlacing, MUSE, etc. Valve demoed chequerboard periphery rendering (but not transport) a few years ago, and back at OC1 Carmack was talking about interlacing methods for transport and display.
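    To put hypothetical numbers on the foveated-transport idea: ship the centre region at native density and the periphery downscaled over a separate MST stream. Panel size, fovea fraction, and periphery scale below are all made-up illustration values, not anything from the spec:

```python
def pixels_full(w, h):
    """Pixels transported with dumb planar transport."""
    return w * h

def pixels_foveated(w, h, fovea_frac=0.25, periphery_scale=0.5):
    """Centre region at native density, periphery downscaled per axis."""
    fovea = (w * fovea_frac) * (h * fovea_frac)
    periphery = (w * h - fovea) * periphery_scale ** 2
    return fovea + periphery

w, h = 2160, 2160  # one eye of a hypothetical next-gen panel
ratio = pixels_foveated(w, h) / pixels_full(w, h)
print(f"foveated transport ships {ratio:.0%} of the full-density pixels")
```

    Even with a generous central region, the transported pixel count (and so link bandwidth) drops to under a third of planar transport.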

    ::EDIT:: OK, spec access acquired. It's actually a bit more interesting than just a DPAM subset: the USB 2.0 lanes are repurposed as USB 3.1 lanes to make room for the 4x DP channels, meaning the HMD end of the cable MUST be captive (this is explicitly confirmed as part of the spec, for conformity with the Type C remap ECN). Also for Type C conformity is the requirement that HMDs do not expose downstream-facing USB ports (i.e. because they advertise as their own special-purpose device class, they can't 'pass on' that of any devices connected to them, so they need to incorporate all internal USB devices as a fixed set advertised on initial connection). Reference cable length is 5m, and all cables include active signal conditioners (similar to the Rift cable; the Vive broke the signal conditioning out into the Link Box). The spec includes a host-side adapter that takes a DP output and a USB 3.1 port on one side and presents a VirtualLink Type C port on the other, for host devices that lack a dedicated VirtualLink port of their own. I hope that regular orientation-independent DPAM is also a possible mode choice for said adapter; DPAM adapters are currently mythical fairy dust, which makes using Type-C DPAM monitors on PCs a real pain.
     
    Last edited: 18 Jul 2018
  3. Pete J

    Pete J RIP Teelzebub

    Joined:
    28 Sep 2009
    Posts:
    5,256
    Likes Received:
    299
    Maybe I'm just being dumb, but considering the push for wireless VR, isn't this a bit redundant?
     
  4. jb0

    jb0 Active Member

    Joined:
    8 Apr 2012
    Posts:
    359
    Likes Received:
    32
    Not really. The whole "three plugs is too complicated, we need to get it down to one" argument works exactly as well for wireless VR as it does wired.
    A wireless base station for a VR headset still needs the same video feed, bidirectional data connection, and power source that a tethered headset does.
     
  5. edzieba

    edzieba Virtual Realist

    No, because we are far from viable wireless video for even current gen devices.
    For VR, you have a latency budget of 20ms for "motion to photons": from the time you sample head pose to the time the panel needs to be emitting light based on that updated pose. Every single millisecond of fixed overhead - sample IMU and tracker data, transport those samples over USB, OS overhead, pose calculation from sensor fusion, compositing rendered layers, post-render warp of rendered image, readout of framebuffer for transport, transport and read-in at display end, feed to the actual row and column drivers to compensate for pixel nonlinearity (based on past activation too), and the wait for all pixels to be readied before pulsing the display - eats into the time you have to actually render the image. Current systems have a fixed overhead of around 9ms (OVR) to around 13ms (SteamVR). This gives a 'render budget' of ~7-11ms.
    Rendering is the only portion whose duration you control (by increasing or decreasing scene complexity). Every millisecond you spend on transport is a millisecond you must lose from your render budget to remain within the 20ms limit. And worse, latency added during transport is latency added after the last possible late sampling of the IMU to compensate for head motion, so every ms of transport latency added is a ms of directly visible delay in head movement.

    Current hacked-together wireless video solutions like TPCast (based on SiBeam's WHD modules; Lattice Semi claim only 'sub-frame' latency, i.e. no more than 16ms, and TPCast's marketed claim of "<2ms transmission latency" is super cheeky in that it ignores all the overhead beyond the actual A-to-B transmission time) can easily bust you out of the 20ms worst-case latency budget. (Incidentally, the SiBeam WHD modules were designed for stationary applications with mains power available, which is why they have such terrible battery life and need mucking about with a dedicated router to handle all the side-channel data without interference from other devices.) Even upcoming solutions that claim an end-to-end latency of ~3-5ms are still eating a massive chunk of your precious render budget (e.g. a worst-case 5ms impact on a 7ms budget means you have less than HALF the render time available, so scene quality must more than halve to compensate).

    tl;dr Wireless transmission is too slow even for today's devices.
     
  6. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    4,178
    Likes Received:
    242
    According to rumours, the upcoming Nvidia GPUs (whenever they finally come out) will have HDMI 2.1, so it should be fine to disregard the needs of monitors when it comes to making a new connection for VR.
    (Yes, I know that doesn't really fix it, because a new monitor revision with HDMI 2.1 will also be required.)
     
  7. RedFlames

    RedFlames ...is not a Belgian football team

    Joined:
    23 Apr 2009
    Posts:
    10,014
    Likes Received:
    939
    So, is this basically Thunderbolt without the backhander to Intel?
     
  8. edzieba

    edzieba Virtual Realist

    It still irks me that HDMI gained any traction at all as an interconnect for desktop monitors. DisplayPort for monitors, HDMI for AV equipment, and everything works great. Start mixing the two (mainly using noncompliant HDMI modes with implicitly different levels) and you just get problems.
     