
The VR thread

Discussion in 'Hardware' started by Parge, 10 Apr 2013.

  1. neil_b

    neil_b Minimodder

    Joined:
    20 Jun 2009
    Posts:
    297
    Likes Received:
    1
    Hmm, as predicted the Oculus readiness tool rejected my GPU (no surprise there), but it also rejected the CPU... :miffed:. I might run the tool again and see if it's OK with the case and power lead. Hmmph... Bah humbug, didn't want Oculus anyway. Stomps off in a sulk... :duh:
     
  2. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Don't put much stock in that tool - it just looks at your specs and matches them against the recommended spec. Even a small overclock will push it past the i5-4590 that is the recommended CPU.
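
    As a toy illustration (my own sketch, not Oculus's actual tool, and the CPU list is hypothetical), this is roughly all a spec-matching check amounts to - a model-name lookup that never measures real performance, so an overclocked older chip still 'fails':

    # Hypothetical whitelist -- illustration only, not the real tool's data.
    QUALIFYING_CPUS = {"i5-4590", "i5-6500", "i7-4790", "i7-6700"}

    def passes_cpu_check(detected_model: str) -> bool:
        # Pure model-name lookup: clock speed and overclocks never enter into it.
        return detected_model in QUALIFYING_CPUS

    print(passes_cpu_check("i7-2600K"))  # False, even with a healthy overclock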
     
  3. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69
    People who are into radio-controlled stuff, like cars and helicopters, will be able to go into the Vive or Oculus, visit open spaces and play with models of radio-controlled gear they could never afford. I know there is some software like that for a normal screen, and it's kind of cool, but in VR it would be awesome. I'm not really into R/C stuff but I do like it.

    Seconded: a 2600K with a 4GHz OC and a 970 should be more than fine. Stepping up and driving higher-end GPUs could do with a more modern processor.
     
  4. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    At 4K the difference between a 2600K and a 6600K is 0fps on the same GPU. The only way the CPU will affect fps is if the resolution is sub-1440p. VR is higher, so it should make no difference what CPU you're running.

    SLI and Crossfire both see benefits on newer CPUs tested at stock settings. An easy way to calculate it: you gain around 1-3 fps on the average score per 100MHz over a stock 2600K at 720p, the setting they test CPU performance at.

    Most 2600Ks will do 4.5GHz minimum.
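
    Putting those two numbers together (a rough estimate from the rule of thumb above, not a benchmark - a minimal sketch in Python):

    STOCK_MHZ = 3400  # i7-2600K base clock

    def estimated_fps_gain(oc_mhz: int) -> tuple:
        """(low, high) estimated average fps gain at 720p, per the rule above."""
        steps = (oc_mhz - STOCK_MHZ) / 100  # how many 100MHz steps over stock
        return (round(steps * 1), round(steps * 3))

    print(estimated_fps_gain(4500))  # a 4.5GHz OC -> roughly (11, 33) fps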

    I borrowed a 980Ti from a friend and played Elite: Dangerous in VR on a DK2 with no problem on my old i7-950 rig (the CPU is what, 6 years old at least, maybe more). That was the experience that finally convinced me VR was worth the investment.

    I will be building a new rig just before it launches. I'd say what I've said for years: above 1080p the performance benefit from a better CPU in games is very limited, outside of the big RTS titles.

    The best stock CPU for gaming is the 5775C; it's fastest at stock by a decent amount over all CPUs tested at their silly settings, due to the 128MB eDRAM chip. If the latest Skylake series had that, it would be a decent performance upgrade over Haswell.
     
  5. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69
    Last edited: 10 Jan 2016
  6. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69


    We had to make the best Rift we could because we are committed to this unit for its entire product life cycle.
     
  7. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69
  8. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Most reports put the life cycle at just under 18 months, similar to most smartphones.
     
  9. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,284
    Likes Received:
    891
    I'm not currently able to view the video, but assuming that's a direct quote, it's pretty meaningless - one definition of "product life cycle" could be "how long the developer remains committed to it". Of course, there may well be some additional context in the video that I've not seen, but on its own that quote could mean pretty much nothing.
     
  10. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69
    Anyone know if the CV1 will support the DK2 SDK - so, like, SDK 0.6 for example?

    If I let go of my DK2, which I am thinking of doing but have no idea when, or if it's even a good idea to do so atm, what happens if no race sims support the CV1 for 1, 2, 3 months? Plus factor in that if I pass it on now, we're 2-odd months out from the CV1 release.

    I guess even once the CV1 hits, DK2s will still retain some value.

    Anyone still kept their DK2?
     
  11. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    "Longer than a smartphone, shorter than a console" was the last quote from Palmer on the Rift lifecycle. My guess is going to be 18 months to 2 years, depending on what technology becomes available in that timeframe, and how long that takes to mature and integrate.
    Inside-chance: eye tracking, if not sufficient for foveated rendering it is an interesting input device (knowing where a player is looking lets you do all sorts of neat tricks. Eye-controlled menus are not one of these). Eye-tracking for VR has now been demonstrated (by SMI) at sufficient performance to be useful, so now needs to be refined to be reliable and robust. Cost should be relatively minimal, an optical eye-tracker being effectively being two cameras (nIR, high framerate, resolution not critical) and some nIR LEDs.
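
    To make the foveated rendering idea concrete, here's a minimal sketch (my own illustration - the zone radii are made-up values, not from any shipping SDK): full shading effort is spent only where the tracked gaze lands, progressively less in the periphery.

    import math

    def shading_rate(pixel, gaze, fovea_r=0.10, mid_r=0.30):
        """Fraction of full shading rate for a pixel, given the tracked gaze.
        Coordinates are normalised [0, 1] across the eye buffer."""
        d = math.dist(pixel, gaze)
        if d < fovea_r:
            return 1.0   # foveal zone: full resolution
        if d < mid_r:
            return 0.5   # transition band: half rate
        return 0.25      # periphery: quarter rate

    print(shading_rate((0.52, 0.48), gaze=(0.5, 0.5)))  # 1.0 -- inside the fovea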
    Outside-chance: wireless display link. If you'd asked me a week ago my response would have been "Hahaha, no", but Nitero have now actually shown a functional demo of sufficient performance, something that had not happened before. It's ultra-high frequency (and probably UWB), so it's effectively line-of-sight; assuming Nitero's cost estimates are accurate, the major barrier will be adding sufficient antennae to give robust reception for a moving, self-occluding target, with seamless handoff between receivers. This may be a Gen3/4 feature due to that challenge.


    As for OLED advancements: VR, at least for the next year or two, is not going to be even a single-digit percentage of the size of Samsung's phone division alone. It's a drop in the bucket compared to R&D costs for OLED panels. For the foreseeable future, the phone/tablet market is going to be the driver of OLED display advances, with VR getting to take that technology and massage it a bit into something more suitable.
    For comparison: just building an OLED fab plant is an upwards of $8 billion venture, 4 times the price Facebook paid to acquire Oculus. And that doesn't count the R&D required to develop panels, which is some cutting-edge organic chemistry (we're talking modelling quantum molecular interactions here to push forward luminous efficiency) and lots of process wizardry.

    No, CV1 support starts from SDK 1.0. Software built for SDK 0.8 and up might work, but there's no guarantee. Anything prior to SDK 0.7 is an absolute no, due to the deprecation of Extended Mode.
     
  12. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69
    Thanks edzieba, :thumb:
     
  13. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    This PCIe to USB 3.0 card has guaranteed compatibility with the Rift.

    And by extension, probably this one too.

    EDIT: Also, second AMA.

    He answers a few questions:

    On foveated rendering, the TL;DR is: it's cool, but eye tracking isn't fast enough yet for in-game use.

    On SLI: basically, in theory you could get a nice boost, but that is not the reality of the situation for current VR games (basically the same situation as non-VR games, with driver support being patchy).

    On GSync/Freesync: GPUs sync with the display directly (i.e. it's built in).
     
    Last edited: 11 Jan 2016
  14. GravitySmacked

    GravitySmacked Mostly Harmless

    Joined:
    2 Mar 2009
    Posts:
    3,933
    Likes Received:
    73
    The 4-port one works fine; the Oculus system check gives it the A-OK.
     
  15. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Cool beans.

    I had a bit of a moment where I was like... hang on, does this glorious ITX rig I've built have enough compatible USB 3.0 ports? Obviously I've no space to add an internal PCIe card. I had to go and look it up:

    • 2 x USB 3.1 Type-A Ports (10 Gb/s) (ASMedia ASM1142)
    • 4 x USB 3.0 Ports (Intel® X99)

    From what I'm reading, the Intel-branded ports are compatible.

    Some people might find that 2 or more of their Intel Host Controller ports are taken up by the front panel. In that case, something like this should do the trick.
     
    Last edited: 11 Jan 2016
  16. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Not really the same at all. For VR SLI/crossfire, the driver does effectively bugger-all. ALL the work is done by the game/engine developer in implementing multi-GPU for VR. If a developer does not implement it, there is nothing the GPU vendor (or Oculus, or anyone else) can do to 'force' VR SLI/crossfire. As VR multi-GPU relies on dispatching jobs correctly, it is almost certainly not something that can be hacked in after the fact (e.g. GeDoSaTo) in any way that actually provides a performance boost.
    Palmer's reply was a bit ambiguous. The Rift, like the Vive and PSVR, is locked to VSYNC at 90Hz. This is a requirement for Low Persistence driving; if you vary the refresh rate while only driving the display for a fixed period (1ms) per update, then the perceived brightness of the display will vary.
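
    Back-of-envelope for why (my own arithmetic, using the 1ms figure above): perceived brightness scales with duty cycle, i.e. lit time over frame time, so a varying refresh rate would make brightness visibly wander.

    PERSISTENCE_MS = 1.0  # panel lit ~1ms per refresh, per the post above

    def duty_cycle(refresh_hz: float) -> float:
        frame_ms = 1000 / refresh_hz  # time between refreshes
        return PERSISTENCE_MS / frame_ms

    print(f"90Hz: {duty_cycle(90):.1%}")  # 9.0% of the time lit
    print(f"60Hz: {duty_cycle(60):.1%}")  # 6.0% -- a third dimmer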
     
  17. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Ah, I really meant the same situation for the consumer - with patchy support. I think using the word driver made my post a bit confusing though, my bad.

    So, essentially, it updates at 90fps whatever happens... but if the in-game framerate drops below 90fps, black frames make up the remaining frames?
     
  18. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    EVE: Valkyrie should have SLI/CFX support then, as it's an exclusive launch title. The bigger games will never get SLI or CFX support; devs don't do the work now, and they are not going to do it for an even smaller audience. SLI plus VR will be a tiny, tiny % of people who own VR.

    Which is both worrying and annoying in equal measure, as you could technically get two cheaper cards, say two 970s, and get better performance than a 980Ti - but for VR the 980Ti is just the better card, period, no matter what.

    If VR takes off, SLI and CFX could both be phased out.

    It also makes all dual-GPU cards a waste of money.
     
  19. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Well actually, the theory is that VR could finally be a perfect use case for SLI, in the form of one GPU per eye, where each GPU renders the same scene from a slightly different angle. In this scenario, the likelihood of microstutter is hugely reduced, since both GPUs would probably take the same amount of time to draw the frame. The idea is that this goes some way to solving some of the current annoyances that SLI users experience. That is why AMD are marketing Gemini as a specific 'VR' card.
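
    A minimal sketch of the per-eye idea (illustration only - there's no real GPU API here, and all the names are invented): both devices render the same scene from slightly offset viewpoints in parallel, so their frame times stay matched and alternate-frame-rendering microstutter never enters.

    from concurrent.futures import ThreadPoolExecutor

    IPD_M = 0.064  # typical interpupillary distance in metres

    def render_eye(gpu_id, scene, head_x, eye_offset):
        # Stand-in for a real submit-to-GPU call on that device.
        return f"GPU{gpu_id}: {scene} from x={head_x + eye_offset:+.3f}"

    def render_stereo_frame(scene, head_x=0.0):
        # Dispatch the left and right eye to separate devices in parallel.
        with ThreadPoolExecutor(max_workers=2) as pool:
            left = pool.submit(render_eye, 0, scene, head_x, -IPD_M / 2)
            right = pool.submit(render_eye, 1, scene, head_x, +IPD_M / 2)
            return left.result(), right.result()

    print(render_stereo_frame("cockpit"))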

    It'll be some time before this is properly implemented though.
     
  20. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Which game developer has the budget and know-how to do this correctly whilst still maintaining normal SLI or CFX support?

    PC game SLI/CFX support is pretty bad already, even in sponsored games.

    Expecting developers to support a niche product in an already niche area of PC gaming seems like a long shot in the next few years. If VR on the PC side gets huge then that would maybe change, but SLI/Crossfire rigs in general are still a very small % of PCs.
     
