Discussion in 'Hardware' started by Parge, 10 Apr 2013.
Makes you wonder why so few VR games support SLI/Crossfire.
Maybe I should have worded it as a higher PPI, which in turn would mean a higher resolution. More FOV is another thing we all want. Pimax have achieved that with their 8K model, with a 200° FOV and a pair of 4K panels.
I read that Nvidia had to re-work SLI so one GPU renders the left eye, and the other the right eye. But it's down to developers to add the support to their games too.
In other news, my hardly used Vive is listed in the classifieds. I'm sticking with the Rift, only because I got prescription lenses for it, and I use it for Assetto Corsa / Project Cars 2.
Because it's really fecking hard, and doesn't gain you as much as you'd expect.
VR is limited by latency. You want it as low as possible, and MUST be under 20ms between head movement and the next screen refresh ('motion to photons' latency). In practical terms, that gives you around 17ms to render for OVR, and around 11ms to render for SteamVR. This means AFR (Alternate Frame Rendering), the process by which every SLI and crossfire game renders, is not an option.
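To put rough numbers on that budget, here's a back-of-envelope sketch, assuming a 90 Hz headset (the refresh rate and the 20ms cap are taken from the figures above; the per-frame arithmetic is mine, not official numbers):

```python
# Latency budget sketch for a 90 Hz headset (assumed refresh rate).
REFRESH_HZ = 90
frame_time_ms = 1000 / REFRESH_HZ          # ~11.1 ms between refreshes

MOTION_TO_PHOTONS_CAP_MS = 20              # comfort ceiling quoted above

# With AFR, each GPU works on alternate frames, so a displayed frame was
# started at least one full frame interval earlier: the head pose it was
# rendered from is one extra refresh stale.
single_gpu_latency_ms = frame_time_ms      # best case, render just-in-time
afr_latency_ms = 2 * frame_time_ms         # one extra frame in flight

print(f"frame time: {frame_time_ms:.1f} ms")
print(f"AFR pose staleness: {afr_latency_ms:.1f} ms")
print("AFR fits the 20 ms budget?", afr_latency_ms <= MOTION_TO_PHOTONS_CAP_MS)
```

At 90 Hz the extra frame of buffering alone pushes AFR past the 20ms ceiling before any render or scan-out time is counted, which is why it's a non-starter for VR.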
For a single GPU, you can do fun things like sharing the geometry calculations for both eyes (because you're looking at the same scene from different angles) and only splitting the unique rasterisation operations. But for multi-GPU you now have a choice:
- Do the geometry operations on each GPU, replicating all the work. This wastes power, gives no speed gain for geometry operations, and you need to ensure the duplicated geometry operations stay in sync (e.g. don't let one GPU sample a buffer, have that buffer updated, then let the second GPU sample it), or you end up with objects looking different to each eye.
- Do the geometry operations on one GPU, then copy the result to the other. One GPU now twiddles its thumbs (no speedup) and then needs to wait for the copy to complete before starting raster operations.
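The trade-off between the two options can be sketched with a toy timing model (all the millisecond costs below are invented for illustration; real numbers depend entirely on the scene and hardware):

```python
# Toy per-frame timing model; GEO/RASTER/COPY costs are made-up figures.
GEO_MS = 3.0          # geometry work for the scene
RASTER_EYE_MS = 6.0   # rasterising one eye
COPY_MS = 1.5         # inter-GPU transfer of geometry results

# Option 1: both GPUs duplicate the geometry, each rasterises one eye.
option1_ms = GEO_MS + RASTER_EYE_MS            # 9.0 ms, geometry done twice

# Option 2: GPU0 does geometry, copies it over, then both rasterise.
option2_ms = GEO_MS + COPY_MS + RASTER_EYE_MS  # 10.5 ms, GPU1 idles early on

# Single GPU sharing geometry across eyes, rasterising both in turn.
single_ms = GEO_MS + 2 * RASTER_EYE_MS         # 15.0 ms

print(option1_ms, option2_ms, single_ms)
```

Either way the frame time doesn't halve: with these assumed numbers the second GPU buys you roughly a third off the single-GPU frame, not the 2x that naive scaling would suggest.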
And then once everything is rendered, the half-completed scene needs to get pushed back to one GPU to be composited, warped, and then output. With all the overhead, you don't get anywhere CLOSE to 100% scaling, but more like ~30% best-case.
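That ~30% figure is what an Amdahl's-law view would predict: only part of the frame is parallelisable across GPUs, while geometry duplication, inter-GPU copies, compositing and warping stay effectively serial. A minimal sketch (the parallel fraction is an assumption, not a measured value):

```python
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / n)
def speedup(parallel_fraction, n_gpus=2):
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / n_gpus)

# If roughly half the frame time is parallelisable raster work (assumed):
print(speedup(0.5))   # ~1.33x, i.e. ~30% faster, nowhere near 2x
```

Even a generous 70% parallel fraction only yields about 1.5x, which is why two GPUs never come close to doubling VR frame rates.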
And then you have fancier functions like ASW (Asynchronous SpaceWarp), where the eye buffer gets sent through part of the GPU's video-encoding pipeline, using that fixed-function block to extract motion vector data, and that motion vector data is used to extrapolate a new synthetic frame in the case of a dropped or late frame. Done on one GPU, you have access to both eye buffers to use for stereo estimation and to 'paste' missing pixels from one buffer to the other as objects rotate. With two separate GPUs, that now requires duplicating operations and keeping two chunks of memory in sync while doing a bunch of inter-GPU copying.
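The extrapolation idea can be illustrated with a greatly simplified toy (this is not Oculus's implementation; the tiny frame, the uniform motion vector and the hole handling are all invented for illustration):

```python
import numpy as np

# Toy frame extrapolation: shift each pixel along its motion vector to
# synthesise a stand-in frame, loosely analogous to what ASW does with
# encoder-derived motion vectors. 4x4 greyscale frame, uniform motion.
frame = np.arange(16, dtype=float).reshape(4, 4)
motion = (0, 1)                     # every pixel moving +1 column per frame

synthetic = np.zeros_like(frame)
h, w = frame.shape
for y in range(h):
    for x in range(w):
        ny, nx = y + motion[0], x + motion[1]
        if 0 <= ny < h and 0 <= nx < w:
            synthetic[ny, nx] = frame[y, x]

# Column 0 is now a 'hole' (disoccluded pixels with no source data).
# On a single GPU, ASW-style code can borrow candidate pixels from the
# other eye's buffer; across two GPUs that means extra copies and syncing.
print(synthetic)
```

The hole-filling step is exactly where having both eye buffers resident on one GPU pays off, and where a split-GPU setup starts shuffling memory back and forth.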
And because all this ties deep into the render pipeline, unlike 'normal' SLI/Crossfire it cannot be 'applied' to a game at the driver level. It needs to be designed in from the start. That means all your work optimising your game for minimum latency (find the pipeline bottleneck, relieve it, identify the new bottleneck, etc.) needs to be redone all over again, because you now have a totally different set of latency constraints. That's a bunch of extra work for the game/engine developers, catering for a tiny portion of the market for only a small performance improvement.
This probably isn't going to change until GPU architecture changes radically from throughput-optimised (as is currently the case, and as is needed for GPGPU) to latency-optimised. That's a 5-10 year timescale even assuming GPU developers are willing to bet on VR being a big enough market by then to justify upending the last two decades of GPU design.
I certainly do wear glasses. I think a hell of a lot of research is needed on my part. Thanks for your help.
I have worn glasses with my Rift, and it's fine for short usage, but they're the small rectangular type glasses.
I mainly wear contact lenses, which is much better.
As others have mentioned, you can get hold of prescription lenses that clip onto the Oculus.
Just bought a Rift from John Lewis; let's hope it is as much fun as it seems. Will need to look into getting a prescription lens insert thingy though.
You won't regret it, especially in Project Cars 2 and Lone Echo. What seems like 5 mins in VR is about 1 hr in real time.
If you have any trouble setting it up, especially if you start using USB extensions etc, (Only certain ones work) or need advice on what I use, then give me a shout.
Also, depending on your motherboard, you may need to buy an Inateck PCI-E USB 3.0 card (about £20) to spread the bandwidth, as these sensors push the bandwidth of the onboard chipset to its limit.
What do you mean by only certain USB extensions work?
Just checked what I have in my cable box, and all I have is a 5m USB 2 Active Extension Cable. Is that any use?
Mobo is a Gigabyte GA-Z170-Gaming K3.
When you start using extension cables, you really want all your sensors running at USB 3.0, otherwise you will start to have tracking issues.
So, extension cables as per what Oculus have recommended:
PCI-E USB 3.0 card also recommended by Oculus.
My Oculus ran flawlessly on my MSI laptop, but now I'm back on a desktop, my Z87 chipset was a pig to get it all working at USB 3.0, as having the two sensors and the headset all connected to the same USB host caused endless problems due to bandwidth limitations. No other peripherals utilise the full bandwidth the way these do, which is why it's recommended to spread the load over another USB host controller.
So I have the two sensors connected to the Inateck card and the headset connected to the onboard chipset. This spreads the bandwidth, and everything runs flawlessly at USB 3.0 speeds.
My advice is: plug it in and have a play. If you have issues getting past the setup due to bandwidth limitations, then the above info will help you get the best experience from it.
EDIT: Don't skimp and buy a cheap USB host controller, as that will not work. (I found out the hard way...) Oculus recommend the Fresco FL1000 host controller, as linked above.
I would rather have the extra kit to hand than wait to see if there are issues, and it is not as if a USB card would not be useful anyway, so I will just order them now, given the elongated delivery times at the moment.
Also, any HDMI 1.4 or newer cable will work if you plan on extending the headset.
OK, well if you use the card, then don't plug anything else into it, just the two sensors: one in port 1 and the other in port 3 works best.
Those of you who wear varifocals have you bought lenses to fit the Oculus and if so where from?
If the GA-Z170-Gaming K3 listed in your sig is still correct, the extra USB PCIe card is not required unless you have literally run out of USB ports and don't want to use a hub. The USB3 controller built into the Z170 chipset is more than capable of hosting 4 cameras, the Rift, and any other USB devices you might throw at it via a hub (it's how I'm running things with an ITX Z170 board).
I have run out of USB slots so the USB card will be useful.
Not at all clued up on the differences between lenses, but wouldn't a normal single-focus lens be better?
I don't understand; I need varifocals to see properly.
Go to your opticians and try some contact lenses. Best thing I ever did. I have continuous wear ones, which I sleep in for a month at a time. (Once you can overcome sticking your finger in your eye, it's quite easy.)
That is never going to happen, as I have a serious phobia about sticking anything in my eye (that, and needles), which means that when I eventually have my cataract operations I will have to have them done under a general anaesthetic.