Discussion in 'Article Discussion' started by bit-tech, 12 Jul 2018.
Had to laugh at the on-stream 'demo'.
1) It's not a view through the optics, it's a completely synthetic view. Remember the Hololens 'demos' that used a high-def camera and a field of view MUCH higher than they actually achieved?
2) The tracking quality is frankly embarrassing. It's well behind ARKit and ARCore, let alone Hololens' tracking (which is itself only barely adequate for its small FoV)
3) Objects have a very loose relationship with the environment. Notice they only demo two objects: the character, who is locked to a flat plane (and only shown floating over featureless, level surfaces, so any relative shift - particularly z-shift - is difficult to discern), and the rocks, which float through space without interacting with anything
4) Complete failure of any sort of occlusion (no object segmentation). Even with a fixed camera feed, no translucency issues from a see-through display to deal with, no optical alignment to worry about (thanks to the synthetic image), and the hands supposedly already tracked, the hands completely failed to even attempt to occlude any virtual objects
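To spell out why point 4 is so damning: once you have a synthetic composited feed and a hand mask from the (supposedly working) hand tracker, occlusion is little more than zeroing the virtual layer's alpha wherever the mask says a real hand is in front. A minimal sketch of that idea in NumPy (hypothetical function, not anyone's actual pipeline):

```python
import numpy as np

def composite_with_occlusion(camera, virtual, virtual_alpha, hand_mask):
    """Blend a rendered virtual layer over a camera frame, letting the
    hand occlude the virtual content.

    camera       -- (H, W, 3) float camera frame
    virtual      -- (H, W, 3) float rendered virtual layer
    virtual_alpha-- (H, W) coverage of the virtual layer, 0..1
    hand_mask    -- (H, W) binary mask, 1 where a real hand is in front
    """
    # Kill virtual coverage wherever the hand mask is set, so the
    # camera pixels (the hand) show through the virtual object.
    alpha = virtual_alpha * (1.0 - hand_mask)
    return camera * (1.0 - alpha[..., None]) + virtual * alpha[..., None]
```

That's the whole trick for a fixed, synthetic view; the hard part (which they also didn't show) is producing a good hand mask in the first place.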
5) Hand tracking. Fails to even meet the standard of the Leap Motion running Orion (much less the more recent version using the Dragonfly hardware)
In short, what they demonstrated is many years behind what everyone else has had available, and what they haven't shown is the one element (display & optics) whose supposedly revolutionary advancement Magic Leap was started up to capitalise on.
Even as a devkit, the Leap Motion Northstar embarrasses the hell out of it in every respect: better tracking, better software (check out the hand-self-interaction demo series on RoadToVR), and better displays (higher FoV & angular resolution).
I'm not loving the el cheapo early 2000s design.