I’m sure you’ll be happy to hear that Google ARCore is already fully supported
The Lume Pad already has 6DoF!
The only current limitation comes from Google: we can't do stereoscopic camera passthrough and horizontal plane detection at the same time. That means when 6DoF is on, you either render a fully virtual scene, or, if you use camera passthrough, the feed is monoscopic.
The only issue I could foresee with a browser implementation versus streaming to a native app is that I was having a hard time getting Chrome's fullscreen mode to line up with the shader.
It's like some kind of scaling or cropping gets applied that keeps the interlaced image from landing exactly on the intended screen pixels, which results in a fuzzy mess.
I'm sure it can be ironed out, or maybe isn't an issue at all, but that's the wall I hit when diving into WebGL and three.js demos.
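For what it's worth, that fuzziness sounds like the classic devicePixelRatio mismatch: if the canvas backing store isn't exactly the panel's physical size, the browser resamples the framebuffer and a column-interleaved pattern smears. In three.js the usual fix is `renderer.setPixelRatio(window.devicePixelRatio)` plus an exact `setSize`. A minimal sketch of the alignment check (the function names here are mine, not any browser API):

```python
def backing_store_size(css_width: int, css_height: int, device_pixel_ratio: float):
    """Physical pixel size the canvas backing store must have for a 1:1
    interlace mapping. Any other size means the browser rescales the
    framebuffer before scanout."""
    return round(css_width * device_pixel_ratio), round(css_height * device_pixel_ratio)

def interlace_is_pixel_exact(framebuffer_width: int, physical_width: int) -> bool:
    # A per-column interlace pattern only survives if no resampling happens,
    # i.e. one framebuffer pixel maps to exactly one physical column.
    return framebuffer_width == physical_width

w, h = backing_store_size(1280, 720, 2.0)
print(w, h)
print(interlace_is_pixel_exact(w, 2560))     # pixel-exact: pattern lines up
print(interlace_is_pixel_exact(1280, 2560))  # browser upscales 2x: fuzzy mess
```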
My guess is you'd pipe the left- and right-eye views into an interlaced output, which would let the user spectate what a VR camera sees in the game.
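The interleave step itself is simple. Here's a sketch in plain Python (2D lists standing in for GPU textures), assuming the display samples the left eye on even columns; a real lenticular panel may want the opposite parity or a slanted pattern, so treat the parity here as a placeholder:

```python
def interlace_columns(left, right):
    """Merge two equal-sized images into one column-interleaved frame.

    left/right: 2D lists of pixels (rows of columns). Even columns take
    the left eye, odd columns the right eye (an assumed convention)."""
    assert len(left) == len(right) and len(left[0]) == len(right[0])
    out = []
    for lrow, rrow in zip(left, right):
        out.append([lrow[x] if x % 2 == 0 else rrow[x] for x in range(len(lrow))])
    return out

# Toy 4x2 "images" so the pattern is visible at a glance.
L = [["L"] * 4 for _ in range(2)]
R = [["R"] * 4 for _ in range(2)]
print(interlace_columns(L, R)[0])  # ['L', 'R', 'L', 'R']
```

In a real pipeline this would be a fragment shader branching on `gl_FragCoord.x`, but the indexing logic is the same.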
Using something like ReShade, it might even be possible to fudge 4V (four-view) output in games with easily accessible geometry or depth buffers.
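The depth-buffer trick would be crude depth-image-based rendering: shift each pixel horizontally by a disparity inversely proportional to its depth to fake extra viewpoints. A hedged sketch (no hole filling, `focal` and `eye_offset` are made-up parameters, not anything from ReShade):

```python
def synthesize_view(color, depth, eye_offset, focal=100.0):
    """Synthesize a shifted viewpoint from a color image plus per-pixel depth.

    color: 2D list of pixels; depth: 2D list of positive depth values.
    Near pixels (small depth) shift further than far ones. Later writes
    win on collisions; disoccluded pixels stay None (the "holes" a real
    implementation would have to inpaint)."""
    h, w = len(color), len(color[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            disparity = int(round(eye_offset * focal / depth[y][x]))
            nx = x + disparity
            if 0 <= nx < w:
                out[y][nx] = color[y][x]
    return out

# Uniform depth 2.0 with eye_offset 0.02 -> every pixel shifts right by 1.
print(synthesize_view([[0, 1, 2, 3]], [[2.0] * 4], 0.02))
```

Running it four times with different `eye_offset` values would give you the four views; quality depends entirely on how clean the game's depth buffer is.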
This could even be augmented with, say, a Valve tracking puck on the back of the tablet, letting you use the tablet as a virtual camera into a realtime scene (WebGL, Unity, Unreal).
Some kind of inside-out tracking fused with the device gyro could also allow this to be done sans external tracking. I know you can do this with Apple's ARKit, but I haven't looked into what the SOTA is for the Android AR APIs.
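On the fusion point, the standard cheap approach is a complementary filter: integrate the fast-but-drifting gyro and slowly pull the result toward an absolute estimate (e.g. from inside-out visual tracking). A 1D sketch, where `alpha` and the toy sensor numbers are my assumptions, not values from any AR SDK:

```python
def complementary_filter(angle, gyro_rate, visual_angle, dt, alpha=0.98):
    """Fuse gyro integration (fast, drifts) with a visual estimate (absolute).

    alpha near 1 trusts the gyro short-term for low latency; the small
    (1 - alpha) term bleeds in the drift-free visual angle so accumulated
    gyro bias gets cancelled over time."""
    predicted = angle + gyro_rate * dt  # dead-reckon from the gyro
    return alpha * predicted + (1 - alpha) * visual_angle

# Toy run: the gyro reads a biased 1.1 deg/s while the true rate is 1.0 deg/s.
angle = 0.0
for step in range(100):
    truth = (step + 1) * 0.01  # true angle at dt = 0.01 s per step
    angle = complementary_filter(angle, 1.1, truth, 0.01)

pure_gyro = 1.1 * 0.01 * 100  # integration alone drifts by 0.1 deg
print(abs(angle - truth), abs(pure_gyro - truth))  # fused error is far smaller
```

Real implementations do this on quaternions (or run a full EKF/VIO pipeline, which is what ARKit and ARCore ship), but the blending idea is the same.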