Is there currently software that allows the Lume Pad to function as a stereoscopic monitor for a PC, either wirelessly or over USB?
Is there an API to convert any of the HDMI 1.4 standard 3D formats (SBS, etc.) to a format that can be displayed on the Lume Pad in real time, either on the PC side or the Lume Pad side?
The feedback has been heard and the team has been working on a solution to this for quite some time. However, it’s a very large feature and will take us some time, so I don’t have an ETA for you.
HDMI 3D almost always has DRM, so there’s not really a good way to get (for example) 3D games on PlayStation 3 running on Lume Pad.
There has been a solution that the awesome @jakedowns made to play back SBS video streamed from a PC to Lume Pad, though this is also probably not exactly what you’re looking for.
So I guess the official answer is: we’re working on it. Sorry it’s not ready yet.
That repo is from 9 years ago. It almost assuredly wouldn’t work on modern Android.
We’re looking at a lot of things. We’re hoping our solution is as elegant as possible and really enables a ton of professional tools and awesome content.
Until then, we do make the Leia SDK available. So anyone can take a swing at making their own solution to this.
We have an internal tool called Go4V that converts from SBS to 4V in real-time.
Unfortunately, that’s not something that we’ve found a way to release as a part of our SDK yet.
For now, what you could do is simply duplicate the left eye and right eye so that each appears twice and you’re streaming stereo content in a 4V format. FFmpeg or other tools can be used to do this.
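For anyone who wants to try that duplication, here’s a minimal sketch of how it could be wired up with FFmpeg, assuming 4V here means a 2x2 quad tiling with the left eye in the first two views — the exact tiling and view order the display expects are assumptions on my part, not something from Leia’s docs:

```python
import subprocess

def sbs_to_quad_filtergraph() -> str:
    """Build an FFmpeg filtergraph that crops an SBS frame into left/right
    halves and tiles each half twice into a 2x2 quad frame."""
    return (
        "[0:v]split[a][b];"                        # feed the frame to two chains
        "[a]crop=iw/2:ih:0:0,split[l0][l1];"       # left half, duplicated
        "[b]crop=iw/2:ih:iw/2:0,split[r0][r1];"    # right half, duplicated
        "[l0][r0]hstack[top];"                     # top row: L R
        "[l1][r1]hstack[bottom];"                  # bottom row: L R
        "[top][bottom]vstack[quad]"                # stack rows into the 2x2 quad
    )

def ffmpeg_command(src: str, dst: str) -> list[str]:
    """Assemble the full FFmpeg invocation (paths are up to you)."""
    return ["ffmpeg", "-i", src,
            "-filter_complex", sbs_to_quad_filtergraph(),
            "-map", "[quad]", dst]
```

You’d pass the resulting argument list to `subprocess.run` with your own input and output paths, e.g. `subprocess.run(ffmpeg_command("in_sbs.mp4", "out_4v.mp4"))`.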
If I can’t do SBS to 4V yet on the tablet, I’ll just modify the shader to do four samples from the single SBS texture; that should be lower bandwidth and quicker in the shader.
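As a sketch of that shader idea: the per-fragment work reduces to a tiny bit of UV math, shown here in Python rather than GLSL. It assumes views 0–1 should sample the left half of the SBS texture and views 2–3 the right half — the real view ordering the panel expects may differ:

```python
def sbs_uv_for_view(u: float, v: float, view: int) -> tuple[float, float]:
    """Map an output UV plus a 4V view index to a sample position inside a
    single side-by-side texture (views 0-1 -> left eye, 2-3 -> right eye;
    that ordering is an assumption)."""
    if not 0 <= view <= 3:
        raise ValueError("4V has views 0..3")
    half = 0 if view < 2 else 1          # 0 = left half, 1 = right half
    return (0.5 * u + 0.5 * half, v)     # squeeze u into the chosen half
```

The same two lines translate directly into a fragment shader: pick the half from the view index the interlacer hands you, then scale and offset `uv.x`.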
Has your team investigated using the Lume Pad as a VR device?
3DoF is trivial, but do you have a stereoscopic SLAM solution for 6DoF in the works?
I’m sure you’ll be happy to hear that Google ARCore is already fully supported.
Lume Pad has 6DoF already!
The only limitation currently comes from Google: we can’t do stereoscopic camera passthrough and horizontal plane detection at the same time. That means when 6DoF is on, you either have a fully virtually rendered scene, or, if you choose camera passthrough, the camera feed will be monoscopic.
You should be able to play SteamVR games on the Lume Pad with this.
I already have a VR driver to deliver the texture to the streamer, which means it’s mostly just input problems left to solve, but I have a framework in place for that too.
I’d also love to see this streaming setup. Theoretically, you could pipe video data through to MPV, but WebRTC seems to be the much more sensible solution.
The image interlacing shader is available in a couple of my forked repos on GitHub:
Are you aware of the special URL you can navigate to from a browser to flip the screen into 4V mode?
The only issue I could foresee with a browser implementation vs. streaming through to a native app is that I was having a hard time getting Chrome’s fullscreen mode to line up with the shader.
It’s like there is some kind of scaling or cropping applied that causes the interlaced image to not line up exactly with the desired screen pixels, which results in a fuzzy mess.
I’m sure it can be ironed out, or maybe it isn’t an issue, but that’s a wall I hit when diving into WebGL and three.js demos.
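One way to reason about that fuzziness: interlacing only works when canvas pixels map 1:1 onto the panel’s physical pixels, so any fractional scaling introduced by fullscreen mode or `devicePixelRatio` forces the browser to resample, smearing the per-pixel view pattern. A hypothetical sanity check (the helper name and the example numbers are made up for illustration):

```python
def is_interlacing_safe(css_width: int, device_pixel_ratio: float,
                        panel_width: int) -> bool:
    """Return True only if the canvas backing store would land exactly on
    the panel's native horizontal resolution."""
    backing = css_width * device_pixel_ratio
    # A fractional backing size, or any mismatch with the panel, means the
    # browser resamples the canvas and the interlaced pattern no longer
    # hits whole device pixels.
    return backing.is_integer() and int(backing) == panel_width
```

In three.js terms this is roughly the check you’d want before calling `setPixelRatio`/`setSize`: if it fails, the interlaced output can’t line up no matter how correct the shader is.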
My guess is piping the left and right eye views into an interlaced output, which would allow the user to spectate what a VR camera sees in the game.
Using something like ReShade, it might even be possible to fudge 4V views in games with easily accessible geometry or depth buffers.
This could even be augmented with, say, a Valve tracking puck on the back of the tablet, letting you use the tablet like a virtual camera into a realtime space (WebGL, Unity, Unreal).
Some kind of inside-out tracking fused with the device gyro could also allow this to be done sans external tracking. I know you can do this with Apple’s ARKit, but I haven’t looked into what the SOTA is for Android AR APIs.
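As a toy illustration of that kind of fusion, a one-axis complementary filter blends a fast-but-drifting gyro integral with a slow, drift-free tracker estimate. Everything here — the function name, the blend constant, the single-axis simplification — is hypothetical, just to show the shape of the idea:

```python
def fuse_yaw(prev_yaw: float, gyro_rate: float, dt: float,
             tracker_yaw: float, alpha: float = 0.98) -> float:
    """Complementary filter for one rotation axis: trust the gyro over short
    timescales, let the (slower) tracking estimate pull out the drift."""
    integrated = prev_yaw + gyro_rate * dt                  # fast but drifts
    return alpha * integrated + (1 - alpha) * tracker_yaw   # slow correction
```

Real inside-out tracking (ARCore, ARKit) does far more than this, but the gyro-plus-correction structure is the same basic trade-off.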
It is a way to connect an Android VR headset wirelessly to a PC.
I’m not sure how it works, but it seems there is a possibility of using the Lume Pad as a stereoscopic display?
Or could a wired method work like what Oculus uses, where you can use an Android device as a display with just a USB cable, without the need for an HDMI input?
Just a +1 on this request. Currently I create VR180 content, but my main view derives from a stereoscopic 3D SBS comp in After Effects. At home I can use a TV set, but on the go I have no way to continue working on those projects. The Lume Pad could be the perfect monitor, and honestly glasses-free could even beat using a TV; I went through about three TVs before finally landing on a passive set because it was the most hassle-free. If you do find a way to make the Lume Pad a monitor via USB-C, you could change my workflow entirely. The lightfield approach is amazing.