Lume Pad as a stereoscopic monitor for PC?


Is there currently software that allows the Lume Pad to function as a stereoscopic monitor for a PC, either wirelessly or over USB?

Is there an API to convert any of the HDMI 1.4 standard 3D formats (SBS, etc.) to a format that can be displayed on the Lume Pad in real time, on either the PC side or the Lume Pad side?

For example I’d like to do something similar to:

(PC (3D source image) → (conversion?) → (H.264 encode)) → (Wi-Fi/USB) → (Lume Pad (H.264 decode) → (conversion?) → (display))

I see there’s a Unity Remote, but its performance is very poor, so it’s not a solution for this problem.




The feedback has been heard and the team has been working on a solution to this for quite some time. However, it’s a very large feature and will take us some time, so I don’t have an ETA for you.

HDMI 3D almost always has DRM, so there’s not really a good way to get (for example) 3D games on PlayStation 3 running on Lume Pad.

There has been a solution that the awesome @jakedowns made to play back SBS video streamed from a PC to Lume Pad, though this is also probably not exactly what you’re looking for.

So I guess the official answer is: we’re working on it. Sorry it’s not ready yet.



Thanks for the reply

Have you investigated using the Wi-Fi Display sink API on Android?

Seems like a good fit.

GitHub - ivygroup/miracast-sink: Android miracast sink


It seems like very little work would be needed to make this happen over Wi-Fi?


That repo is from 9 years ago :joy: It almost assuredly wouldn’t work on modern Android.

We’re looking at a lot of things. We’re hoping our solution is as elegant as possible and really enables a ton of professional tools and awesome content.

Until then, we do make the Leia SDK available. So anyone can take a swing at making their own solution to this.


Yeah, after doing some more reading it looks like Miracast isn’t widely supported anymore.

Thanks, I’ll share my solution once I’m able to test on device, probably in a month.



@Nima Hey, I have this mostly working over WebRTC now.

The last major step is the stereo-to-interlaced conversion.

I see that it’s handled in the WebGL SDK as a shader effect, but there’s only a 4-view shader?

Are there docs somewhere that explain how you convert 2x1 to 2x2? That’s really all I’m missing now.



We have an internal tool called Go4V that converts from SBS to 4V in real-time.

Unfortunately, that’s not something that we’ve found a way to release as a part of our SDK yet.

For now, what you could do is simply duplicate the left eye and right eye so that each appears twice, streaming the stereo content in a 4V format. FFmpeg or other tools can be used to do this.
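As a sketch of that duplication step (using NumPy, and assuming a 2x2 quad layout where the top row holds views 0–1 and the bottom row holds views 2–3 — the actual view order Leia’s pipeline expects may differ):

```python
import numpy as np

def sbs_to_quad(frame: np.ndarray) -> np.ndarray:
    """Duplicate the two eyes of an SBS frame into a 2x2 quad (4V) layout.

    Assumes views 0-1 (top row) are the left eye and views 2-3
    (bottom row) are the right eye; that ordering is a guess, not
    Leia's documented layout.
    """
    h, w = frame.shape[:2]
    left, right = frame[:, : w // 2], frame[:, w // 2 :]
    top = np.hstack([left, left])      # views 0 and 1: left eye twice
    bottom = np.hstack([right, right])  # views 2 and 3: right eye twice
    return np.vstack([top, bottom])
```

The same rearrangement can be done in a streaming pipeline with FFmpeg’s `crop`, `hstack`, and `vstack` filters, which avoids decoding to NumPy at all.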


Great, thanks @Nima

If I can’t do SBS to 4V on the tablet yet, I’ll just modify the shader to do four samples from the single SBS texture; that should take less bandwidth and be quicker in the shader.
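The per-view sampling such a modified shader would do can be sketched in Python (assuming views 0–1 map to the left eye and views 2–3 to the right — an assumption about the view ordering, not Leia’s documented behavior):

```python
def sbs_uv_for_view(u: float, v: float, view: int) -> tuple[float, float]:
    """Remap a per-view UV coordinate into a single side-by-side texture.

    Views 0-1 sample the left half and views 2-3 the right half; this
    view-to-eye mapping is an assumption that would need to match the
    Leia shader's actual view layout.
    """
    eye = 0 if view < 2 else 1       # collapse four views onto two eyes
    return 0.5 * u + 0.5 * eye, v    # squeeze u into the chosen half
```

In GLSL this is just a one-line change to the texture coordinate before the existing sample call, so the interlacing logic itself stays untouched.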

Has your team investigated using the Lume Pad as a VR device?

3DoF is trivial, but do you have a stereoscopic SLAM solution for 6DoF in the works?

I’m sure you’ll be happy to hear that Google ARCore is already fully supported :sunglasses:

Lume Pad has 6DoF already!

The only limitation currently is from Google: we can’t do stereoscopic camera passthrough and horizontal plane detection at the same time. That means when 6DoF is on, you have to have a fully virtually rendered scene, or, if you choose to use camera passthrough, the camera feed will be monoscopic.



Should be able to play SteamVR games on the Lume Pad with this.

I already have a VR driver to deliver the texture to the streamer, which means it’s mostly just input problems left to solve, but I have a framework in place for that too.

Exciting! Thanks!


If you get it working I’d love to see it! Good luck, and let me know if we can help in any way.


I’d also love to see this streaming setup. Theoretically, you could pipe video data through to MPV, but WebRTC seems like the much more sensible solution.

The image interlacing shader is available in a couple of my forked repos on GitHub:

Are you aware of the special URL you can navigate to from a browser to flip the screen into 4V mode?

Check my README notes here if you’re interested: mpv-android/ at jakedowns/LitByLeia · jakedowns/mpv-android · GitHub

The only issue I could foresee with a browser implementation versus streaming through to a native app is that I was having a hard time getting Chrome’s fullscreen mode to line up with the shader.
It’s like there’s some kind of scaling or cropping applied that causes the interlaced image to not line up exactly with the desired screen pixels, which results in a fuzzy mess.

I’m sure it can be ironed out, or maybe it isn’t an issue, but that’s a wall I hit when diving into WebGL and three.js demos.

Very interested in this also. How would one go about using VR with the Lume Pad?


My guess is piping the left and right eye views into an interlaced output, which would let the user spectate what a VR camera sees in the game.
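A simplified version of that interleaving step, alternating whole columns between the two eyes (NumPy; the real Leia display interlaces at the subpixel level through its own shader, so this is only an illustration of the idea):

```python
import numpy as np

def column_interlace(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Interleave two eye images column by column.

    A simplified stand-in for a real lenticular interlacing shader,
    which would operate per subpixel rather than per whole column.
    """
    assert left.shape == right.shape, "eye images must match in size"
    out = left.copy()
    out[:, 1::2] = right[:, 1::2]   # odd columns come from the right eye
    return out
```

The actual Lume Pad pattern would have to come from Leia’s own interlacing shader; if the pattern were off by even one column, you would see the kind of fuzzy misalignment described earlier in the thread.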

Using something like ReShade, it might even be possible to fudge 4V views in games with easily accessible geometry or depth buffers.

This could even be augmented with, say, a Valve tracking puck on the back of the tablet, letting you use the tablet like a virtual camera into a realtime space (WebGL, Unity, Unreal).

Some kind of inside-out tracking fused with the device gyro could also allow this to be done sans external tracking. I know you can do this with Apple’s ARKit, but I haven’t looked into what the SOTA is for Android AR APIs.


Is it possible to adapt this for the Lume Pad?

It’s a way to connect an Android VR headset wirelessly to a PC.
I’m not sure how it works, but it seems there’s a possibility of using the Lume Pad as a stereoscopic display?
Or could a wired solution be what Oculus is using?
There, you can use an Android device as a display with just a USB cable, without the need for an HDMI input.


Hi Ash,

Thanks for the question and welcome to the forum :slight_smile:

The answer is yes, it’s possible to adapt ALVR for the Lume Pad.
It’s something we’re exploring internally, but we have no concrete plans at the moment.

There’s just a fair amount of plumbing involved (wireless or tethered), but our Android SDK can be set up to handle inputs coming from any source.

Our SDKs are nimble and modifiable; they can easily support a 2-view input.


Just a +1 on this request. Currently I create VR180 content, but my main view derives from a stereoscopic 3D SBS comp in After Effects. At home I can use a TV set, but on the go I have no way to continue working on those projects. The Lume Pad could be the perfect monitor, and honestly, glasses-free could even beat using a TV; I went through about three TVs before finally landing on a passive set because it was the most hassle-free. If you do find a way to make the Lume Pad a monitor via USB-C, you could change my workflow entirely. The lightfield approach is amazing.
