real-time monocular depth estimation

I’m making a monocular depth estimation app for autostereoscopic viewing like this. However, when I built this app for LumePad, the load was too high and it stopped after only one frame. You have a very nice technique for local depth estimation and consistent depth on Android with LeiaPlayer and LeiaTube. I want to incorporate your real-time depth estimation technology into my app. Is that possible? If not, I would like more information about your depth estimation model. Is the model published as open source, or is it your own in-house research model?


Hey @acidys230! Thanks for coming by the forum! We’ve passed along your questions to the team. What type of app are you building? Cheers, Marlon - Community Manager


If this is for game content or real-time 3D in an engine like Unity, you can use the color + depth shader I wrote and open sourced. Basically, you just need access to the regular back buffer and the depth buffer, and it can produce a stereo image pretty fast (around 1 ms). The code is for anaglyph 3D in Godot, but it would not be hard to port to other engines.
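For anyone curious what the color + depth → stereo step looks like in principle, it's essentially depth-image-based rendering: shift each pixel horizontally by a disparity derived from its depth. Here's a minimal NumPy sketch of the idea (the function name, the linear depth-to-disparity mapping, and the toy inputs are my own illustration, not the actual shader code):

```python
import numpy as np

def stereo_from_color_depth(color, depth, max_disparity=16):
    """Synthesize a left/right stereo pair from one color image plus depth.

    color: (H, W, 3) uint8 image; depth: (H, W) float in [0, 1]
    (0 = near, 1 = far). Near pixels get the largest horizontal shift.
    This is a CPU forward-warping sketch of what a shader does per pixel.
    """
    h, w = depth.shape
    # Linear depth -> disparity mapping (an assumption for this demo).
    disparity = ((1.0 - depth) * max_disparity).astype(np.int32)

    left = np.zeros_like(color)
    right = np.zeros_like(color)
    cols = np.arange(w)
    for y in range(h):
        d = disparity[y]
        # Shift each pixel by half its disparity in opposite directions.
        xl = np.clip(cols + d // 2, 0, w - 1)
        xr = np.clip(cols - d // 2, 0, w - 1)
        left[y, xl] = color[y, cols]
        right[y, xr] = color[y, cols]
    return left, right

# Tiny demo: a 4x8 image whose right half is "near" (depth 0) and shifts.
color = np.arange(4 * 8 * 3, dtype=np.uint8).reshape(4, 8, 3)
depth = np.ones((4, 8))
depth[:, 4:] = 0.0
left, right = stereo_from_color_depth(color, depth, max_disparity=4)
print(left.shape, right.shape)
```

A real shader version also has to fill the disocclusion holes that forward warping leaves behind; this sketch just leaves them black, but it shows why the technique is so cheap: it's one texture fetch and one shifted write per pixel, no depth network involved.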


Hey Andres! Welcome! Thanks for chiming in here! Maybe this can help @acidys230? Would be fun to hack together sometime. LumePad2 Hackathon anyone??? :wink:

Oops, I forgot to put a link.

I’m creating my app with Unity, and I’m excited about your company’s depth estimation technology.
I would also be very happy if there were a way in Unity to access the camera’s color map and depth map.


That looks really cool.