How to convert two 2D video streams to real 3D

I have two network cameras, and I can play their video streams using the RTSP protocol. Both cameras are pointed at the same object, and my goal is to display a real-time 3D image of the observed object on the Lume Pad. How can I achieve this?

It depends on what you’re trying to do. Either way, you’ll need to align, converge, and rectify the two streams into a single 3D image (probably side-by-side, i.e. SBS), in that order. I would recommend doing this on a computer or server before sending the result to the tablet, but it can also be done on the tablet itself.
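
For the composition step, here is a minimal sketch in Kotlin (composeSideBySide and the frame names are my own; it assumes both frames are already aligned, rectified, and the same size):

import android.graphics.Bitmap
import android.graphics.Canvas

// Packs two already-rectified frames into one side-by-side (SBS) bitmap.
fun composeSideBySide(leftFrame: Bitmap, rightFrame: Bitmap): Bitmap {
    val sbs = Bitmap.createBitmap(leftFrame.width * 2, leftFrame.height, Bitmap.Config.ARGB_8888)
    val canvas = Canvas(sbs)
    canvas.drawBitmap(leftFrame, 0f, 0f, null)                          // left eye on the left half
    canvas.drawBitmap(rightFrame, leftFrame.width.toFloat(), 0f, null)  // right eye on the right half
    return sbs
}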

Then you will need to create an app using the Leia SDK that plays that video stream. That’s it: relatively simple apart from constructing the 3D image.

This is my solution, and it’s almost perfect. The app decodes the video from both cameras and combines the two images into one; the SDK then handles the 3D synthesis. However, the image flickers and lags. This is how I’m passing the data to the SDK. Is there a better way to achieve this?

// bitmap is the SBS image composed from the two cameras; the app does this every 15 ms.
val asset = InputViewsAsset.createSurfaceFromLoadedBitmap(bitmap, true)
binding.fullscreenContent.setViewAsset(asset)

I tried InputViewsAsset.createSurfaceForVideo instead of createSurfaceFromLoadedBitmap, and then drew the bitmap onto the surface. It works perfectly now.
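
For anyone following along, the working approach looks roughly like this. It’s only a sketch: I’m inferring the createSurfaceForVideo callback shape from this thread, so check the SDK for the exact signature; the drawing itself uses the standard android.view.Surface API.

import android.graphics.Bitmap
import android.view.Surface

// Create the asset once; the callback (shape assumed from this thread)
// hands back the Surface the SDK will render from.
var videoSurface: Surface? = null
val asset = InputViewsAsset.createSurfaceForVideo { surface ->
    videoSurface = surface
}
binding.fullscreenContent.setViewAsset(asset)

// Then, for every decoded SBS frame:
fun drawFrame(surface: Surface, bitmap: Bitmap) {
    val canvas = surface.lockCanvas(null)   // fails if the surface is not yet valid
    canvas.drawBitmap(bitmap, 0f, 0f, null)
    surface.unlockCanvasAndPost(canvas)     // posts the frame to the display
}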

I have a very similar objective. I use the “USB Camera” Android app on my Android phone with a stereoscopic USB camera connected to it. The USB Camera app then streams RTSP in side-by-side format. You can view the stream using VLC or on a VR headset, but I would like to view it on the Leia Lume in 3D. No “conversion to 3D” would be needed, since the stream is already SBS.
This seems to be a very similar problem to the one you describe, except that where you use two separate cameras, I use a very nifty stereoscopic USB camera (google “Stereo USB Camera”) with a free Android app to create the RTSP stream.
Unfortunately, LeiaTube does not accept rtsp:// URLs. Does this mean I have to write an RTSP reader using the Leia SDK? If so, I would be grateful for a pointer or two on how to go about it. What tools are needed? Is there any example code? Thank you.

I’m sorry for the late reply. My recommended solution is to use InputViewsAsset.createSurfaceForVideo and obtain a Surface inside its callback function. Then use an RTSP player to decode the stream into YUV frames. Finally, convert each YUV frame to RGB (OpenCV’s cv::cvtColor can do this) and draw it onto the Surface.
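
A rough sketch of that conversion step using OpenCV’s Android bindings (Imgproc.cvtColor is the Java/Kotlin counterpart of cv::cvtColor). It assumes the decoder outputs NV21; swap the conversion code for whatever pixel format your player actually produces.

import android.graphics.Bitmap
import org.opencv.android.Utils
import org.opencv.core.CvType
import org.opencv.core.Mat
import org.opencv.imgproc.Imgproc

// Converts one NV21 frame from the decoder into an RGBA bitmap that can be
// drawn onto the Surface.
fun yuvToBitmap(nv21: ByteArray, width: Int, height: Int): Bitmap {
    val yuvMat = Mat(height + height / 2, width, CvType.CV_8UC1)  // NV21 is 1.5 bytes per pixel
    yuvMat.put(0, 0, nv21)
    val rgbaMat = Mat()
    Imgproc.cvtColor(yuvMat, rgbaMat, Imgproc.COLOR_YUV2RGBA_NV21)
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    Utils.matToBitmap(rgbaMat, bitmap)
    return bitmap
}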

Thank you very much for your suggestion.
I am a programmer used to Microsoft Windows, MFC, and so on; however, I have now installed Visual Studio 2022 Community Edition and the Unity SDK. Is that the right toolkit?

Is there any example/starter code for a Leia Lume 2 application that will display in lightfield mode?

Also, I would not know how to go about getting an “RTSP player”. Is such code available as a library?

Thank you for any assistance.

  • If you intend to develop applications for the Lume Pad 2, it is advisable to download Android Studio and the Core Native SDK.
  • The Core Native SDK already includes code examples.
  • You can find many Android RTSP players on GitHub, and many of them are built on FFmpeg. One illustrative playback sketch follows below.
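
As one illustration of the last point (my library choice, not one named above): AndroidX Media3’s RTSP module can decode an rtsp:// URL and render it straight onto a Surface, such as the one obtained from createSurfaceForVideo earlier in this thread. Note that Media3’s RTSP API is currently marked unstable, so you have to opt in.

import android.content.Context
import android.view.Surface
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.exoplayer.rtsp.RtspMediaSource

// Plays an RTSP stream and renders the decoded frames onto the given Surface.
fun playRtspOnSurface(context: Context, url: String, surface: Surface): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    val source = RtspMediaSource.Factory().createMediaSource(MediaItem.fromUri(url))
    player.setMediaSource(source)
    player.setVideoSurface(surface)   // e.g. the Surface from the Leia SDK callback
    player.prepare()
    player.play()
    return player
}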