@Nima I’m trying to figure out how I could connect two external cameras to form a stereo pair for my project, using the Lume Pad as a real-time stereo monitor. I have installed Unity and successfully connected Unity and the Lume Pad as a secondary screen, thinking that a solution might be to connect the cameras through Unity, but I’m really not experienced enough to know if this is the best way to start. Any suggestions? Is there already something I could use?
Thanks in advance
Hey @m.farina,
I don’t know that Unity is the best way to build an app like this. I’d probably recommend making a native Android app using our Native SDK.
Either way, you’ll have to capture the two video feeds, merge them into an SBS, then use the Leia MediaSDK and ExoPlayer to decode the 2x1 stream into a 4V Lightfield.
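In Unity terms (which the rest of this thread ends up using), the "merge into SBS" step amounts to copying each eye's pixels into one double-width texture. A minimal sketch, assuming two equal-sized, already-playing WebCamTexture feeds (the method and parameter names here are just illustrative):

```csharp
using UnityEngine;

// Sketch only: combine two equal-sized camera feeds into one 2x1 side-by-side texture.
// Assumes this runs inside a MonoBehaviour and both feeds are already playing.
Texture2D MakeSbs(WebCamTexture leftEye, WebCamTexture rightEye)
{
    Texture2D sbs = new Texture2D(leftEye.width * 2, leftEye.height, TextureFormat.RGB24, false);
    sbs.SetPixels(0, 0, leftEye.width, leftEye.height, leftEye.GetPixels());                 // left half
    sbs.SetPixels(leftEye.width, 0, rightEye.width, rightEye.height, rightEye.GetPixels());  // right half
    sbs.Apply(); // upload the combined frame to the GPU
    return sbs;
}
```

This is essentially what the full script later in the thread does every frame inside a coroutine.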
Nima
Dear Nima,
I have progressed a bit with Unity (to avoid being confined to Android, I decided to try this strategy). So far I have succeeded in several steps, namely capturing multiple video feeds that are shown on two RawImages in a canvas, and using the Leia library to connect the Lume Pad and see 3D features thanks to Leia Remote. Now I’m stuck with the last step: I have inserted a LeiaMediaViewer in the scene and I can use it correctly (e.g. if I load a static SBS image, it shows correctly in 3D on the Lume Pad), but I need to render the stream no longer in the RawImages but in the texture of the LeiaMediaViewer. My ignorance of C# (I’m a Delphi user… so Pascal oriented…) makes the task difficult. Any suggestion would help.
Hi m.farina,
I am not very familiar with RawImage but if I understand correctly,
- RawImage has a .mainTexture property of type Texture.
- the LeiaMediaViewer has a method SetTexture(Texture2D texture, int rows, int columns).
To try to hack this together, you might be able to move the pixels from your RawImage.mainTexture into a Texture2D, then call LeiaMediaViewer.SetTexture(Texture2D, int, int):
// Texture2D to apply to LeiaMediaViewer
Texture2D texture2D = new Texture2D(myRawImage.mainTexture.width, myRawImage.mainTexture.height, TextureFormat.RGBA32, false);
// cache ref to previously active RT; we will have to re-set it after ReadPixels operation
RenderTexture currentRT = RenderTexture.active;
RenderTexture renderTexture = new RenderTexture(myRawImage.mainTexture.width, myRawImage.mainTexture.height, 32);
Graphics.Blit(myRawImage.mainTexture, renderTexture);
// stack on the new renderTexture with the pixels that were blitted into it
RenderTexture.active = renderTexture;
// read pixels from active RT into the texture2D
texture2D.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
texture2D.Apply();
// pop off the active RT, revert back to cached previous RT
RenderTexture.active = currentRT;
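One detail the snippet above leaves implicit is the final call itself. Going by the SetTexture(Texture2D texture, int rows, int columns) signature listed earlier, a 2x1 side-by-side frame would presumably be passed with 1 row and 2 columns (variable names are illustrative):

```csharp
// 2x1 SBS layout: 1 row, 2 columns
leiaMediaViewer.SetTexture(texture2D, 1, 2);
```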
Thanks to your help, I’ve been able to finalize the program (really, thank you very much!) and it works fine: at the moment I have two external cameras attached to the PC (which I use as an SBS cam) and I’m able to see the corresponding 3D video stream correctly on the Lume Pad, where Unity Remote is running. So far, so good.
The final step would be either 1) to have the app run directly on the Lume Pad, or 2) to run it from the PC but without needing Unity (i.e. as compiled software).
Against possibility 1), even though I have correctly built the app for Android, the problem seems to be that if I attach a webcam to the Lume Pad, it is not detected; not to mention the problem of having just one USB connector (and using a USB hub for two webcams, once a single one is detected, is not trivial). So the question: is this a problem of the specific webcam I’m using, or will USB generally not work for external cams on the Lume Pad?
Anyway, possibility 2) would be fine for my use; however, at the moment I’m relying on Unity Remote which, in turn, assumes that I’m running the app from the Unity Editor. In your opinion, is it possible to use Unity Remote, or something similar, without needing the Editor? I haven’t found an answer on the web.
Hey @m.farina,
There are actually a couple of stereo webcams that show up as a single SBS stream when plugged into devices, and the Lume Pad should work with Android-compatible external webcams in apps that expressly support them.
That said, your specific situation probably has a more elegant solution, such as the ability to use Lume Pad directly as an external monitor for your PC. But it may take us a while to release a solution for something like that.
For now I’d recommend you try to experiment with the SBS webcams and let us know how that goes. There should be a couple of models on AliExpress.
Here are some examples:
Nima
@Nima Thank you very much! It is a good suggestion, even if, since I have to plug the webcams into optical microscopes (I will use this system as an aid for landing a modified Atomic Force Microscope that I use as a Microwave Microscope), I will probably have to use specific webcams (right now I’m just playing with a couple of very cheap webcams on the weekends). But if the problem is just the kind of webcam, I will solve it with a fast USB hub and appropriate webcams. It is a pity that Unity does not support a kind of Unity Remote without the Editor, since that way the system is great fun (one can address the cameras of the Lume Pad and the ones plugged into the PC at the same time). Later I will also have to work out how to integrate the Unity software with my microwave microscope software that I wrote in Embarcadero Delphi… but that will be another story.
Marco
I would add that I have actually found some links reporting a fundamental problem with Android 10: USB cameras connected to an Android 10 device do not work.
I don’t know if this could affect Lume Pad.
Hey @m.farina,
I don’t believe we’ve done any testing yet, but maybe I can ask our firmware team to confirm one way or another. If you have any USB-C to USB-A converters, you could test and let us know too.
Nima
Ciao @Nima: sure. I have tried these cameras from Amazon https://www.amazon.it/dp/B087JBLS12?ref=ppx_pop_mob_ap_share with the USB-A/USB-A adapter included with the Lume Pad, and it did not work (in the connection only “charge” is available). I have contacted their support, but they seem quite confused (they stated it is only for PC, but in the brochure they mention Android). Then I tested an old Hercules Deluxe webcam, with the same result. However, in both cases I cannot exclude that the webcam is the issue. Most webcams have quite fuzzy specs…
Marco
Hey @Nima, I can confirm that the Lume Pad works with an external webcam. One of the apps found on the Play Store, after many others failed, correctly detected one of the cams bought on Amazon that I linked above. So apparently the problems are only app-dependent and not a problem of the Lume Pad, even if in the Settings the webcam simply appears as a device that can only be charged.
Just in case someone needs it, below is the script to connect two external webcams. At the moment it works fine with Unity Remote and webcams attached to the PC. Of course I have not handled possible exceptions, so it is not “nice” code, but it works.
//_________________________________________
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using LeiaLoft;
// simple system to cast a 3D stream from two webcams to a Lume Pad / Hydrogen One — M. Farina
// It assumes that the project, correctly set up with the Leia SDK, has a canvas and a LeiaMediaViewer.
// The script has to be attached to a component, e.g. the main camera; then drag-and-drop the canvas’ RawImage
// into the rawCamImage property of the SBSCameraRender script.
public class SBSCameraRender : MonoBehaviour
{
Texture2D texture;
private LeiaMediaViewer leiamediaviewer;
WebCamTexture wctOne;
WebCamTexture wctTwo;
public int numofcameras;
public RawImage rawCamImage;
private Texture2D texture2D;
private RenderTexture renderTexture;
private GameObject InstScript;
private int index;
private string s;
public void Awake()
{
InstScript = GameObject.Find("LeiaMediaViewer"); // assign the class field; don't shadow it with a local
leiamediaviewer = InstScript.GetComponent<LeiaMediaViewer>();
}
void Start()
{
WebCamDevice[] devices = WebCamTexture.devices;
numofcameras = devices.Length;
if (numofcameras >= 3) // the actual condition we need: we have to sort the cameras; we assume the ones used have "webcam" in their name
{
index = 0;
s = devices[index].name;
while (!s.Contains("webcam") && index < numofcameras - 1)
{
index++;
s = devices[index].name;
}
wctOne = new WebCamTexture(devices[index].name, Screen.width / 2, Screen.height, 30);
index = index + 1;
s = devices[index].name;
while (!s.Contains("webcam") && index < numofcameras - 1)
{
index++;
s = devices[index].name;
}
wctTwo = new WebCamTexture(devices[index].name, Screen.width / 2, Screen.height, 30);
}
else //just to test the software when only integrated cam is being used
{
wctOne = new WebCamTexture(devices[0].name, Screen.width, Screen.height, 30);
if (numofcameras == 2)
{ wctTwo = new WebCamTexture(devices[1].name, Screen.width, Screen.height, 30); }
else
{ wctTwo = wctOne; }
}
wctOne.Play();
wctTwo.Play();
texture = new Texture2D(2 * wctOne.width, wctOne.height, TextureFormat.RGB24, false);
rawCamImage.texture = texture;
rawCamImage.material.mainTexture = texture;
rawCamImage.enabled = false;
texture2D = new Texture2D(rawCamImage.mainTexture.width, rawCamImage.mainTexture.height, TextureFormat.RGBA32, false);
renderTexture = new RenderTexture(rawCamImage.mainTexture.width, rawCamImage.mainTexture.height, 32);
StartCoroutine(StartCams());
}
IEnumerator StartCams()
{
while (wctOne.width < 100 || wctTwo.width < 100)
//to handle the known delay of Unity in providing correct info
yield return null;
while (true)
{
texture.SetPixels(0, 0, wctOne.width, wctOne.height, wctOne.GetPixels());
texture.SetPixels(wctOne.width, 0, wctTwo.width, wctTwo.height, wctTwo.GetPixels());
texture.name = "render_full_2x1";
texture.Apply();
// next part from Leia
// cache ref to previously active RT; we will have to re-set it after ReadPixels operation
RenderTexture currentRT = RenderTexture.active;
Graphics.Blit(rawCamImage.mainTexture, renderTexture);
// stack on the new renderTexture with the pixels that were blitted into it
RenderTexture.active = renderTexture;
// read pixels from active RT into the texture2D
texture2D.name = "render_full_2x1";
texture2D.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
texture2D.Apply();
leiamediaviewer.SetTexture(texture2D, 1, 2);
// pop off the active RT, revert back to cached previous RT
RenderTexture.active = currentRT;
yield return null;
}
}
}
Apparently the list “WebCamTexture.devices”, which when running in Unity on the PC includes every available camera, does not include external webcams once built for Android, at least in my version of Unity (2019.4.11). If there were a simple way to “enrich” this list with the external cameras, this script would be able to run directly on the Lume Pad or Hydrogen One. My ignorance of Unity stopped me at this point, and looking on the web it seems it is not simple to solve. Any suggestions…
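A quick way to check which cameras the Android build actually exposes is to log WebCamTexture.devices on the device itself (readable via adb logcat). A minimal diagnostic sketch, where the class name is just an example:

```csharp
using UnityEngine;

// Hypothetical helper: logs every camera Unity can see on the current platform.
public class CameraLister : MonoBehaviour
{
    void Start()
    {
        foreach (WebCamDevice d in WebCamTexture.devices)
        {
            Debug.Log("Camera found: " + d.name + (d.isFrontFacing ? " (front)" : ""));
        }
    }
}
```

If an external UVC webcam never shows up in this list on-device, the limitation is in what Unity's WebCamTexture enumerates on Android rather than in this script.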
This would be a great feature if it were standalone, to allow for greater stereo separation on landscape shots!
I’ll look into this. Maybe I can find a way to expose them to the Unity side
or a standalone way to do this
That would be wonderful!!
It would be great to have it as a standalone app… this could also be used for FPV in robotics and RC hobbies… I was using the StereoPi solution, which could maybe help with your app design. They have nice open-source code to combine two external Raspberry Pi cameras into SBS.
That’s really interesting. After I shared the code, I stopped working on it (too busy…), since I had spent some time looking for a simple way to see two USB cameras and hadn’t found one (actually there is the possibility of purchasing some code, but I’m not sure about the quality). I hope to have time to play with the coding again in the near future.