The Leia Tablet

I’m happy that Leia is working on a new hardware device after the Red Hydrogen.

However, I’m concerned that the cameras will again be set too close together, which will limit the human-scale 3D effect. Let’s face it, there are way too many pictures of food on Holopix. The key difference between a tablet and a phone-sized device is, well, the size: a tablet could easily allow camera placement that emulates the interocular distance of the average person. I am a landscape photographer, and I’ve always struggled to shoot landscape shots that actually feel 3D.

Any thoughts?

1 Like

A phone-sized device has sufficient “room” to provide better/wider/more appropriate interaxial spacing, though it might make the device harder to grip (ref: Fuji W3).

1 Like

Not only that: if they compute disparity maps instead of capturing real stereo pairs, it’s crazy to use such a small lens separation. With a really wide separation, say 10 cm, you can compute depth easily for far objects, and also for nearer ones, and then correct the stereo base so you avoid excessive depth even when an object is so close that it only appears partially in the other lens (unless you’re making a macro photo, of course).
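To make the point above concrete, here is a minimal sketch of standard rectified-pinhole stereo geometry (textbook math, not Leia’s actual pipeline): depth is inversely proportional to disparity, so a wider baseline produces larger disparities for distant objects and lets depth be resolved much further away.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in metres for a pixel of a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# With a 10 cm baseline and f = 1000 px, an object at 1 px disparity sits
# around 100 m away; with a 1 cm baseline the same object would produce
# only 0.1 px of disparity, below typical matching precision.
far = depth_from_disparity(1000, 0.10, 1.0)
```

This is why the 10 cm figure in the post matters: the wider the baseline, the further out depth stays measurable, after which the effective stereo base can be reduced in software for comfortable viewing.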

Great point, Stefan. Thanks for asking (and appreciate the kind words about our upcoming Lume Pad)!

To answer your question, we’ve got to first look at the design and go from there.

OK, let’s talk about camera (and other component) placement. It’s kind of like a game of Tetris getting all the pieces into place to create this high-end tablet. We’ve got to make room for everything – including our 10.8 inch display – and still balance out the thermal management. That’s why we’re laying out the camera the way you see [here].

Leia has developed disparity guidelines for the 3D Lightfield content to look good on the display. Without getting too into the weeds, the rule of thumb is that the optimal stereo camera baseline to record content at distance d is about d/100.
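The d/100 rule of thumb quoted above can be sketched in a couple of lines (the function names here are illustrative, not Leia’s API):

```python
def baseline_for_distance(distance_m: float) -> float:
    """Rule-of-thumb stereo baseline (metres) for a subject at distance_m."""
    return distance_m / 100.0

def distance_for_baseline(baseline_m: float) -> float:
    """Subject distance (metres) at which a given baseline is about optimal."""
    return baseline_m * 100.0

# A 15 mm baseline is therefore best suited to subjects around 1.5 m away,
# i.e. typical portrait / pet / food distance.
d = distance_for_baseline(0.015)
```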

Most commonly, people are taking portrait shots, pictures of pets and (your favorite :wink:) food. In those cases, the 15mm baseline is pretty ideal.

Now, let’s assume for a second that the board design allowed us to go wider, say to the human interocular distance of 63mm. The optimal recording distance would then be about 6.3m (~20ft).

Then all those portrait and food shots would have roughly 4x too much disparity. We use computer vision tricks to synthesize views at a comfortable disparity level, but at that 4x level you start seeing artifacts and things can get messy.
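Working the numbers in the two paragraphs above through the d/100 rule (plain arithmetic, using the figures from the post):

```python
# Typical portrait / food shot distance and its rule-of-thumb baseline.
portrait_distance_m = 1.5
ideal_baseline_m = portrait_distance_m / 100      # 0.015 m = 15 mm

# Human interocular spacing and the distance at which it would be optimal.
human_interocular_m = 0.063                       # 63 mm
optimal_distance_m = human_interocular_m * 100    # ~6.3 m (~20 ft)

# At portrait distance, a 63 mm baseline exceeds the ideal one by:
mismatch = human_interocular_m / ideal_baseline_m  # ~4.2, i.e. roughly "4x"
```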

NOW FOR THE GOOD NEWS: Leia has developed several techniques to record landscapes with a very wide effective baseline. We’re talking orbiting-drone aerial shots producing stunning results… but we’re going to have to leave it at that tease for now. We will have more to reveal very soon on that front.

And, yes, you always have the option to use your favorite 3D recording device and visualize the content on the pad, as we are fully compatible with side-by-side (SBS) format, both for photos and videos.

Hope that helps address your concern and that you’re able to join the Lume Pad family.

Again, thanks for the comment!

-DarrenG from the Leia team

5 Likes

So we’ll be able to make a single horizontal sweep, like manually taking two photos for a stereo pair but in one movement, and without the defects you get when something moves between one shot and the other?

@Kano3D we won’t correct for objects moving between frames, but the technique tolerates some motion – see the example I posted on the FB user group last week…

2 Likes

This is amazing!

2 Likes

@Kano3D yeah it’s freaking cool – you can do the same with 360° product shots. We are offering a certification process to teach photography studios how to produce this content and serve the growing industry needs in retail, hospitality, digital marketing, etc…

3 Likes