Questions before buying

I don’t have a Lume Pad yet, but I’m thinking about buying one, so perhaps some questions can be answered in advance. I shoot a lot in S3D and already have a Freevi 3D Commander, but that tablet is hardly usable because of Android 4 and its weak specs. Does anyone happen to have both devices?

What is the difference in image quality when viewing S3D MPO images? What can I expect in terms of quality from the Lume Pad when I look at 3D photos? What is the sweet spot like? With the 3D Commander you had to stay very precisely in the middle and not move your head or the tablet much; is that better with the Lume Pad? Can you view MPOs in the normal way with the Lume Player? Can I watch SBS or HOH cinema films on the pad? Does the pad work smoothly and quickly when switching between photos?

Thank you in advance for your answers.

I can only say that I view MPOs from my Fuji Real W3 and they look good. The specs tell you about the viewing angle, and even on that point I can only say it is excellent. Very smooth and powerful, I love it. I haven’t watched a full movie so far, only some trailers, and they are amazing because they look very natural.

How can the Lume Pad achieve good stereo separation when the spacing between the rear-facing cameras is so much less than the average human inter-pupillary distance?

Would be very pleased to hear a detailed technical answer, as this is a foundational question.

Sorry for posting as a reply.

Thanks!

@apiariste:
With its small lens spacing, the Lume Pad camera can capture depth only in close-up photos. It is a good camera for that purpose, but if you want to capture any binocular depth beyond a couple of meters, you will need a camera with lenses spaced farther apart (or use a mono camera and move it between shots).

To put it simply, it’s due to our computer vision algorithms doing Lightfield image synthesis. The disparity between the cameras gives us fantastic data up to about 10 feet away, good data up to 20 feet away, and anything beyond that is mostly estimated using computer vision.

This is only useful, however, when displaying the content in Lightfield. As others have said, if you export the images and display them as raw stereo SBS (where we do no additional work on the image beyond stereo alignment), you’ll only get good 3D from close up, to about 10 feet away at most.
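For a rough sense of why measured parallax runs out around those distances, here is a minimal back-of-the-envelope sketch of pinhole stereo disparity. The baseline and focal-length values are illustrative assumptions, not published Lume Pad specifications, and this is not Leia’s actual pipeline; the only point is that disparity falls off as 1/distance, so a narrow baseline leaves very few pixels of measured parallax past a few metres.

```python
# Back-of-the-envelope sketch: pixel disparity of a stereo pair falls off as
# 1/distance, so a narrow camera baseline leaves little measured parallax
# beyond a few metres. The values below are illustrative assumptions, not
# Lume Pad specifications.

BASELINE_M = 0.02   # assumed lens spacing of roughly 2 cm
FOCAL_PX = 3000     # assumed rear-camera focal length, in pixels

def disparity_px(distance_m: float) -> float:
    """Disparity (in pixels) of a point at the given distance, pinhole model: d = f * B / Z."""
    return FOCAL_PX * BASELINE_M / distance_m

# 10 ft is about 3 m, 20 ft about 6 m.
for z in (0.5, 1.0, 2.0, 3.0, 6.0, 10.0):
    print(f"{z:5.1f} m -> {disparity_px(z):6.1f} px of disparity")
```

With these assumed numbers, a subject at half a metre yields about 120 px of disparity, one at 6 m (roughly 20 ft) only about 10 px, and one at 10 m about 6 px; that steep falloff is why depth beyond that range has to be filled in by estimation rather than measured directly.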

David, Nima—

Good information, thank you.

Wish I had enough physics to rig a mirror adapter so that data could be collected from a baseline greater than the average human inter-pupillary distance (AHIPD), to enable “dollhouse” images of real-world scenes in stereo.

Regards,
Katie