ALL USERS VOTE: Enable Eye Tracking!

I know the Lume Pad has all the hardware needed for Eye Tracking.

I think it’s a must-have feature. I have the patience to look for the sweet spot, because yes, the Lume Pad suffers from that, but when I hand it to a new person with zero knowledge of 3D, I have to coach them on how to look at the screen, and 80% of the time they will say “my eyes hurt”.

Is eye tracking only for single-person viewing? Yes! Perfect. I could hand anybody the tablet and feel good that they are seeing 3D and NOT double images. Right now they start to swing the tablet left to right like the photo is a GIF or something, saying that famous Lume Pad phrase: “my eyes hurt”.

So everybody flood this forum till we get this feature.

If a Lume Pad employee does not respond and make this feature happen, there is no hope in this company :slight_smile: for real.


I am not a staff member of Leia, but I know for sure that everyone’s IPD is different. That could eventually be taken care of with eye tracking, but as far as I know only in strictly controlled media: since the source, the distance to the object, and therefore the parallax of the individual pictures or video clips vary between sources, this is probably not possible for all the media that’s available right now.

I think setting the parallax slider manually shouldn’t be a problem. Also, a lot of the media has way too much parallax to be viewed comfortably, but that’s in the hands of the creators more than anything else.

Go search for parallax and viewing distances; there are tables for creating 3D content that have held true since the beginning of 3D content over 100 years ago and continue to hold today.

You will never be able to do a great closeup with a Fuji W3 without the macro attachment, for example.
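
To put numbers on that: the classic “1/30 rule” behind most of those tables says the stereo base should be about 1/30th of the distance to the nearest subject. Here’s a rough sketch; the figures are rules of thumb I’m assuming, not exact specs:

```python
# The classic "1/30 rule" of stereo photography: the stereo base
# (camera separation) should be roughly 1/30th of the distance to the
# nearest subject. These are rules of thumb, not exact specs.

W3_BASE_MM = 75.0  # approximate fixed lens separation of the FinePix W3

def min_subject_distance_mm(stereo_base_mm: float, ratio: float = 30.0) -> float:
    """Nearest comfortable subject distance for a given stereo base."""
    return stereo_base_mm * ratio

print(f"W3 nearest comfortable subject: ~{min_subject_distance_mm(W3_BASE_MM) / 1000:.2f} m")
# -> ~2.25 m: anything much closer produces excessive parallax, which
#    is why closeups need a macro attachment that effectively narrows
#    the stereo base.
```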

If you take these measures while creating content, you should be fine with any viewing option available.
Bad media looks bad on any device, no matter which one.


Hey @adigital,

We have a partner SDK to enable eye tracking using the ToF sensor in apps, but for a variety of reasons (performance and heat are big ones) we can’t enable it system-wide.

The Lume Pad has a much wider sweet spot than the vast majority of autostereoscopic devices. It has the absolute widest viewing angle of any 3D tablet.

We’ve demoed Lume Pad to thousands of people at our office, in person, and at conferences, and people always comment on how comfortable it is compared to other 3D they’ve tried. The only people who mention any sort of eye discomfort are people who have had issues with other 3D tech like RealD 3D in movies and VR headsets (I always make sure to ask, and I always hear the same thing). Adding eye tracking will not resolve people’s optometric health issues in the vast majority of scenarios.


Thank you @Nima for your reply.

Do you think a photo-only beta app for eye tracking could come out soon? Maybe photos only would be less taxing than video? Are there any other plans for the ToF sensor on the Lume Pad?

I like the Lume Pad, and 4V mode does make 3D more comfortable. A note, though: 4V does weird stuff to some of my SBS photos, like raising some areas out of flat grass, for example. Any news on better depth maps generated when importing SBS into 4V mode?
I know the SBS photos I import are good 3D because they look fantastic on my Sony 3D TV. Most of them are from the Fujifilm W3.

Trust me, all the people I gave it to do not have health issues. Is there a recommended viewing distance?


Hey @adigital,

I don’t know of any app like that, but your feedback has been heard.

The hard part isn’t the eye tracking in this case, it’s the novel view synthesis. This is one of the few situations where real-time content like games would actually be easier than photos or videos, since accurate information for distant views already exists in a game engine. Photos and videos would require either synthesizing the distant views ahead of time, synthesizing them in real time, or just using pure stereo and throwing away the goal of additional views altogether. That last option causes a major timing issue: switching the location of the view needs to be significantly faster when there isn’t another view waiting to the left or right of the current one for the user to land on.
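
To put rough numbers on that timing issue, here’s a toy sketch. Every figure in it is an assumption for illustration, not a measured spec:

```python
# Toy timing comparison for eye-tracked view switching. All numbers
# are illustrative assumptions, not measured Lume Pad figures.

ZONE_WIDTH_MM = 15.0     # assumed width of one viewing zone at the sweet spot
HEAD_SPEED_MM_S = 100.0  # assumed lateral head speed while viewing

# Time for an eye to cross one zone: the deadline for presenting the
# next correct view before the user sees a wrong/double image.
deadline_ms = ZONE_WIDTH_MM / HEAD_SPEED_MM_S * 1000.0

ROUTE_EXISTING_VIEW_MS = 5.0     # assumed cost to swap in a pre-rendered view
SYNTHESIZE_NEW_VIEW_MS = 200.0   # assumed cost to synthesize a novel view

print(f"deadline per zone crossing: {deadline_ms:.0f} ms")
print(f"multiview (view already rendered): {ROUTE_EXISTING_VIEW_MS:.0f} ms -> fits")
print(f"stereo-only (must synthesize):     {SYNTHESIZE_NEW_VIEW_MS:.0f} ms -> misses")
```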

Generally, Leia’s focus is Lightfield content, as that’s really what we’re best at. Though we’ve recently released some stereo-focused features due to requests from enthusiasts, it’s a very tiny fraction of our user base that uses enthusiast features, so we have to weigh that against what the majority of users will enjoy. But again, your feedback has been heard. When we have the chance, I’ll ensure the team explores what’s possible.

We are always improving our algorithms, so it could get better automatically in the future.

To give some context on what causes issues for the algorithm, there are a few things that can be a problem:

  1. There are alignment issues in the stereo image (even if they’re not visible to a human eye or in a stereoscope, the algorithm works on a pixel level and can “see” things we can’t)

  2. The camera is very different from our ground-truth training data. We’ve primarily trained our algorithms on content shot on RED Hydrogen One, as well as on paid stereo data sources that use a variety of cameras and rigs, but only photos which have been verified as having perfect alignment. If the camera has a wide baseline, or the content has a large amount of disparity, or the content has little to no disparity, these are all situations where our algorithm doesn’t do a great job of estimating correctly.

  3. There are repeating patterns like fence posts or tiles. The algorithm is bad at matching such patterns and makes mistakes because of them.

As mentioned above, not all alignment issues are visible to the eye. The FinePix W3 is actually known for having poor alignment in many cases (it’s very inconsistent between individual cameras; yours may have very good or very poor alignment). Because of this, many W3 users use StereoPhoto Maker to automatically align the MPOs. I personally found a pretty large difference in the quality of how Lume Pad processed MPOs from my W3 when I A/B tested SPM’s alignment, so I recommend you give that a try if you haven’t already. That said, if the content has a LOT of disparity, there’s no easy way for it to be translated into a 4V image with a clean parallax effect without massive view jumps, so when the algorithm tries, it will usually end up flattening the image and creating artifacts. High-disparity content should probably be shown in ST mode or displayed on another 3D display more optimized for it.
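
If you want to sanity-check a pair yourself before importing, here’s a minimal sketch of that kind of pixel-level check using standard OpenCV feature matching. This is not our pipeline or SPM’s algorithm, and the file name is just a placeholder:

```python
# Rough stereo-pair health check: match features between the left and
# right halves of an SBS image, then report vertical misalignment and
# the horizontal disparity range. Requires opencv-python and numpy.
import cv2
import numpy as np

sbs = cv2.imread("pair_sbs.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder file
h, w = sbs.shape
left, right = sbs[:, : w // 2], sbs[:, w // 2 :]

orb = cv2.ORB_create(2000)
kp_l, des_l = orb.detectAndCompute(left, None)
kp_r, des_r = orb.detectAndCompute(right, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des_l, des_r)

dy = np.array([kp_r[m.trainIdx].pt[1] - kp_l[m.queryIdx].pt[1] for m in matches])
dx = np.array([kp_r[m.trainIdx].pt[0] - kp_l[m.queryIdx].pt[0] for m in matches])

# Vertical offsets should be ~0 in a well-aligned pair; a consistent
# non-zero median is the kind of error SPM's auto-alignment corrects.
print(f"median vertical offset: {np.median(dy):.1f} px")
print(f"disparity range (5th-95th pct): "
      f"{np.percentile(dx, 5):.0f} to {np.percentile(dx, 95):.0f} px")
```

A near-zero median vertical offset and a modest disparity range are what you want to see before importing; a very wide disparity range is the high-disparity case described above.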

In the rare cases when people tell me that Lume Pad makes them uncomfortable, I ask them and they admit that they have astigmatism or another eye health issue. If that’s not the case for people you’re demoing to, I’d be curious as to how they feel about other 3D displays, such as polarized 3D glasses-based systems.

What content are you showing them to start? I usually start with the home screen showing the immersive wallpaper, as it’s a very simple effect without much disparity, so people’s eyes can get adjusted before I show them photos, games, or video which might have much more disparity. I don’t show people high-disparity stereo content unless they explicitly ask to see something with more depth, as I’ve found most people can’t handle very high-disparity content (that’s the exclusive realm of enthusiasts like us).

Yes, it depends on your IPD, but in general people should start about 30–36 cm away, at the dead center of the display. From there, it’s much more comfortable to tilt the device to see other view zones and to move it farther back.
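
For the curious, the reason a distance range falls out of IPD is ordinary convergence geometry. Here’s a rough sketch with illustrative numbers (not our specs):

```python
# Standard convergence-angle arithmetic for a popped-out point on a
# stereo display. IPD and the on-screen disparity are assumptions.
import math

IPD_MM = 63.0              # average adult IPD (varies roughly 54-74 mm)
VIEW_DIST_MM = 330.0       # middle of the suggested 30-36 cm range
SCREEN_DISPARITY_MM = 5.0  # assumed crossed disparity of a content element

# Vergence angle to the screen plane vs. to the popped-out point.
screen_angle = 2 * math.degrees(math.atan(IPD_MM / 2 / VIEW_DIST_MM))
point_dist = VIEW_DIST_MM * IPD_MM / (IPD_MM + SCREEN_DISPARITY_MM)
point_angle = 2 * math.degrees(math.atan(IPD_MM / 2 / point_dist))

# The vergence change is what the eyes must do while focus stays on the
# screen; keeping it small (often quoted as under ~1 degree) is what
# "comfortable disparity" means in practice.
print(f"vergence at screen: {screen_angle:.2f} deg")
print(f"vergence at point:  {point_angle:.2f} deg")
print(f"vergence change:    {point_angle - screen_angle:.2f} deg")
```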


I had someone say that to me just the other day, after they bobbed their head left and right repeatedly. They looked like a pigeon in the park! LOL. Try saying to them: “I’ll hold the tablet steady for you; please keep your head still too, and give your brain time to adjust to seeing something it’s never seen before!”

I hope that advice helps. If the dev team is reading this: I don’t see eye tracking as a high priority, but if it is for you, there was a news story I read a while back about a company in Israel developing a 12-view autostereoscopic computer display with eye tracking for all 12 views built in. I wonder if they’d license or share their code to save you a bunch of time and work? I’ll see if I can find the news story on the Wayback Machine. Happy Easter/Passover!
