This might be obvious for some, but as someone new to 3D displays it took me some research to figure out. It was this post from Nima which got me on the right path:
All of the VR180 footage from my Canon had severe ghosting when displayed on the Lumepad 2, despite looking fine in my VR headset. Turning the black levels up a bit completely solved it for me!
In my experience, ghosting shows up in two situations:
Black levels are too dark, causing content to bleed into the feed meant for the other eye.
When there’s a large amount of disparity, or 3D depth. This is completely normal; it’s how we experience the world through our two eyes. When our eyes aren’t focused on an object, we see double.
Turning up the black levels solves #1, and being careful about how I shoot solves #2. Keeping objects from getting too close to the camera greatly reduces #2. That’s easy with the built-in Lumepad 2 cameras, and it’s not too hard even with the Canon fisheye VR180 lens.
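The advice about keeping subjects away from the camera can be made concrete with toy numbers. For a parallel stereo rig, the on-sensor disparity of a point at depth Z, relative to the convergence depth Zc, is roughly f · B · (1/Z − 1/Zc). The focal length, baseline, and depths below are illustrative values (not Canon or Lumepad specs), but they show why a close subject blows up disparity:

```python
# Toy illustration: why close subjects produce large disparity.
# Approximate on-sensor disparity for a parallel stereo rig, relative
# to the convergence depth. All numbers are illustrative assumptions.
def disparity_mm(f_mm, baseline_mm, z_mm, z_conv_mm):
    """Disparity (mm on sensor) of a point at depth z_mm when the rig
    converges at z_conv_mm. Positive = in front of the screen plane."""
    return f_mm * baseline_mm * (1.0 / z_mm - 1.0 / z_conv_mm)

near = disparity_mm(5.0, 60.0, 300.0, 2000.0)   # subject 0.3 m away
far = disparity_mm(5.0, 60.0, 1500.0, 2000.0)   # subject 1.5 m away
# The 0.3 m subject has roughly 17x the disparity of the 1.5 m one,
# which is what pushes ghosting past the comfortable range.
```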
Feature request: implement an anti-crosstalk (ghosting) feature that optimally raises the black levels for us. That would be easier than re-encoding all of my content, and the Lumepad itself should be better than I am at determining how much of a change is needed to avoid ghosting.
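For anyone who does want to re-encode in the meantime, here is a minimal sketch of the black-level lift, assuming a NumPy/Pillow pipeline: remap pixel values from [0, 255] onto [floor, 255] so no region sits at pure black, where lenticular crosstalk is most visible. The floor of 16 ("video black") is an illustrative choice, not the Lumepad’s actual algorithm.

```python
# Sketch: lift the black floor of a stereo image so dark regions no
# longer sit at pure black. The floor value of 16 is an assumption.
import numpy as np
from PIL import Image

def lift_black_level(img: Image.Image, floor: int = 16) -> Image.Image:
    """Linearly remap pixel values from [0, 255] onto [floor, 255]."""
    arr = np.asarray(img).astype(np.float32)
    arr = floor + arr * (255 - floor) / 255.0
    return Image.fromarray(arr.astype(np.uint8))

# Demo on a synthetic all-black frame; a real SBS photo would be opened
# with Image.open() instead.
lifted = lift_black_level(Image.fromarray(np.zeros((4, 4, 3), dtype=np.uint8)))
```

Because the remap is linear, full white stays at 255 and only the shadows are compressed, which matches the "turn the black levels up a bit" fix described above.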
Yeah, the ghosting mainly shows up in high-contrast scenes. It’s not terrible, but there is a good amount of crosstalk. Does anyone know an autostereoscopic display with less crosstalk?
I’m reopening this old thread since I hadn’t played much with the LP2, but I’ve recently loaded a lot of my 3D content onto the device and also converted some 2D pictures. I have to say there is often annoying ghosting on the Lumepad 2, which can also make details flicker over time. Of course it’s worse with high-disparity images, but it seems to affect even images that the Lumepad 2 itself converted to 3D (which aren’t that deep), or that it shot. In fact, while on the old Lumepad and the RH1 you have to move around to find the sweet spot, I somehow find it easier to see 3D content on that older technology in general, even though the LP2’s resolution generally makes images better and more visible. I’m not sure the causes are the ones described here, and I wonder if there’s some other possibility. Is there any manual adjustment that can be made to the eye-tracking system?
According to all the information I have read on this, EVERY autostereoscopic display has this problem.
That’s why Nintendo put an adjustment slider on the 3DS to help resolve the issue. BUT it didn’t work too well, the 3DS XL was even worse, and it failed.
EVERY movie, with the exception of a few, has high disparity.
Disparity = depth: the more depth you’ve got, the greater the disparity,
and it can’t be adjusted on the fly.
The only programs that can do this are studio-based,
programs like Nuke, PFTrack, Resolve Studio’s Fusion, and maybe Maya. You might be able to use AE, but I’m not sure about that.
These programs allow the disparity to be re-adjusted, but it’s NOT as simple as just loading the footage in and doing it.
Plus Leia Inc is still WAY behind in software tech. There are loads of WAY better FREE options for 3D conversion, and a projector is still a far better choice for 3D viewing than the hardware they sell, since you don’t have to deal with any crosstalk.
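For what it’s worth, the 3DS-style slider amounts to a uniform horizontal image translation, and a crude version of that can be done on any side-by-side frame outside of the studio packages. The sketch below, assuming a NumPy H×2W×C array, shifts every point’s disparity by a fixed number of pixels, which moves the whole scene forward or back in depth; it genuinely cannot compress the disparity *range*, which is the part that needs the depth-aware tools named above.

```python
# Sketch: crude "convergence" adjustment on a side-by-side frame via
# horizontal image translation. Shifts all disparities by `px` pixels
# (moving the whole scene in depth); it cannot reduce the disparity
# RANGE between near and far objects.
import numpy as np

def shift_convergence(sbs: np.ndarray, px: int) -> np.ndarray:
    """sbs: H x 2W x C side-by-side frame; returns H x 2(W-px) x C."""
    h, w2, c = sbs.shape
    w = w2 // 2
    left, right = sbs[:, :w], sbs[:, w:]
    left_s = left[:, px:]          # crop left eye's left edge
    right_s = right[:, : w - px]   # crop right eye's right edge
    # Every correspondence now sits px pixels closer together.
    return np.concatenate([left_s, right_s], axis=1)

# Demo on a synthetic 4 x 20 frame (two 4 x 10 eyes).
frame = np.zeros((4, 20, 3), dtype=np.uint8)
adjusted = shift_convergence(frame, 2)
```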
They all use either active shutter or passive polarized glasses, which both have cross-talk issues. The vast majority of 3D projectors use active shutter which have higher cross-talk than polarized solutions.
If you don’t know that, then you don’t seem to know much about this topic.
I just moved from the LP1 to the LP2, and crosstalk was the first thing I noticed. So what’s the big difference between them? Obviously the eye-tracking cameras need to see your eyes, so if the room is dark and the tracker can’t tell where you’re looking, that might be a clue. With the LP1 you simply change the orientation (intuitively) and the ghosting goes away, but that doesn’t work on the LP2 because of the eye tracking - so simply TURN ON A LIGHT and the camera will find your eyeballs. I don’t know who remembers the Fujifilm W3 camera - it came with a hardware adjustment button to control the display - maybe that’s what the LP3 needs? Just a thought - I’m getting old, so I sometimes doubt myself. Any thoughts, my friends?
Well, as a matter of fact the light is on, so tracking is not the problem, since other 3D pictures look pretty good - especially those shot on the H1, which don’t have huge disparity. Also, I can’t seem to get pictures from the Lumepad 2 that are perfectly in focus (I mean 3D focus), at least with autofocus enabled: the autofocus seems to set the 2D focus correctly, but most of the time the resulting picture can’t be viewed properly (huge ghosting) unless I adjust the 3D focus after shooting - and even then, part of the background still shows quite significant ghosting. It’s a pity, and I wonder if there could be some hardware or software issue in my unit, since apparently everyone else is very happy with the pictures shot by the Lumepad 2. In terms of color, definition, and sensitivity to light I’m very happy, but as 3D pictures they always need some post-processing. It’s better when I set the focus manually before shooting.
Edit: I have just noticed that if I view a picture shot on my Lumepad 2 in Stereo mode on my Hydrogen One, the left-eye view shows a ghost of the right eye, and vice versa.
Good question; I still have to try that and will update soon. Yesterday I ran an experiment: I shot a fairly high-disparity object that showed huge ghosting when viewed. I exported it as SBS, and both versions - LIF and SBS - showed large, almost identical ghosting when viewed on either the H1 or the LP2. Then I edited both in LeiaPlayer to adjust and reduce the ghosting on the main subject (leaving some acceptable ghosting only in the background), both becoming LIF. In the end, the edited LIF still had some ghosting on the main subject, while the edited SBS (saved as a new LIF) was much better. I’m attaching the unedited SBS and LIF original images.
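To check per-eye leakage like the edit above describes (seeing a ghost of the other eye in each view), it helps to split an SBS export into separate per-eye images and flip between them. A small Pillow sketch, assuming the file is a standard half-per-side SBS frame (a LIF would first need exporting to SBS, e.g. from LeiaPlayer):

```python
# Sketch: split a side-by-side stereo frame into separate per-eye
# images so crosstalk can be inspected one eye at a time.
from PIL import Image

def split_sbs(sbs: Image.Image):
    """Return (left, right) halves of a side-by-side stereo frame."""
    w, h = sbs.size
    return sbs.crop((0, 0, w // 2, h)), sbs.crop((w // 2, 0, w, h))

# Demo on a synthetic 8 x 4 frame; a real export would use
# Image.open("shot_sbs.jpg") and then save each half with .save().
demo = Image.new("RGB", (8, 4))
left, right = split_sbs(demo)
```

If content from the right eye is already visible in the saved left half, the ghost is baked into the capture; if the halves are clean, the crosstalk is happening on the display side.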