For future lightfield devices with a built-in 3D camera, it would be really great to be able to take a photo that can then be viewed from any left-to-right angle within a certain range, correctly reflecting the true angle the user is viewing it from.
For example, I could take a photo of my dog straight on, with its face coming out of the screen, and then rotate the device horizontally or shift my own viewing position relative to it. The subject would hold its position and stay in focus, but would be seen from the true angle I am actually viewing it from, while still keeping the same level of depth in and out of the screen.
The way I imagine this might work: use the camera to record a 3D video by centering on a subject and moving the device in a U shape around it, then stitch together the frames captured by both lenses and adjust their depth relative to one another, producing an image that can be viewed smoothly from any angle within the captured range.
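Just to make the idea concrete, here is a minimal sketch of the playback side: given frames captured at known angles along the arc, pick the two frames bracketing the viewer's current angle and blend between them. All the names here (`select_view`, the frame labels) are made up for illustration, and a real implementation would obviously interpolate actual image and depth data rather than labels.

```python
import bisect

def select_view(frames, viewer_angle):
    """Pick the two captured frames bracketing the viewer's angle and
    return them with a blend weight for simple linear interpolation.

    `frames` is a sorted list of (angle_degrees, frame_id) pairs
    covering the captured arc; `frame_id` stands in for real image data.
    """
    angles = [a for a, _ in frames]
    # Clamp to the captured range so the subject stays in view at the edges.
    if viewer_angle <= angles[0]:
        return frames[0][1], frames[0][1], 0.0
    if viewer_angle >= angles[-1]:
        return frames[-1][1], frames[-1][1], 0.0
    i = bisect.bisect_right(angles, viewer_angle)
    (a0, f0), (a1, f1) = frames[i - 1], frames[i]
    t = (viewer_angle - a0) / (a1 - a0)  # blend weight toward f1
    return f0, f1, t

# Frames captured every 15 degrees across a +/-45 degree arc around the subject.
arc = [(-45, "L3"), (-30, "L2"), (-15, "L1"), (0, "C"),
       (15, "R1"), (30, "R2"), (45, "R3")]
print(select_view(arc, 7.5))  # → ('C', 'R1', 0.5)
```

With head tracking feeding `viewer_angle` in real time, this kind of lookup would let the device re-render the scene continuously as I move relative to the screen.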
Is this type of feature planned or under consideration? I am really looking forward to seeing everything that is included in future devices, and this would be a really nice feature to have!