Swipe Control

Can swipe controls be added to LeiaPlayer app just like LeiaTube has when playing VR content? Gyro is good but we also need the option to use swipe controls.


Hey @martinisangry,

Different teams work on the two apps, which is why the feature is in LeiaTube but not LeiaPlayer. But yes, adding swipe controls to LeiaPlayer is on the roadmap: it’s planned for the release after the upcoming one.

That makes sense, looking forward to all the updates. Is it possible to view VR180 videos in 3D from an external USB drive or microSD card? I think I recall seeing that SBS 3D files, as long as they are already properly tagged, should play in 3D from external storage, but is the same also possible for VR videos?

For VR180/360 video files, it currently only works for files on the internal storage. Starting with this upcoming release of LeiaPlayer, it will work for VR180/360 photo and video files that are on the MicroSD card or USB-C storage.


Nice! This coming update, and the following one with swipe control in LeiaPlayer, will be much appreciated by VR users like myself.

One other quick question that has come up now that I’ve had some time to test out VR footage. I noticed something that isn’t apparent with regular SBS 3D files but mainly affects VR footage: depending on how much depth there is, or how close a moving object gets to the camera, the screen seems to “struggle” deciding what it will “focus” on, and it jitters when subjects get too close. Is there a way to minimize this?

I know VR is probably something you haven’t invested as much time in as regular 3D, but any optimization you can do for VR would be great. Not sure if there is some sort of “threshold” that could be programmed for 3D VR files to keep the display from adjusting, in order to minimize the jittering during scenes with subjects close to the camera.

Yes. This is called “Auto-Reconvergence” and it’s our AI trying to make things look good. It’s off by default on SBS but on by default for every other 3D filetype.

Starting with this next version of LeiaPlayer, you can add “_reconv” to SBS filenames to force Auto-Reconvergence on, and you can add “_noreconv” to the filenames of all other filetypes including VR180 to force Auto-Reconvergence off.
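To illustrate the naming convention, here’s a small sketch of how a player might resolve it. This is not LeiaPlayer’s actual code, and the SBS detection heuristic (`_2x1`/`sbs` in the filename) is my own assumption for the example:

```python
from pathlib import Path

def auto_reconvergence_enabled(filename: str) -> bool:
    """Decide whether Auto-Reconvergence should run for a media file.

    Sketch of the convention described above: SBS files default to off,
    every other 3D filetype defaults to on, and the "_reconv" /
    "_noreconv" filename suffixes override the default.
    """
    stem = Path(filename).stem.lower()
    if stem.endswith("_noreconv"):   # explicit opt-out
        return False
    if stem.endswith("_reconv"):     # explicit opt-in
        return True
    # Crude SBS detection -- an assumption for this sketch only
    is_sbs = "_2x1" in stem or "sbs" in stem
    return not is_sbs

print(auto_reconvergence_enabled("beach_vr180.mp4"))           # True
print(auto_reconvergence_enabled("beach_vr180_noreconv.mp4"))  # False
print(auto_reconvergence_enabled("movie_sbs.mp4"))             # False
print(auto_reconvergence_enabled("movie_sbs_reconv.mp4"))      # True
```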

Be warned: it’s on by default for VR180/360 because we’ve tested hundreds of VR media files and found that it usually makes content look better. There’s no guarantee that disabling it and just going with the file’s native convergence will look good. But you’re welcome to try it, and if it doesn’t look good you can of course manually adjust the convergence when stitching/mastering the VR footage yourself.

Lume Pad 2 is an auto-stereoscopic screen, and because of that content will not look identical between a VR headset and Lume Pad 2. These differences should be expected.


Beautiful, and on the next release already! Bravo! I love having at least the option to control it, thank you!

Speaking of VR, would it be possible for you to add additional VR FOV profiles? Right now you only have 180 and 360. Any chance of adding a 190 FOV option for the native spherical FOV of the Canon VR lens, and a 200 FOV option for native Z CAM K2 Pro footage?

I found I can still “watch” 190 and 200 FOV videos using the 180 tag, but it creates the expected warping at the top and bottom. It would be nice if you could support those other two formats.


No, the expectation is that footage from those would be encoded into the Google VR180 format. The extra degrees on those cams are really extra for stabilization, not meant to be used raw.

Are you using the Canon EOS VR Utility to stitch and encode to VR180? If so, there should be no issue and no warping (unless there was insane camera shake when capturing). If not, well, there’s your problem.

No, sorry, let me clarify: the footage from the cameras is left in its dual-fisheye format and not converted to 180 equirectangular. Not only is the footage not equirectangular, it’s in fisheye format and has an extra 10 or 20 degrees. I guess the better question is whether you can add a tag for fisheye format, specifically 190 and 200 FOV. Let me know if that makes sense. Thanks

I understand. I’m saying that we expect all footage to adhere to the standard Google VR180 format. Canon offers an official tool which takes their Dual-Fisheye capture and turns it into Google VR180.
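For anyone curious why a fisheye frame can’t just be tagged as 180 equirectangular, here’s a rough sketch of the projection math involved. This is an idealized equidistant fisheye model with my own normalized coordinates, and it deliberately ignores the per-lens calibration discussed below, which real tools apply:

```python
import math

def equirect_to_fisheye(u, v, fisheye_fov_deg=190.0):
    """Map a normalized equirect pixel (u, v) in [0, 1]^2 covering a
    front 180-degree hemisphere to normalized equidistant-fisheye
    coordinates. Illustrative only; no lens calibration is applied.
    """
    lon = (u - 0.5) * math.pi          # -90 deg .. +90 deg
    lat = (v - 0.5) * math.pi          # -90 deg .. +90 deg
    # Direction vector, z pointing out of the lens
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    theta = math.acos(max(-1.0, min(1.0, z)))       # angle off optical axis
    r = theta / math.radians(fisheye_fov_deg / 2)   # equidistant model
    phi = math.atan2(y, x)
    return 0.5 + 0.5 * r * math.cos(phi), 0.5 + 0.5 * r * math.sin(phi)
```

The point of the sketch: for a 190-degree lens, the whole 180-degree hemisphere lands at radius 90/95 of the image circle, so a player that assumes the circle spans exactly 180 degrees stretches everything slightly, which shows up as the warping mentioned earlier.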

We want to support the standard format, and not formats unique to every model of camera.

In addition, even if we did, it wouldn’t turn out well. Each Canon Dual-Fisheye lens has different calibration data, which is encoded into the capture file. Their tool decodes it and uses it for alignment and calibration when generating a VR180, but the raw fisheye capture doesn’t have that applied.

So even if we did the work to make raw capture from the Dual-Fisheye lens work, it would only look perfect for the specific lens and camera we captured it on, not other people’s cameras like yours which have a different alignment and calibration. We worked directly with Canon on this for a while, and both parties came to the conclusion that to do it right, Canon would have to implement their own app on Lume Pad 2 that can decode the calibration data from each camera.

Understood. I get that the calibration of each camera is different, and even though it wouldn’t be perfect, it would still look a lot better than using the 180 tag on 190 and 200 fisheye videos. There are studios out there that still publish in this 190 and 200 format to get slightly larger FOVs for VR headsets, so content like this does continue to be made. Anyway, thank you for clarifying that; I just wanted to see if that was something else we could eventually have the option of changing ourselves.

To be clear, the FOV of the VR headset won’t matter. No headset, not even PiMax, can show a full 180 degree video at once. Whether the video is 180, 190, or 200 degrees, you’ll always only see a fraction of the total FOV when looking at it.

The only difference is how far you can turn your head in any given direction. And obviously, if that’s what you want, 360 is superior.

The 190 and 200 degree fisheye “formats” are not standardized, even between different cameras. The only app I know of that allows you to try to make it look good is Virtual Desktop, but I haven’t had luck getting Canon Dual-Fisheye content to look good even when I adjust the FOV to match.

Getting everyone on-board with a standardized VR180 is the best way to ensure intercompatibility everywhere on every device.

I will have to disagree about 360 being a superior alternative to a slightly larger 200 FOV for getting more to the left and right. 360 cameras, much less stereoscopic 360 cameras, have bad tolerances around the stitch points for close-up subjects, and you need at least 12K for good 360 stereo that comes close to the level of fidelity we currently get from 8K 180 cameras. And currently there is no hardware on the market, that I’m aware of, that can even play 12K video, let alone on a VR headset. If all you are shooting is subjects more than a meter or two away, then sure, a stereo 360 camera that shoots 12K and is then downsampled to 8K is a good alternative, but dismissing the additional 10-20 degrees you can get by editing these other cameras’ footage in fisheye format is not something I can agree with you on.

There are plenty of players that support these formats just fine, like DeoVR and HereSphere, which is fully customizable and has no issues. But OK, I understand you have no plans to support fisheye format. I’m still happy to hear about the other updates on their way. Thanks again.

I was playing 12K 3D 360 VR video on the original Vive Focus back in 2017 using a software technique called Adaptive Viewporting. Basically, there’s no reason to decode the entire 12K video file if the user isn’t looking at the full 12K at a time, which you’re not when viewing VR180/360 video. There were a few solutions, but the one we used then was from Visbit.
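The core idea behind adaptive viewporting can be shown with a toy tiling example. This is my own simplified sketch, not Visbit’s actual algorithm: split the panorama into a grid of tiles and only decode the tiles whose centers fall near the current viewport:

```python
def visible_tiles(yaw_deg, pitch_deg, fov_deg=100.0, cols=8, rows=4):
    """Pick which tiles of an equirect 360 frame intersect the viewport.

    Toy sketch of tile-based adaptive viewporting: divide the panorama
    into a cols x rows grid and keep only tiles whose center lies within
    the viewport FOV, plus half a tile of margin on each axis.
    """
    keep = []
    half = fov_deg / 2
    for r in range(rows):
        for c in range(cols):
            tile_yaw = (c + 0.5) / cols * 360.0 - 180.0
            tile_pitch = 90.0 - (r + 0.5) / rows * 180.0
            # Wrap yaw difference into [-180, 180)
            dyaw = (tile_yaw - yaw_deg + 180.0) % 360.0 - 180.0
            dpitch = tile_pitch - pitch_deg
            if abs(dyaw) <= half + 180.0 / cols and abs(dpitch) <= half + 90.0 / rows:
                keep.append((r, c))
    return keep

tiles = visible_tiles(0.0, 0.0)
print(len(tiles), "of", 8 * 4, "tiles decoded")
```

With a 100-degree viewport, only half the grid gets decoded, which is why a 12K panorama never has to be decoded in full at once.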

If you can provide a GitHub link (like the one below) to an agreed-upon standard format that multiple manufacturers use, I will seriously consider adding it to the roadmap. If not, it doesn’t really seem like it would make sense for us.

And just FYI, we do fully support the Canon Dual-Fisheye Lens output on Lume Pad 2; it’s the device we primarily test with. But we of course test with the processed VR180 images that come out of the Canon EOS VR Utility, not the raw capture.

You are definitely correct about Adaptive Viewporting; it’s just not a process the average VR media consumer uses. My point was mainly about decoding the full file, and whether that will be possible with the next major codec release.

Again, I am not arguing or insisting that 180 equirectangular is NOT the standard; we all agree it is. All I am saying is that some studios do choose to render out in fisheye instead, with the full FOV of the camera. I want to make clear that I am not talking about the raw fisheye footage from the camera; I’m talking about fisheye footage processed and aligned in Mistika and exported in fisheye. You mention raw capture, so I just want to make sure you understand that I don’t mean raw capture footage in any of this.

Anyway, none of this is relevant. I simply wanted to know whether there was a possibility for your profiles to also support fisheye or not, and the answer is loud and clear. Thanks

By the way, here is a video from the people at DeoVR doing a Mistika tutorial on how to align, mask the lens, and render out at the full 190 FOV of the camera in fisheye, just so you can see it’s not something I’m making up, and that I’m not talking about the raw capture. I don’t personally render this way, but there are creators and studios who do choose this render style, and DeoVR fully supports it; it plays perfectly fine on their player, and you would never know the difference other than having some extra video to the sides. Again, I’m not saying this is the standard or that 180 isn’t the standard; all I’m saying is that there is content out there like this, and I wanted to know IF there was a possibility your player could support it, that’s all.

Canon R5C SGO Mistika Boutique Workflow Tutorial from DeoVR. No equirectangular conversion. - YouTube

I didn’t know that people actually processed video into fisheye, thank you for teaching me. I will look into this format and see if there’s a standard way for us to support this without dealing with a slider (like Virtual Desktop) or having to type in the FOV manually or something.

Yeah, I mean, I’ve run into this content too, especially in the early days of VR (before Google announced VR180) and rarely since then. I just assumed the reason people did it was that you could capture from any fisheye lens and it would “just work”. I know a lot of adult video creators do non-standard stuff for various reasons (I believe DeoVR is actually a subsidiary of an adult VR company), but I think most consumers and platforms have (thankfully) finally organized around a few standards.

No need to mention it

Yes, they are a subsidiary of an adult company, but their team has been doing a lot of innovative work that warrants the attention of more people in the VR space. I actually got one of their team members to order one of your tablets; they seemed very interested in your tech when I shared my experience with it so far. They are very creative people, so I’m excited to see what creative uses they can come up with.

I don’t know how early you joined the Canon VR system, especially with the release of the R5C body, but you may recall that for a long time Canon’s VR Utility and plugin did not even support 60fps RAW, which almost defeated the whole purpose of upgrading to the R5C. So the Mistika and DeoVR teams helped come up with a way to process the fisheye footage without needing to convert to equirectangular format and still get full 8K at 60fps. The amount of time and hardware required not only to render 8K RAW but also to convert to equirectangular was grueling, especially if you wanted to render anything longer than 5-10 minutes. This fisheye format was a way to cut rendering time significantly by skipping the equirectangular conversion. Since then the Canon VR Utility has been upgraded to finally support 60fps RAW, so those days have passed, but some creators still use this method, and even the people at DeoVR still produce with it.

The DeoVR team has also recently done a hardware modification on the Canon VR lens to extend its IPD to the standard 64-65mm, instead of the 60mm the lens natively has. Why Canon chose 60mm I don’t know, but it affects the scale of the image: if you look at comparisons with footage shot on a K2 Pro, FM DUO, or any other camera using a 64mm IPD, the scale on the Canon lens is slightly larger. I previously owned a K2 Pro system, so I have a lot of footage I was able to use to compare, and it’s definitely there; it’s one of those things that once you notice it, you will never not notice it again. Anyway, they somehow managed to physically modify the lens to expand the IPD, and the scaling issue has been corrected. You will have to use Mistika, because the Canon utility software will no longer work properly, but the footage looks great. I have been in contact with Hugh Hou to see if he can talk to Ivan from DeoVR to “speed up” the release of a white paper for the mod. Regardless of how noticeable the scaling issue is to most casual VR consumers, it’s still very impressive that they managed to successfully modify that damn lens haha