Bryant Frazer did a Q&A with Ross Shain on Mocha VR and tracking for 360 video. Based on Imagineer’s Academy Award-winning planar tracking algorithm, Mocha VR is the first plug-in to bring native 360-optimized motion tracking, masking, object removal and horizon stabilization tools to Adobe Premiere Pro CC, After Effects CC, Avid Media Composer, The Foundry’s Nuke and Blackmagic Design Fusion.
Click here to watch the video on YouTube
Will you be doing stitching in Mocha VR?
We looked at stitching, and we thought we had some interesting tools that could be applied. But I personally think stitching is something that will be covered more and more by the camera technology. We’ve seen it on the low-end consumer cameras — just a two-camera system as opposed to a six-camera system — but I don’t think stitching, over the long term, is going to be the biggest hurdle. One of our customers, Koncept VR, uses Mocha a little bit to fix the stitching. They’ll do a pass or two of stitching in GoPro Kolor, but say someone on screen is moving diagonally toward the camera: you’ll see them ghosting as they cross a stitch line. This customer is doing a lot of rotoscoping in Mocha, and they’ll pull a clean area from either side of the stitch, from the original cameras, so they can rotoscope across it, replacing the person as they cross the stitch so it’s not as noticeable.
So the tools can clean up some artifacting after the stitching takes place.
Yeah. We didn’t design it for this purpose, but because Mocha is a workhorse VFX tool used for all kinds of clean-up, people are employing it to fix stitching.
The last thing to mention, and you brought this up earlier, is that a lot of the negatives in 360 are nausea-inducing experiences. When people have a bad first experience with a headset, they never want to put it back on. And a lot of filmmakers are moving the camera around on drones, on rigs, and on cars, so jittery footage can be a real problem. That was one of the big things customers told us during the beta: they were looking for new ways to stabilize. So we came up with something we call Horizon Stabilization. We use the planar tracker to track an area on the horizon. Often things on the horizon are out of focus or occluded, but the planar tracker is powerful enough to handle that, so we might track some clouds or another area along the horizon. Then we stabilize the relationship between the horizon motion and the camera itself. It’s cool because in 360 you have access to all the pixels. In traditional stabilization you end up having to scale the image once you’ve locked something down, but in 360 we can use the seamless wrap of pixels to stabilize the motion. We’ve had a lot of great feedback on this.
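The key point, that a 360 frame can be stabilized by reorienting the whole pixel sphere rather than cropping and scaling, can be sketched in a few lines. This is an illustrative approximation, not Mocha VR’s actual algorithm: it assumes you already have per-frame yaw/pitch/roll angles recovered from tracking, the `stabilize_equirect` function and its sign conventions are invented for this sketch, and nearest-neighbour sampling is used for brevity.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Combined rotation (radians): yaw about z, pitch about y, roll about x."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def stabilize_equirect(frame, yaw, pitch, roll):
    """Counter a tracked camera rotation by remapping an equirectangular
    frame (H x W or H x W x C). No scaling: every output pixel is sampled
    from somewhere on the seamless 360 sphere."""
    h, w = frame.shape[:2]
    # Output pixel grid -> longitude/latitude at pixel centers
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi       # [-pi, pi)
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi       # (+pi/2 .. -pi/2)
    lon, lat = np.meshgrid(lon, lat)
    # Lon/lat -> unit direction vectors on the sphere
    dirs = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)
    # Rotate every view direction by the tracked camera rotation
    rotated = dirs @ rotation_matrix(yaw, pitch, roll).T
    # Back to lon/lat, then to source pixel coordinates (wrap at the seam)
    src_lon = np.arctan2(rotated[..., 1], rotated[..., 0])
    src_lat = np.arcsin(np.clip(rotated[..., 2], -1.0, 1.0))
    u = ((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    v = ((np.pi / 2 - src_lat) / np.pi * h).astype(int) % h
    return frame[v, u]
```

Because the remap only reads pixels that already exist somewhere in the equirectangular frame, there is no blank border to hide, which is why 360 stabilization avoids the zoom-in that traditional lock-down stabilization requires.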
Click here to read the full article on StudioDaily