We’ve been experimenting with multiple Kodak PixPros!
The PixPro is a single camera with a single super-wide-angle lens. It’s advertised as “360” in that it captures a full great circle at the edges of its field of view, but it does not capture anything close to a full sphere. One camera can, however, cover the complete field of view of a human eye, and a much greater field of view than any VR headset currently on the market, with just the default lens and default settings.
To get a full sphere, you can use two at once and stitch the footage together. We taped two PixPros back-to-back and stitched them using Kolor Autopano Pro, shown in comparison with footage from the Ricoh Theta in our “Back-to-Back PixPros vs Ricoh Theta” test. As you can see, the resolution of two PixPros is slightly higher than that of a single Theta, but the minimum stitching distance is much greater (and stitching manually in Autopano is harder than using the Theta’s automatic stitching software).
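If you’re curious what the stitch actually does: once each fisheye is unwrapped to equirectangular (with the rear camera yaw-rotated 180°), combining the two is mostly a longitude-weighted blend around the two seams. Here’s a rough sketch of that idea; the function name, the `overlap_deg` parameter, and the assumption that both inputs are already full-size equirectangular frames are ours, not anything Autopano exposes.

```python
import numpy as np

def blend_back_to_back(front, back, overlap_deg=10.0):
    """Crude longitude blend of two equirectangular frames from
    back-to-back fisheyes. `front` covers longitudes around 0; `back`
    (already yaw-rotated 180 degrees) covers the far side. Both are
    full-size equirect images of the same shape. overlap_deg is the
    assumed shared field of view straddling each 90-degree seam."""
    h, w = front.shape[:2]
    lon = np.linspace(-180, 180, w)  # longitude in degrees, per column
    half = overlap_deg / 2
    # Weight is 1 inside the front hemisphere, 0 outside,
    # with a linear ramp of width overlap_deg across each seam.
    wgt = np.clip((90 + half - np.abs(lon)) / overlap_deg, 0, 1)
    wgt = wgt[None, :, None] if front.ndim == 3 else wgt[None, :]
    return (front * wgt + back * (1 - wgt)).astype(front.dtype)
```

A real stitcher also corrects lens distortion and searches for the best per-seam alignment, which is exactly the part that breaks down for close objects.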
If you want to use just one PixPro to capture a section of the sphere and view it in VR, you can use the PixPro app to “unwrap” the footage into equirectangular footage (the app calls it “YouTube Format”), though at the moment the app doesn’t do a very good job of it: there’s a significant amount of distortion toward the edge of the lens’s field of view. The app also currently can only “unwrap” with the assumption that the center of the footage is facing straight up or down, so you’ll have to manipulate the footage in another program if you want to change where the horizon is (such as by stitching the equirectangular footage to nothing in Autopano Pro, then changing the horizon before export).
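For the curious, the “unwrap” is just a change of projection from the circular fisheye image to equirectangular. Here’s a minimal sketch for a single frame, assuming an idealized equidistant fisheye pointed straight up; the real PixPro lens deviates from that ideal, which is presumably where the app’s edge distortion comes from. The function name and the `fov_deg` default are our placeholders, not PixPro specs, so check your camera’s actual field of view.

```python
import numpy as np

def fisheye_to_equirect(img, fov_deg=235.0, out_w=1024, out_h=512):
    """Map a circular fisheye frame (camera facing straight up,
    equidistant lens model assumed) onto an equirectangular panorama.
    fov_deg is a placeholder -- use your lens's actual field of view."""
    h, w = img.shape[:2]
    cx, cy, radius = w / 2, h / 2, min(w, h) / 2
    # Output grid: longitude spans the full circle, latitude the full sphere.
    lon = np.linspace(-np.pi, np.pi, out_w)
    lat = np.linspace(np.pi / 2, -np.pi / 2, out_h)
    lon, lat = np.meshgrid(lon, lat)
    theta = np.pi / 2 - lat  # angle away from the optical axis (the zenith)
    # Equidistant model: image radius grows linearly with that angle.
    r = theta / np.radians(fov_deg / 2) * radius
    u = (cx + r * np.cos(lon)).astype(int)
    v = (cy + r * np.sin(lon)).astype(int)
    # Anything beyond the lens's field of view stays black.
    valid = (r <= radius) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    out = np.zeros((out_h, out_w) + img.shape[2:], dtype=img.dtype)
    out[valid] = img[v[valid], u[valid]]
    return out
```

Changing where the horizon is amounts to rotating those direction vectors before sampling, which is the step the app doesn’t yet offer.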
I recommend Kodak’s own “James” tutorials for things like using the camera and changing its settings: https://youtu.be/3BKuPYBrvD8
“MarkHawkCam” has some tutorials on the unwrapping software and its quirks: https://youtu.be/egsU-sdvUrE
Luckily, while there’s a lot of distortion in the final equirectangular format video, the distortion is at least consistent, which means two cameras side-by-side can create working stereo that fills an entire human field of view! So we put velcro on the back of each of our cameras, and started sticking them to everything.
We’ve been wanting to experiment with two ultra-wide-angle side-by-side cameras for semi-spherical stereo for a while, and the PixPro provided the perfect opportunity. We started with two cameras on the floor facing up, meant to be watched while lying down in a particular orientation. We were happy to find that the two cameras’ views do indeed mesh (similar experiments with the Ricoh Theta failed, because the two Thetas’ stitching distortion was slightly different, possibly due to inconsistencies in the gyroscopes).
We show example footage, as well as explaining some of our initial findings, in “Stereo Spherical Experiments: Side-By-Side PixPros in the Dome”:
The “unwrapping” done by PixPro’s app is visibly stretched too tall around the edges of the camera’s field of view, something that’s easy to see when inside a geodesic dome. For stereo, this can exaggerate the stereo and make it un-meshable. I recommend keeping the cameras as close together as possible if you want to be able to mesh stereo that’s within a few feet of the camera.
Because the PixPro software can easily do a projection assuming the camera is facing straight up or down (though I’ve heard there will be more options soon), we wanted to try both. We did a test called “Ceiling Person”, and found that downward-facing semi-spherical stereo is really effective. It feels natural to view something that you’re looking down at, whether you’re sitting or standing. Conversely, looking up for long periods of time is really only comfortable when you’re lying down, and definitely not great when sitting in front of a computer.
In the first half of Ceiling Person, we simply velcro the cameras to the ceiling of our sound room and hang out a bit. In the second half, we try widening the distance between the two cameras. Everything close to the cameras can’t be meshed, but things further away are visible in hyperstereo, like we’re tiny people in a box rather than a room.
If you do watch this in stereo, don’t strain your eyes trying to mesh unmeshable things in the experimental footage in the second half.
In general we’ve found the absolute minimum meshable distance viewers are capable of is somewhere between 2 and 5 times the distance between cameras, depending on the person, and it’s not comfortable for long periods of time (though our tests are pretty informal). The industry standard for regular stereo film puts the minimum distance at a factor of 30 or more times the camera separation (about 6 feet for eye-width cameras). I think for now we’ll see minimum distance standards for 3D 360 film settle in at more like a factor of 6 to 10 as the absolute minimum (a compromise between the limits of current spherical camera technology, the unique effectiveness of close objects in VR, and the strain of viewing), but the usual distance of things-you’re-supposed-to-be-looking-at should be much greater. Maybe we’ll go down that rabbit hole in a future post.
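Those factors map directly onto vergence angles (how much your two eyes, or cameras, have to toe in toward a point), which is a quick way to sanity-check them. A little arithmetic, assuming a ~65 mm eye-width baseline; the specific numbers are just illustrative, not measurements from our tests:

```python
import math

def vergence_deg(baseline_m, distance_m):
    """Angle between the two cameras' lines of sight to a point
    straight ahead at the given distance."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

ipd = 0.065  # assumed eye-width camera separation, in meters
for factor in (2, 5, 10, 30):
    d = ipd * factor
    print(f"factor {factor:>2}: {d:.2f} m -> {vergence_deg(ipd, d):.1f} deg")
```

The factor-of-30 rule works out to roughly a 2-degree vergence angle, while a factor of 2 demands nearly 30 degrees, which is why very close objects are such a strain: doubling the camera separation doubles the distance at which any given comfort level holds.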
Both the above stereo tests require a specific orientation for the stereo to work, different from that of most current “3D 360” videos (stereo when oriented upright for a full turn, but mono when looking up or down). These tests definitely have a specific forward direction, and for some films and audience expectations, keeping a single expected direction will be the right call.
Sitting in a non-swivel chair, facing one way and looking down to see crisp perfect stereo below you, is nice. Lying down and looking up and seeing crisp perfect stereo above you is also nice. And being able to look behind you is sometimes nice, but not always necessary. All these different expected stereo orientations will have their place in the near future of VR video.
The PixPro was perfect for these tests. Potentially they’re inexpensive and easy enough to use that we could recommend them over better cameras with non-default lenses and third-party software, but first Kodak needs to put a lot more resources into its software.