iPhone 15 Pro Owners Can Now Record Spatial Video for Apple Vision Pro

Today’s iOS beta release adds the new video recording feature, and we went hands-on

With the announcement of Vision Pro, Apple also introduced Spatial Video, its format for immersive 3D video. And when the iPhone 15 Pro was unveiled, Apple shared that in a future version of iOS the devices would be able to capture Spatial Video for playback in Vision Pro. Today, with the latest iOS beta release, owners of the latest Pro phones can start capturing content to be viewed in Vision Pro once it’s available early next year. I had a chance this week to demo capturing Spatial Video and playing it back in Vision Pro, alongside a deeper dive into how headset users will be able to interact with and immerse themselves in photos and videos. My second experience using Vision Pro left me just as excited as my first, if not more so.

Once turned on in iOS settings, capturing Spatial Video is quite simple. There’s a toggle in the Camera app’s video mode to record in spatial, which enables simultaneous recording from the main and ultra-wide cameras. When the two feeds are compared and combined, depth is revealed, but it will only be perceivable when viewed in Vision Pro because the headset uses a separate screen for each eye. Just like other 3D film formats, the depth is perceived in our brains while looking at slightly offset images in each eye. When viewing a video recorded in Spatial mode on the iPhone, depth is not perceivable and it looks and feels just like any other video.
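For a rough sense of how two side-by-side cameras translate into depth, the sketch below applies the standard stereo relation (depth = focal length × baseline ÷ disparity). This is a minimal illustration of the general principle, not Apple’s actual pipeline, and the numbers are hypothetical rather than iPhone 15 Pro camera specifications.

```swift
import Foundation

/// Estimate the distance to a point seen by two side-by-side cameras, using the
/// standard pinhole stereo relation: depth = (focalLength * baseline) / disparity.
/// All values are illustrative placeholders, not real iPhone camera parameters.
func estimatedDepth(focalLengthPixels: Double,
                    baselineMeters: Double,
                    disparityPixels: Double) -> Double? {
    guard disparityPixels > 0 else { return nil }  // zero disparity = point at infinity
    return (focalLengthPixels * baselineMeters) / disparityPixels
}

// Hypothetical example: a feature that shifts 40 pixels between the two camera views.
if let depth = estimatedDepth(focalLengthPixels: 2_800,
                              baselineMeters: 0.014,
                              disparityPixels: 40) {
    print(String(format: "Estimated depth: %.2f m", depth))  // ~0.98 m
}
```

The nearer a subject is, the larger the offset (disparity) between the two feeds, which is why close-up subjects are where the sense of depth reads most strongly in the headset.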

Recording Spatial Video only works when the phone is in landscape orientation. This is because the two cameras in use need to be side by side to capture stereoscopically. Landscape viewing is also more immersive than portrait and thus better suited to headset use. Once the phone is held horizontally and Spatial mode is enabled, recording is the same familiar process as always. It’s advised to keep the device level and avoid quick or jerky movements, which makes sense given how unnerving it can be in any headset to sit still while the world rendered in front of you moves wildly. I had the opportunity to film a sushi chef at work and tested a bit of vertical panning and moving in and out of different elements of the scene to better understand filming for depth and how movement might feel once played back in the headset.

After the filming I put on a Vision Pro, did a quick eye-tracking calibration and then navigated to the Photos app. By this point it was already clear that visionOS has evolved significantly since my first demo in June; it’s even faster, smoother and more responsive. Once in the Photos app I learned some new look, pinch and stretch gestures and zoomed into incredibly high-resolution images I’d placed onto the wall in the room where I was sitting. As someone who’s easily distracted and visually sensitive, the ability to look at images without distraction is a gift. I then watched a selection of Apple-produced content as well as my own test recording in Vision Pro, and the experience was delightful. Compared to the 3D formats used in cinemas today and in other VR headsets, the quality and clarity here was far superior, likely because of the impressively bright, fast and high-resolution screens in Vision Pro. One video followed two people as they hiked along a grassy trail, and though the camera was moving, sitting still and watching didn’t create any motion sickness. Neither did my short film of the sushi chef. To be fair, both clips were short.

Apple still has not announced a release date for Vision Pro, saying only that it will arrive early next year. I can’t wait to test it in the wild.
