PSA: iOS 26 Spatial Scenes will work on iPhones 12 and up [U]

Update: A previous version of this post incorrectly stated that the feature works with any iOS 26-compatible iPhone. It has been updated to clarify that the feature is only supported on iPhone 12 and later.

Today, Apple announced Spatial Scenes, a new iOS 26 feature that turns 2D photos into immersive 3D effects. And here’s the good news: even if you don’t have an iPhone compatible with Apple Intelligence, you’ll still get access to it. Here’s how it works.

It’s AI, but it’s not Apple Intelligence AI

As Apple explains it, the Spatial Scenes feature works by using advanced computer vision techniques running on the Neural Engine to reconstruct depth from flat images.

The result is a spatially reactive version of your photo that subtly shifts and animates as you move your phone, giving it a dynamic, almost video-like presence.
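Apple hasn't published implementation details, but the general idea is well understood: estimate a per-pixel depth map from the flat image, then shift pixels in proportion to their depth as the phone tilts, so nearer content moves more than the background. Here's a deliberately minimal sketch of that parallax step in Python, assuming the depth map has already been estimated by some model (the function name and the single-axis shift are illustrative, not Apple's actual pipeline):

```python
import numpy as np

def parallax_shift(image, depth, tilt):
    """Toy depth-based parallax: shift pixels horizontally in
    proportion to their estimated depth.

    image: (H, W) grayscale array
    depth: (H, W) array in [0, 1], where 1.0 means nearest to the camera
    tilt:  maximum horizontal shift in pixels for the nearest content
    """
    h, w = image.shape
    out = np.zeros_like(image)
    cols = np.arange(w)
    for y in range(h):
        # Nearer pixels (higher depth) are displaced further,
        # which is what creates the 3D impression as the view moves.
        shift = np.round(depth[y] * tilt).astype(int)
        new_cols = np.clip(cols + shift, 0, w - 1)
        out[y, new_cols] = image[y, cols]
    return out
```

A real implementation would also inpaint the regions revealed behind foreground objects and render at display rate as the device's motion sensors update the tilt value; this sketch only shows why a depth map is the key ingredient.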

Apple first showed off the feature as part of its updated Lock Screen, but later explained that the same 3D effect will be built right into the Photos app, meaning you can relive any of your past moments with a surprising new sense of depth and motion.

While it uses on-device AI to create the effect, Spatial Scenes doesn’t rely on the full Apple Intelligence stack. So anyone with an iPhone 12 or later will still be able to enjoy it. What’s more, it works with pretty much any photo you might have in your photo library.

Spatial Computing makes its way to iOS


In a way, Spatial Scenes in iOS 26 can be seen as a direct offshoot of the Spatial Photos format introduced with Apple Vision Pro. But instead of relying on dual-camera depth data or stereoscopic image pairs, it reconstructs 3D depth using advanced monocular computer vision techniques on-device.

From a platform perspective, this is a meaningful bridge: it brings a core spatial computing experience to millions of users who haven’t yet used Apple Vision Pro. By adding depth and motion to everyday photos, Apple is quietly turning the Photos app into something more immersive and emotionally engaging, mirroring one of the headset’s biggest selling points.

For ongoing updates and full coverage of WWDC25, head over to our news hub.

Are you excited about Spatial Scenes? Let us know in the comments.

You’re reading 9to5Mac — experts who break news about Apple and its surrounding ecosystem, day after day.

Author

Marcus Mendes

Marcus Mendes is a Brazilian tech podcaster and journalist who has been closely following Apple since the mid-2000s.

He began covering Apple news in Brazilian media in 2012 and later broadened his focus to the wider tech industry, hosting a daily podcast for seven years.