I think I wildly underestimated the impact of iOS 16 and its new Lift Subject from Background feature. This is next-level image stuff that fundamentally changes how you can interact with the 15-year-old platform.
Let’s start by getting something clear: Apple’s next big mobile platform update, iOS 16, is still months away from final release and is currently only in developer beta. The public beta could arrive as soon as next week (the week of July 3). This means that, while I can talk about what I’ve learned, I can’t show you any more than what we all saw during Apple’s WWDC 2022 keynote last month.
Granted, the demo of someone grabbing a bulldog out of a photo and casually dropping it into a Messages thread was pretty cool on its own. Actually using it, though, is something else.
Hold it
From what I can tell, it doesn’t matter what kind of photo from your library you use, or even its age. Virtually any photo with a clear subject (or subjects) is fair game for the Lift Subject from Background feature.
In my library, I opened photos shot with my iPhone 13 Pro, iPhone 8 Plus, iPhone 7, and iPhone 6 and was able to select subjects in all of them.
As demonstrated in the keynote, you open the photo on the iPhone and place your finger on the subject (or multiple subjects; it’s happy to let you grab a group of people). You know your iPhone has found the subject thanks to a cool visual effect that runs a marquee around it and puts it under your finger’s control.
As Apple told me last month, the ability to identify subjects is all part of the company’s rapidly developing image-segmentation technology. Apple uses it on the lock screen to put just your image subject in front of the time. In the case of Lift Subject from Background, it lets you select and move the photo subject almost anywhere.
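For the curious, Apple has exposed pieces of this segmentation work to developers for a while. Below is a minimal Swift sketch using the public Vision framework’s person-segmentation request (available since iOS 15). Apple hasn’t said exactly what powers Lift Subject from Background, so treat this only as an illustration of how a subject gets separated from its background, not as the feature’s actual implementation.

import UIKit
import Vision

// A rough sketch of the kind of segmentation Apple describes, using the public
// Vision framework's person-segmentation request (iOS 15 and later). The exact
// machinery behind Lift Subject from Background is not public; this only
// illustrates separating a subject from its background.
func personMask(for image: UIImage, completion: @escaping (CVPixelBuffer?) -> Void) {
    guard let cgImage = image.cgImage else {
        completion(nil)
        return
    }

    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate   // favor mask quality over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
            // The result is a per-pixel mask: bright where the person is,
            // dark everywhere else.
            completion(request.results?.first?.pixelBuffer)
        } catch {
            completion(nil)
        }
    }
}

That mask is the raw material: once the system knows which pixels belong to the subject, it can cut that subject out and hand it to you.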
It’s more
I think I understood what I saw during the WWDC keynote demonstration, but it wasn’t until I tried the Lift Subject from Background feature myself that I understood the radical iOS change that comes along with it.
Look, it’s cool that iOS 16 can identify and lift any subject (person, flower, bird, dog) from a photo. What I didn’t understand was how you might move that subject elsewhere. This is not a cut-and-paste feature; it’s also not a photo-editing feature, à la the Google Pixel’s Magic Eraser. It’s more like a mobile platform magic carpet ride.
Once I had a subject selected, I paused for a moment as I tried to figure out what to do with the floating image under my finger. How would I get it to Messages as they did during the WWDC demo?
Instinctively, I kept one finger on the subject and, with my other hand, swiped up from the bottom of the screen to get to my home screen. Then I selected Messages.
I found I could hover with the captured image over my messages list and drop it into one of the threads, or go directly to an open message conversation.
Alternatively, I could open a different app like Notes or Keynote and drop the subject in. As long as I held my finger on the captured subject, I could do whatever I wanted with my other hand, including launching new apps or swiping up one-third of the way from the bottom of the screen to see all my open apps and choose the one where I wanted to drop my subject.
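If you’re wondering how an image can ride along under one finger while the other hand drives the rest of the phone, it looks a lot like the system-wide drag-and-drop support that has been in UIKit for years. Here’s a minimal Swift sketch of that mechanism, assuming the lifted subject is handed around as an ordinary drag item; this is my illustration of the underlying idea, not Apple’s actual code.

import UIKit

// A minimal sketch, assuming the lifted subject travels as a standard UIKit
// drag item. This is not Apple's implementation, just the long-standing
// drag-and-drop API that lets an image follow your finger between apps.
final class SubjectImageView: UIImageView, UIDragInteractionDelegate {

    func enableDrag() {
        isUserInteractionEnabled = true
        addInteraction(UIDragInteraction(delegate: self))
    }

    // Called when the drag begins; whatever we return here is what Messages,
    // Notes, or Keynote receives when the subject is dropped into it.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = image else { return [] }
        let provider = NSItemProvider(object: image)
        return [UIDragItem(itemProvider: provider)]
    }
}

The interesting part isn’t the API itself; it’s that Apple wired a photo’s subject into this pipeline so casually that it feels like the image simply belongs to your finger.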
I couldn’t recall ever seeing iOS work in this fashion before, almost like a multi-window system.
It’s weird, cool, and a distinct departure from previous versions of iOS. We’ve always had multi-touch, but this is like multi-modal touch — and with a pretty wild new image feature to boot.
It’s possible that Lift Subject from Background will undergo many changes before Apple launches the final version of iOS 16 in the fall, but I don’t see Apple walking back this near-revolutionary change (which also happens to work in iPadOS 16). It’s the start of something big.