Apple is finally building the AI photo editor that Google and Samsung have had for years


Google’s Photos app has, for years, been doing things that Apple’s Photos app couldn’t, and the iPhone maker has noticed. Bloomberg’s Mark Gurman, in his latest report, claims that iOS 27, iPadOS 27, and macOS 27 will come with a dedicated “Apple Intelligence Tools” section inside the Photos editing interface.

The “Apple Intelligence Tools” section will include three new AI-powered photo-editing features: Extend, Enhance, and Reframe. Before we get into what each feature actually does, note that all of them will run entirely on-device and, in typical Apple fashion, complete their edits in seconds.

What will the new Apple Intelligence photo editing tools do?

Extend, as the name suggests, extends a picture’s boundaries by generating new imagery and seamlessly stitching it to the existing shot. You should be able to use the feature to add surroundings to close-up shots or add negative space to either side of the subject.

Enhance, on the other hand, works as a one-tap enhancement button that immediately adjusts the color, lighting, and overall image quality, so you don’t have to dig through different editing options and fiddle with various sliders.

Reframe is designed primarily for spatial photos captured for the Vision Pro headset. It lets users shift the perspective of a 3D image after it’s already been taken, allowing you to move from a front-facing to a side-facing view. 

Is Apple actually ready to release all three features?

Not at the moment, no. Per Gurman, both the Extend and Reframe features are producing inconsistent results in internal testing. If the underlying AI models don’t improve, or the results don’t get significantly better before the September launch event, Apple might delay the features or scale them back.

While I’m a big fan of the Apple Photos app myself, it currently offers only one AI-based editing feature, Clean Up, and it doesn’t work as well as the equivalent features on other smartphones like the Galaxy S or Pixel flagship series.

I remember when Google released its Magic Editor in 2023, and Samsung’s Galaxy AI followed quite aggressively in the years after. In response, the best Apple could come up with was Clean Up. In my opinion, Apple genuinely needs the Extend, Enhance, and Reframe features to work, and to work in time for a showcase at WWDC 2026 and a public release in September.

There’s something oddly brilliant about outsourcing your curiosity to an AI that doesn’t get tired or awkward. After all, if an AI agent can call thousands of pubs and build a Guinness price index, why stop there? Why not send one loose into the wild to track the cost of your daily caffeine fix or your late-night ramen cravings?

I’m sold — I want one of those

That’s exactly the kind of domino effect sparked by a recent experiment inspired by Rachel Duffy from The Traitors. A developer built an AI voice agent that sounded natural enough to chat with bartenders and casually ask for Guinness prices, compiling the data into a public index. It worked so well that most people on the other end didn’t even clock that they were speaking to a machine. And just like that, a slightly chaotic, very clever idea turned into something surprisingly useful.

Now imagine applying that same idea to coffee and ramen. Because if there are two things people are oddly loyal and sensitive about, it’s how much they’re paying for a flat white or a bowl of tonkotsu.

A “CaffIndex,” for instance, could map out the price of cappuccinos across cities, highlighting everything from overpriced aesthetic cafés to hidden gems that don’t charge $3 for foam. Similarly, a “Ramen Radar” could track where you’re getting the most bang for your broth, whether it’s a premium bowl or a spot that somehow gets everything right. Don’t giggle, I’m serious.

The appeal isn’t just novelty. It’s scale. Calling up a handful of places yourself is tedious. Getting real-time, city-wide data? Nearly impossible. But an AI agent doesn’t mind dialing a thousand numbers, repeating the same question, and logging every answer with monk-like patience. What you get in return is a living, breathing map of prices.

It’s not all sunshine and roses

Of course, it is not all smooth sipping and slurping. There is a slightly uneasy side to this, too. Questions around consent and transparency start to creep in, and you cannot help but wonder whether every business would be okay with being surveyed by an AI that sounds just a little too real. In the original experiment, the AI was designed to be honest when asked directly, but let’s be real: most people aren’t going to question a friendly voice casually asking about prices. It feels harmless in the moment, and that is exactly what makes it a bit tricky.

Still, there is something genuinely exciting about the idea. Not in a scary, robots-are-taking-over kind of way, but in a way that makes you pause and think, this could actually be useful if handled right. Prices are creeping up everywhere, from your rent to that comforting bowl of ramen you treat yourself to after a long day. Having something that keeps track of it all feels like a small win.

Maybe that is the real takeaway here. Today it is Guinness. Tomorrow it could be your morning coffee or your go-to ramen spot. It makes you wonder how long it will be before your phone steps in, calls up a café, asks about their espresso, and saves you from spending more than you should. Because honestly, if AI is willing to do the boring work for you, the least it can do is make sure your next cup and your next bowl actually feel worth it.