Apple Intelligence

You could say that Apple Intelligence takes a handful of ChatGPT-class chatbot features, exposes each with a button to press rather than a text box to type in, and weaves in some pretty animations and colourful effects when you invoke them. That’s a curt summary, but it is not necessarily wrong. Everything Apple Intelligence does, we’ve seen before.

However, what makes it profound is the intentionality of the design, and the way in which these features are being realised. The marketing is straightforward and easy for people to understand, and the features are integrated naturally into the operating system surfaces that people already use. In fact, most of the ‘new’ features are things the OS ostensibly already does: text editing and manipulation, notification management, smart replies, transcriptions, and — yes — emojis. Apple isn’t trying to convince people of wholesale new dimensions of what a phone is capable of. It’s taking what users already do and making it better with modern AI techniques, so that users can extract more value out the other end.

In the demos and screenshots, all of these new buttons and panels fit naturally into the iPhone experience as we know it. That’s a significant point, and it speaks to how well Apple has executed here. They could very easily have screwed this up and announced a bunch of whizz-bang, discombobulated, AI-powered stuff that would ultimately land as extraneous gimmicks, go unused in practice, and feel bolted on just for the sake of it.

The exception is the Image Playground stuff. I don’t think it is completely pointless, but it’s the hardest feature to find long-lasting value in; it might end up a rarely used gimmick, like Memoji.

Everything else, though, I don’t see going that way at all. These features are compelling and — assuming they work as advertised — will be used en masse. I was pleasantly surprised that Apple is offering all of this for free. Despite being powered in part by the Private Cloud Compute server infrastructure, Apple has not used it as an opportunity to slap another subscription fee upsell onto its user base (at least not yet).

That means every iPhone 16 buyer this fall is going to be delighted by notification summaries of group text threads when they wake up in the morning, appreciate being able to scan their list of emails and actually get a sense of what each message is about before tapping through, and enjoy creating emojis that have never existed before and sending them to their friends.

I am personally looking forward to all the new Siri improvements, although it remains a little murky exactly what will get better. The semantic index stuff isn’t shipping until next year, and it doesn’t seem to cover everything. New Siri might be able to dig up information from my emails and text messages, but will it still get confused if I ask it to turn off two HomeKit lights at the same time? I don’t know yet whether Siri’s general knowledge base and understanding of user intent have evolved. I hope so, but Apple was evasive on specifics. My impression is that while new Siri makes progress on that front, falling back to “I found this on the web” answers will still be relatively common.

Perhaps my biggest disappointment of the entire endeavour is that there is no indication of how any of this could conceivably come to products like the Watch or HomePod, Apple’s most voice-oriented devices. Maybe they are working on solutions behind the scenes, but as it stands right now, it’s a big gap in the Apple Intelligence strategy.