After much hype, Apple Intelligence, the company’s suite of AI features, finally shipped to users this week with the iOS 18.1 update. I have been using these features for months through beta software, and the feature set rolling out this week is more about adding small conveniences than about letting you plan or brainstorm ideas with ChatGPT or search the web with Perplexity.
One of my favorite features in the rollout is the ability to type to Siri as well as talk to it. This lets me easily set timers or ask for currency conversions without Siri responding, “I didn’t quite understand that.” You can enable it under Settings > Apple Intelligence & Siri > Talk & Type to Siri by switching on the Type to Siri toggle, and then double-tap the bottom bar anytime to invoke Siri.
This is more of a convenience than a transformation. The new Siri is somewhat better at understanding you when you stutter or change a timer from 10 minutes to 15 mid-request. However, for a lot of queries — such as “What can you use as a pine nut substitute?” — Siri will still pull knowledge from the web or redirect you to a website.
One of my most-used features over the last few weeks has been Apple’s photo cleanup tool. It lets you remove objects from photos, such as a passerby photobombing your selfie, through either automatic or manual selection.
The removal is not flawless: if you look closely at the edited area, you will see blemishes, and at times these artifacts are quite noticeable. But with some cropping and clever filter use, the results are good enough for posting on social media, at least. After I demoed this feature to a few friends, they started routinely sending me photos to clean up. Notably, Google has its own photo cleanup tool, Magic Eraser, which is available on Pixel phones and in Google Photos.
Notification summary is a tricky feature. You can choose to summarize notifications from all, none, or selected apps. The summary is mostly accurate, but at times, it reads weirdly. For example, my friend said in separate messages that she didn’t know why her feet hurt. The end result was this:
Admittedly, the summary feature did help me clear notifications from the tray that were purely informational or didn’t need my attention at that moment.
I am not alone in finding the notification summaries occasionally funny. Plenty of people on social media have posted about them, including one person who claimed he learned he was being dumped through a summary.
I don’t know what the right combination of useful and funny is, but Apple’s Craig Federighi told The Wall Street Journal that, at times, the summaries might not land well with users. Plus, the system won’t automatically summarize notifications in contexts that might be sensitive.
Apple’s Writing Tools feature is something I haven’t used much on the iPhone. That’s partly because I do most of my writing on a desktop, and partly because I didn’t feel the need to have AI make my email more “professional.” I used the proofread function on occasion, and it was good enough for a basic check. Early in my testing, I noticed the tool stumble over swear words and topics like drugs, killing, or murder. It might not be much use if you are writing a thriller plot.
Apple’s first set of features doesn’t generate text or images the way tools like ChatGPT, Gemini, or Claude do. The second set of AI features, available in the developer beta, may pull off some of those magic tricks while also integrating with ChatGPT for answers and text generation. For now, Apple Intelligence might make your life a tiny bit better, but it might not be enough to convince you to buy a new iPhone that is “built for Apple Intelligence.”