Apple, AI and Intelligence
On-screen Visual Intelligence: After taking a screenshot, users can analyze the on-screen content using Apple Intelligence and take actions, such as searching for similar products to purchase, adding an event to their calendar, or asking ChatGPT for more info about what’s shown.
The Verge’s new senior AI reporter, Hayden Field, noticed we didn’t hear the name of Apple’s assistant very much during the WWDC 2025 keynote on Monday. Even as presenters discussed opening up Apple Intelligence to third-party developers and new AI features for other apps, Siri itself went largely unmentioned.
We’re getting Live Translation in iOS 26 across a number of apps, improved Visual Intelligence that can now read your screen, Call Screening and Hold for You in the Phone app, and an AI-supercharged Shortcuts app.
If you want to watch Apple's keynote presentation for yourself, check out how to watch WWDC 2025 for details on the various places you can find the livestream. For those who prefer following along with a liveblog that's largely text- and image-based, scroll down for our coverage right here!
Follow along with the Gizmodo crew as we unpack everything Apple announces at its annual developer conference in Cupertino, Calif.
Apple's 2026-branded operating systems will adopt unified version numbering and a major visual overhaul called the Liquid Glass design language.
Apple revealed iOS 26 with the Liquid Glass redesign at WWDC 2025, but the absence of a killer AI feature left analysts questioning its competitive edge.