At the 2025 Worldwide Developers Conference (WWDC25), Apple introduced a handful of AI features under the banner of “Apple Intelligence”, but kept its ambitions deliberately modest.
There were no sweeping announcements or headline tools at WWDC25, just practical tweaks aimed at improving the user experience.
From live call translation to on-device smart summaries, what Apple announced was measured and minimal. Siri didn’t get the overhaul many had expected after last year’s promises.
Instead, Apple focused on tighter, more secure system integration and privacy-led enhancements; nothing that screamed innovation.
Craig Federighi, Apple’s senior vice president of Software Engineering, confirmed developers would now be able to access the on-device large language model embedded in Apple Intelligence.
“We’re opening up access for any app to tap directly into the on-device, large language model at the core of Apple,” he said. That’s important for privacy, but not necessarily performance.
The on-device model itself is limited. It runs on about 3 billion parameters, small compared with the cloud-based systems used by Microsoft or Google. It cannot handle large, complex tasks, which leaves it behind in areas where generative AI is beginning to thrive.
Still, Apple’s strategy is to stay local and stay secure. Features like call screening, where an iPhone can pick up an unknown call, ask why the person is calling, and transcribe the response before the phone even rings, are clever.
So is the live call translation that doesn’t require the other caller to own an iPhone. It’s thoughtful tech, but hardly disruptive.
The redesigned operating systems, featuring what Apple calls a “liquid glass” aesthetic, are another example. It looks sleek, but it’s not revolutionary. It’s enabled by improved Apple silicon, and now all OS platforms, from iPhone to Mac, will adopt a consistent naming convention, with version numbers aligned to the year (iOS 26, macOS 26 and so on). The unified numbering is a way to streamline branding.
Image Playground, which now allows users to generate pictures using ChatGPT, was also showcased. Apple says user data won’t be shared with OpenAI unless the user consents. This cautious collaboration highlights how Apple is trying to balance innovation with its longstanding privacy-first ethos.
What’s missing from all of this is clarity on vision. A year ago, Apple spoke of intelligent agents and a new era of AI. That talk has all but disappeared. Analysts are taking note.
“In a moment in which the market questions Apple’s ability to take any sort of lead in the AI space, the announced features felt incremental at best,” said Thomas Monteiro, senior analyst at Investing.com. “It just seems that the clock is ticking faster every day for Apple.”
That ticking clock got louder when OpenAI, during Apple’s WWDC25 event, announced it had hit $10 billion in annualised revenue. One company is accelerating into the AI future; the other is inching forward.
Even within the developer community, there are questions. Apple’s Foundation Models Framework allows developers to plug into Apple Intelligence, but only the on-device version. The high-powered, cloud-backed models that could have taken these tools to the next level are staying behind closed doors.
Ben Bajarin, CEO of Creative Strategies, said: “You could see Apple’s priority is what they’re doing on the back-end, instead of what they’re doing at the front-end, which most people don’t really care about yet.”
Investors, it seems, agreed. Apple shares fell 1.2% by the end of the day, hardly a collapse, but a sign that the market wasn’t impressed.
If Apple is laying the foundation for bigger things, it is doing so quietly. It’s a deliberate approach, and one we hope pays off rather than leaving the company behind its competitors.