Apple rumored to be in talks with Anthropic and OpenAI about powering new Siri
A rumor around Apple's AI initiatives suggests the company could turn to third parties instead of relying on its own Apple Intelligence for an upgraded Siri, but there's likely a different story.

Apple could bring ChatGPT to Siri in a more integrated way
Apple has been in hot water, at least optically, since it announced Apple Intelligence upgrades that never shipped. Its more zealous customers and shareholders feel betrayed by the company, and lawsuits have even sprung up over the failure.
However, according to a report from Bloomberg, Apple is exploring several options around how it can bring more powerful AI tools to its platforms. Specifically, it seems Apple could be looking to Anthropic or OpenAI to run the backend of an AI-powered Siri.
Or, at least, that's what anonymous "people familiar with the matter" have shared. The report leans into this reasoning, flawed as it is, though there's likely a better explanation for Apple's discussions with the companies.
What the secret people are saying
It seems Craig Federighi and Mike Rockwell believe Apple Intelligence is inferior to other models and are seeking out external help. The people familiar with the situation suggest multiple talks are underway to develop special versions of ChatGPT and Claude for Apple's Private Cloud Compute servers.
The move has allegedly caused frustration among Apple employees, who have threatened to accept lucrative offers from Meta or other competitors and leave. The team consists of about 100 people, and Apple has allegedly even had to make special retention deals to keep some of the staff.
Of course, as with anything shared by some anonymous tipster, it needs to be taken with a grain of salt. Employees who speak out in such ways usually have an axe to grind, and there's nothing more salacious than a rumor that Apple's teams are crumbling.
Whatever the people's reasons for leaking, let's examine this information from a less jaded perspective.
Apple's third-party AI gambit
Apple already works with OpenAI and Google on features that coincide with Apple Intelligence. Rumors have consistently pointed to Google Gemini eventually coming to iPhone as an alternative to ChatGPT when speaking to Siri.
There's also the new partnership with Anthropic to integrate Claude Sonnet into Xcode. It is clear Apple has already been investing heavily into third-party AI uses, so this report shouldn't be a surprise.
Judging by how Federighi and Greg Joswiak discussed Apple Intelligence during WWDC interviews, it is hard to believe that they'd have a completely different opinion internally. Instead, while they are disappointed with being seen as behind other companies in AI, they've asserted they're running a completely different race.

Apple Intelligence already works with third-party AI companies
They even pointed to Apple's search partnership with Google as an example of how it is handling AI: bringing in an external party that's the best at what it does to power something on iPhone.
It is certainly likely that Apple is in discussions with Anthropic and OpenAI to build custom models that can run on Private Cloud Compute servers, not to replace Apple Intelligence as a backend but to support it.
Today, when a user asks Siri to pass a request to ChatGPT, the request goes out over the web and the data is uploaded to OpenAI's servers. Custom versions running on Apple's servers would instead be bound by ironclad privacy restrictions and guarantee that user data stays safe.
It would also make it possible to share more private kinds of data with the AI tools.
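To make the distinction concrete, here is a minimal, purely hypothetical Swift sketch of the two routing models described above. The AssistantBackend type, the endpoints, and the request shape are invented for illustration; they are not any real Apple, OpenAI, or Anthropic API. The only point is where the request travels: out to a third party over the open web, or to a model Apple hosts on its own Private Cloud Compute hardware.

```swift
import Foundation

// Illustrative only: none of these types or endpoints are real Apple or
// OpenAI APIs. The sketch just contrasts handing a Siri request to an
// external web service with sending it to a model hosted on Apple's servers.

enum AssistantBackend {
    case externalWebService(URL)   // today's ChatGPT hand-off over the open web
    case privateCloudCompute(URL)  // hypothetical custom model hosted by Apple
}

struct AssistantRequest: Codable {
    let prompt: String
}

func send(_ prompt: String, to backend: AssistantBackend) async throws -> Data {
    let endpoint: URL
    switch backend {
    case .externalWebService(let url):
        // Data leaves Apple's infrastructure and is governed by the third party's terms.
        endpoint = url
    case .privateCloudCompute(let url):
        // Data stays on Apple-controlled servers under Apple's privacy guarantees.
        endpoint = url
    }

    var request = URLRequest(url: endpoint)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(AssistantRequest(prompt: prompt))

    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```

In both branches the network call looks the same to the caller; the difference is who operates the endpoint and under whose privacy rules the data is processed, which is exactly the distinction the rumored Private Cloud Compute deals would address.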
While the report suggests this is Apple's failure in AI and reads as the usual doom and gloom expected in Apple reporting, the reality is likely much less severe. Apple's willingness to work with other AI companies to ensure users have private and secure access to many popular models should be seen as a boon and a victory for users.
Apple Intelligence is running its own race, and Apple is at the forefront of ensuring private and secure AI can exist without siphoning critical and private user data. The Siri powered by Apple Intelligence is no doubt still coming, but it may have some third-party plugins from OpenAI, Anthropic, Google, and others that give users plenty of choice while remaining private.
Rumor Score: Possible
Comments
Second: ChatGPT already runs via Siri without getting data from Apple or the user. Private Cloud Compute is server-side, but independently verified not to be storing user data.
Third: How is giving users access to more private and secure models using Apple technology a bad thing for Apple?
If people want to hang out at the Nazi bar and use the Nazi-endorsed AI platform, by all means, but let's be realistic here. Apple isn't touching that politically charged garbage with a ten foot pole. I mean, how can anyone take it seriously when Musk says Grok is broken when it tells the truth, asserting he's going to have it fixed so it doesn't tell the truth again later?
I’ve been using Claude 4.0 with Xcode over the past week. I started a new Xcode project, then opened the folder in VSCode—and that was all I needed. I built a working MVP desktop app over the weekend for work.
I use ChatGPT every day but I do not trust ChatGPT to the same degree that I trust Apple. ChatGPT may in fact be an equal to Apple when it comes to privacy and security, but I have a history with Apple and so far they've lived up to my expectations. ChatGPT still has to prove itself to me and I have no evidence one way or the other because our relationship has only just started.
I absolutely do not need a version of Siri that is a replacement for ChatGPT or Copilot. I already have those tools in my toolbox, and I know they cannot go as deep into my hardware, software, services, data, activities, preferences, etc., as Apple can. I will keep using those tools for what they do best. I want Siri, or whatever it's called, to be able to bring automation, control, insight, and most importantly general "help and assistance" in areas that affect my life, my lifestyle, my health, my awareness, my safety, my time management, my entertainment, etc. Apple's devices and services are closer to me in many more facets of my life than any other "intelligent" agent or chatbot I can think of.
Apple needs to play to their strengths. The original Siri was never built to take on the broad scope of quality-of-life assistive tasks that can truly make a difference in our lives. The last thing I want Apple to do is to direct their efforts towards displacing the AI assistants that we now see as best-of-breed. That would be setting their sights way too low. If Apple integrates AI services like OpenAI/ChatGPT into their products, it would definitely fill out the sorts of things that take place at that level, but it wouldn't complete the much larger scope of what Apple is able to do based on their intimate understanding of their customers as non-artificial individuals and what they can do to improve their customers' lives.
Think Different doesn't work every time.
Clearly being behind other solutions flooding the market, clearly being late, clearly scrambling to offer something, etc.
My personal take was that Gruber (with his piece on Apple Intelligence) was simply giving a voice to some of the dissenters within Apple, so I imagined the divide to be fairly significant.
It's been clear that, at an executive level, the problems were significant. The car project might even have been a resource hog, and the AVP was also brewing during the same period.
Now we are hearing about entire teams demanding changes, which only confirms the depth of the problem.
The WWDC interviews were not inspiring at all when it came to promoting the state of Apple Intelligence.
China is proving very tough for Apple in terms of both product and AI.
The EU is an obvious cause for concern too.
Then of course there is the business impact of Trump and his wacky ideas.
It's not a doomsday scenario but it is serious.
For me, it is a toss-up between subscribing to ChatGPT and Perplexity. However, if Apple does take over or partner with Perplexity, then I may not subscribe to anything. I can wait and see how this pans out.
I haven't used Gemini or Deepseek and don't plan on trying either of them, unless Apple partners with them and adds a modicum of data security and privacy.
Apple should use all three LLMs, Google Gemini, OpenAI GPT, and Anthropic Claude, as backends for Apple Intelligence.
Each model has unique strengths that Apple Intelligence can make use of.
If Apple has to choose one, I would say go with Anthropic Claude.
"Claude excels in long-context understanding, ethical AI, and detailed analytical work." That is exactly what SIRI needs
Apple can be successful using inputs from other firms. They sold a lot of Macs using Intel processors, for example. For an older example, the enabling technology for the original iPod was a 1.8" hard drive from Toshiba.
Historically, the key to Apple's success has been the successful designs of *products* that combine those various inputs in a compelling way.
To my way of thinking, this is the real concern with Apple right now. They are having product design difficulties. I wouldn't be concerned if they designed a compelling new Siri that relies on someone else's LLM. I am concerned that they haven't designed a compelling new Siri.
Apple Silicon M5 coming out in the fall, or Siri being 20% smarter: which would you choose, and which one do you think the public would care more about right now? AI, which is currently half hype, half fad, and unprofitable for everyone other than Nvidia? Or Apple Silicon and the five operating systems/ecosystems being better than they were last year? Even without AI, the latter is probably more important long-term than Siri being like Robby the Robot in Forbidden Planet.
Apple's path is going to be longer because they are carrying the load of software and hardware in house. That's just the way it is, and there is no shortcut.
Well, the biggest problem with this statement is that they've got a deal saying that they won't access or use any data submitted by users via Apple. Second, I really don't understand what your statement is trying to convey... that OpenAI has some kind of magical superpowers thanks to its AI? That's not really what AI is or does.
Unless Apple fudges up the privacy, or your single anonymous query contains pretty direct personally identifiable information, they've got nothing.