The BBC has complained to Apple after a new feature in the tech giant’s iPhone generated false headlines about a high-profile murder in the US.
Launched in the UK earlier this week, Apple Intelligence uses artificial intelligence (AI) to summarize and group notifications.
This week, an AI-powered summary incorrectly reported that BBC News had published an article claiming that Luigi Mangione, who was arrested over the murder of health insurance CEO Brian Thompson in New York, had shot himself. He has not.
A BBC spokesperson said the company had contacted Apple “to raise this concern and resolve the issue.”
Apple declined to comment.
“BBC News is the world’s most trusted news outlet,” a BBC spokesperson added.
“It is essential to us that our audiences can trust any information or journalism published in our name, and that includes notifications.”
Aside from the false claim about Mangione, the rest of the notification was accurate, including summaries of reports on the overthrow of Syria’s Bashar al-Assad regime and an update on South Korean President Yoon Suk Yeol.
But it appears the BBC isn’t the only news publisher whose headlines were reported incorrectly due to Apple’s new AI technology.
On November 21, three New York Times articles on different topics were combined into one notification, with one section reading “Netanyahu Arrested,” referring to the Israeli Prime Minister.
It was not a report on Netanyahu’s arrest, but an inaccurate summary of a newspaper report that the International Criminal Court had issued an arrest warrant for him.
The mistake was pointed out on Bluesky by a journalist from the US investigative reporting website ProPublica.
The BBC has not been able to independently verify the screenshots, and The New York Times declined to comment to BBC News.
“Embarrassing” mistake
Apple says its AI-powered notification summaries reduce interruptions from a constant stream of notifications and help users prioritize the most important ones.
The feature is available only on certain devices: iPhones running iOS 18.1 or later on recent hardware (all iPhone 16 models, the iPhone 15 Pro, and the 15 Pro Max), as well as select iPads and Macs.
Petros Iosifidis, a professor of media policy at City, University of London, told BBC News that Apple’s mistake was “embarrassing”.
“I understand the pressure to be ahead of the market, but I’m surprised that Apple would put their name on such an obviously half-baked product,” he said.
“Yes, there are potential benefits, but the technology is not there yet, and there is a real risk of spreading disinformation.”
Grouped notifications are marked with a specific icon, and users can report concerns in the notification summary on their device. Apple did not say how many reports it had received.
Apple Intelligence does not only summarize articles from publishers; its summaries of emails and text messages have also reportedly been off-target at times.
And this isn’t the first time a big tech company has struggled with AI-generated summaries.
In May, the AI summary tool in Google’s internet search told some users looking for ways to make cheese stick to pizza to consider using “non-toxic glue”, an error the company described as an “isolated example”.
The search engine’s AI-generated responses also stated that geologists recommend humans eat one rock per day.