
AI chatbots distort and mislead when asked about current affairs, BBC finds

Most answers had ‘significant issues’ when researchers asked services to use broadcaster’s news articles as source

Leading artificial intelligence assistants create distortions, factual inaccuracies and misleading content in response to questions about news and current affairs, research has found.

More than half of the AI-generated answers provided by ChatGPT, Copilot, Gemini and Perplexity were judged to have “significant issues”, according to the study by the BBC.

Microsoft’s Copilot falsely stated that the French rape victim Gisèle Pelicot uncovered the crimes against her when she began having blackouts and memory loss, when in fact she found out about the crimes when police showed her videos they had confiscated from her husband’s devices.

ChatGPT said Ismail Haniyeh was part of Hamas’s leadership months after he was assassinated in Iran. It also falsely said Rishi Sunak and Nicola Sturgeon were still in office.

Gemini incorrectly stated: “The NHS advises people not to start vaping, and recommends that smokers who want to quit use other methods.”

Perplexity falsely stated the date of the TV presenter Michael Mosley’s death and misquoted a statement from the family of the One Direction singer Liam Payne after his death.


