© 2020 – 2023 AEA3 WEB | AEAƎ United Kingdom News

Google refuses to reinstate man’s account after he took medical images of son’s groin

Experts say case highlights well-known dangers of automated detection of child sexual abuse images

Google has refused to reinstate a man’s account after it wrongly flagged medical images he took of his son’s groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say the case illustrates an inevitable pitfall of applying a technological solution to a societal problem.

Experts have long warned about the limitations of automated child sexual abuse image detection systems, particularly as companies face regulatory and public pressure to help address the existence of sexual abuse material.


