
AI voice clone startup faces legal action from music industry

Jammable, a London-based AI startup that generates cloned voices of popular celebrities and characters, is facing legal action from the British music industry over alleged copyright infringements.

The British Phonographic Industry (BPI), a trade body for UK music, has accused Jammable, formerly Voicify AI, of infringing on the copyright of artists to create its voice clones.

BPI sent a legal letter outlining its grievances and stated that it is preparing legal action.

The use of copyrighted material to create or train AI models has become common industry practice, despite a lack of clarity over its legality.

A handful of lawsuits have been launched recently against AI companies for alleged copyright infringement, including from the New York Times, which has accused OpenAI of using its content without permission to train ChatGPT.

Getty Images is engaged in a similar lawsuit against London-based Stability AI.

“Music is precious to us all, and the human artistry that creates it must be valued, protected and rewarded,” Kiaron Whitehead, general counsel at the BPI, told UKTN.

“But increasingly it is being threatened by deepfake AI companies who are taking copyright works without permission, building big businesses that enrich their founders and shareholders, while ripping off artists’ talent and hard work.

“Jammable, and a growing number of others like them, are misusing AI technology by taking other people’s creativity without permission and making fake content. In so doing, they are endangering the future success of British musicians and their music.”

UKTN has contacted Jammable for comment.

Deepfake fears

The creation of AI clones of real people is among the more controversial use cases of the technology. Commonly referred to as deepfakes, the practice has the potential to spread misinformation as the technology becomes more and more lifelike.

There has been limited government action over deepfakes in the UK, save for questions being raised in the House of Lords last month.

Scientists for Labour, an affiliate group of the opposition party, has compiled a report detailing its recommendations to combat deepfake misinformation.

The recommendations include a requirement for businesses generating AI content to demonstrate to Ofcom that their content is identifiable as AI-made.

The report also called for the criminalisation of the creation and sharing of deepfakes where the intent is to deceive. The implementation of AI labels by content hosting sites, as well as by the tech companies generating the content, was also recommended.

Labour has already had high-profile encounters with deepfake misinformation. Fake audio clips of party leader Sir Keir Starmer and London’s Labour mayor, Sadiq Khan, have both recently circulated on social media.

Read more: Most generative AI models likely ‘illegal’, says former Stability VP

