Amazon’s Kindle Direct Publishing (KDP) system, which enables authors to self-publish ebooks and printed books, has made it easy for bookspammers to release dozens of titles in a day. Amazon could not say how many books it prevents from being published or how many were taken down. In August, Jane Friedman, who writes about publishing, forced the company to remove five bogus titles published under her name that appeared to be AI-generated.
Since Alice v. CLS Bank, the high likelihood of an AI invention being found to be directed to an abstract idea has created the worrisome possibility of precluding from patentability an entire field capable of generating foundational technology. Indeed, it is hard to fathom a world where things like the Star Trek computer, with its endless knowledge base and control capabilities, could be patent ineligible.
A clearer picture of how studios intend to incorporate artificial intelligence into the screenplay production apparatus is coming into view. The agreement requires writers to obtain consent if they want to use generative AI and allows studios to “reject a use of GAI that could adversely affect the copyrightability or exploitation of the work.”
The authors argue that OpenAI’s detailed interpretation of fair use in an AI context is irrelevant, at least at this stage. Fair use is a defense that is typically not used to dismiss copyright infringement claims before they’re properly argued. “Fair use is an affirmative defense, and is inappropriate to resolve on a motion to dismiss,” the authors said in their reply brief. Given that, OpenAI’s arguments regarding fair use are “wholly misplaced.”
The decision by U.S. Circuit Judge Stephanos Bibas sets the stage for what could be one of the first trials related to the unauthorized use of data to train AI systems. Tech companies including Meta Platforms, Stability AI and Microsoft-backed OpenAI are also facing lawsuits from authors, visual artists and other copyright owners over the use of their work to train the companies’ generative AI software.
The new Splits: Priority Payouts feature will allow artists to prioritize the order in which collaborators are paid. It grants creators the ability to allocate and distribute revenues for each track according to their preferences. Through the Splits program, self-releasing artists can now designate a specific sum for a collaborator, guaranteeing full payment before percentage-based splits are disbursed to other contributors, TuneCore said in a press release.
A group of major music publishers insists that Elon Musk’s X is liable for the widespread music piracy that takes place on its platform. X asked the court to dismiss their lawsuit, but the music companies say the social media platform is clearly in the wrong. With roughly $250 million in damages on the line, this legal battle should run its course, they argue.
Founding partner Ted Kalo spoke at last week’s Ivors Academy Global Creators Summit in London, explaining more about the campaign, its seven core principles for AI applications, and how he sees the regulatory environment. He certainly didn’t pull any punches. “The stakes are high and things are moving quickly. And there’s something that’s distinctly Orwellian that I think is going on with generative AI,” said Kalo in his introduction.
ASCAP has embraced new and emerging advances in technology, and we have the capacity and infrastructure to manage them at scale. But it has remained painfully clear that any new technology needs to respect existing copyright law. Music creators are concerned about the threat to their livelihood, and eight out of 10 believe A.I. companies need better regulation.
Stephen Fry recently revealed at the CogX Festival that his voice from the “Harry Potter” audiobooks was taken by AI software and replicated without his consent, much to the horror of both him and his agents. He said the discovery of AI mimicking his voice led him to warn his agents, “You ain’t seen nothing yet. This is audio. It won’t be long until full deepfake videos are just as convincing.”