Federal lawmakers have introduced the bipartisan “COPIED Act” to regulate unauthorized AI deepfakes and training, with broad support from the RIAA and other industry groups.
Overview
Senators from both sides of the aisle recently introduced the “Content Origin Protection and Integrity from Edited and Deepfaked Media Act,” commonly known as the COPIED Act.
Endorsed by the Recording Industry Association of America (RIAA), the Artist Rights Alliance, and other organizations, this measure arrives about ten weeks after Warner Music CEO Robert Kyncl testified before Congress in support of federal AI regulations.
Background and Legislative Context
To date, AI regulations have been proposed through the No Fakes Act and the more comprehensive No AI Fraud Act. Despite slow legislative progress, unauthorized soundalike works continue to proliferate.
Additionally, several leading generative AI systems continue to claim the right to train on protected media without authorization, a core concern for rightsholders.
Key Provisions of the COPIED Act
The COPIED Act, introduced by Senators Marsha Blackburn, Martin Heinrich, and Maria Cantwell, addresses both deepfake and training issues.
The 18-page bill calls for establishing a public-private partnership to develop standards for determining content’s origin and whether it’s synthetic or synthetically modified with AI. Covered works include music, images, audio, video, text, and multimodal content, reflecting the measure’s broad support across industries.
The National Institute of Standards and Technology (NIST) would lead these efforts, with input from the Register of Copyrights and the Director of the U.S. Patent and Trademark Office (USPTO). The legislation outlines voluntary, consensus-based standards and best practices for watermarking and automatic detection of synthetic content and the use of data to train AI systems.
Implications for AI Systems and Platforms
The bill mandates that generative AI systems enable users to label media outputs as synthetic and attach content provenance information, which documents the origin and history of digital content. Major search engines, social media platforms, and video-sharing sites generating at least $50 million annually or with over 25 million monthly users would be barred from tampering with this provenance information.
Most significantly, the COPIED Act would prohibit generative AI systems from knowingly training on any media carrying provenance details without permission. The only exception would be where a platform obtained express, informed consent from the content owner and complied with any applicable terms of use.
Enforcement and Legal Recourse
The Federal Trade Commission (FTC), state attorneys general, and rightsholders would be empowered to sue for alleged violations under the act.
However, the content provenance requirements would not take effect until two years after the law’s enactment, and litigation would need to commence within four years of discovering the alleged violation.
Industry Reactions and Support
Various organizations representing publishers and artists have expressed approval of the bill’s introduction, including SAG-AFTRA, the News Media Alliance, and the Artist Rights Alliance.
The Recording Academy’s Todd Dupler praised the senators’ commitment to ethical AI use, while NMPA boss David Israelite highlighted the act’s role in ensuring clear identification of AI-generated content.
RIAA chief Mitch Glazier emphasized the importance of provenance requirements for accountability and enforcement of creators’ rights.
Why This Matters
For music producers, the COPIED Act represents a significant step towards protecting their intellectual property from unauthorized use by AI systems.
By establishing clear guidelines and legal recourse, the act aims to safeguard the creative works of musicians and other content creators.
This legislation could potentially reshape the landscape of AI-generated content, ensuring that artists receive proper recognition and compensation for their work.
As the AI industry continues to evolve, these regulations will be crucial in maintaining the integrity and originality of creative content.
For more information on AI regulations and their impact on the music industry, check out related articles on the RIAA website and Music Ally.
isn’t it kinda hard to draw the line on what counts as unauthorized use tho? like where do we put creativity and transformation in this
Really insightful piece, Daniel. It’s about time legislation caught up with technology. Deepfakes are a real threat to intellectual property.
This act seems like a step in the right direction for artists. Protecting our work from being repurposed without consent is crucial.
Sounds great in theory, but I bet it’ll be tough to enforce. People find a way around everything.
I understand the concerns but what about the implications for AI development? We need to balance innovation with protection.
Exactly my thought. There’s always a tipping point where regulation starts stifling creativity. Have to be careful.
It’s interesting to see the FTC given this kind of power. Enforcement will be crucial to the act’s success. Details on the implementation are necessary.
wait till they make a deepfake of the law itself haha imagine that
Laws like these always sound good until they hit the real world. The devil’s in the details, and I’m not convinced they’ve thought this through.
Optimism’s dead, huh? Maybe give it a chance before writing it off?
imagine making a deepfake of a deepfake law, lawception lol