The Sexual Violence Prevention Association (SVPA) commends the bipartisan House Task Force on Artificial Intelligence (AI) for its work this Congress and for publishing the 2024 Bipartisan AI Task Force Report. The comprehensive report provides critical insights into artificial intelligence, with the stated goal of serving as a blueprint for future legislative action to address advances in AI technologies.
The Problem
The report acknowledges that while tools like generative AI hold potential for innovation, their misuse poses immediate and devastating harms.
“One of the most pervasive harms from synthetic content generated by contemporary AI systems is the creation and distribution of nonconsensual intimate images (NCII),” (p.139)
“Between 90 and 95% of online deepfake videos are non-consensually generated pornography, 90% of which target a female victim.” (p.130)
“There is growing evidence that perpetrators of deepfake pornography are increasingly using these tools to exploit victims for extortion, blackmail, and public humiliation.” (p.160)
“Harmful uses of this technology are not limited to adults. In one month in 2023, the Internet Watch Foundation found that 3,000 AI-generated images of illegal CSAM were posted to a dark web forum, with the vast majority of these being realistic pseudo-photographs. More than 99% of these images were of girls, primarily aged 7-13.” (p.140)
Need for Legislation
The Task Force report finds that there are “notable gaps” in state and federal legislation addressing non-consensual explicit materials (NCEMs). NCEMs, formerly known as deepfake pornography, are not adequately covered by existing state or federal laws.
“The effectiveness of state laws is limited by jurisdictional constraints and inconsistencies among statutes, creating challenges for individuals seeking redress,” (p.131).
The Task Force report cites findings from the United States Copyright Office (USCO): “USCO found that state laws are inconsistent and insufficient to address the problems exposed by AI. The Office also found existing federal laws to be insufficient,” (p.131).
“The USCO concluded in a recent report on digital replicas, that ‘new federal legislation is urgently needed’” (p.131).
The Task Force report explicitly recommends the passage of federal legislation to “appropriately counter the growing harm of AI-created deepfakes,” (p.136). This aligns with the SVPA’s work on the DEFIANCE Act and the Take It Down Act.
DEFIANCE Act
The Task Force report demonstrates the deeply harmful impact of NCEM on victims and the lack of legal support available to them, supporting the need for a federal civil right of action:
“Victims of non consensual intimate content face significant risks, including emotional trauma, reputational damage, extortion, and long-term social isolation.” (p.160)
“Congress should investigate whether victims have sufficient ability to seek redress for harms from digital content, such as NCII, from those who create or distribute these forgeries, as well as legal barriers preventing such redress. Congress should consider other redress mechanisms for victims, such as civil penalties for cases involving AI fraud and NCII.” (p.154)
These findings reinforce the need for the DEFIANCE Act, a federal bill that would empower victims with a civil right of action to seek justice against perpetrators of NCEM.
Take It Down Act
The Task Force report also supports the need for stronger regulatory tools for NCEM, including holding platforms accountable for moderating and addressing harmful AI-generated content:
“NCII can be used for harms such as emotional and reputational harm, extortion, silencing political participation, and other malicious or criminal activities.” (p.140)
“Technical solutions to various content authentication challenges are stymied by a lack of implementation by online platforms.” (p.171)
“Watermarks can be removed, faked, or rendered ineffective without undue effort, making moderation tools essential for content integrity.” (p.152)
These findings support the need for the Take It Down Act, which criminalizes the publication of NCEMs and holds platforms accountable for failing to remove this harmful content.
Call to Action
“As a victim of deepfake pornography, I am deeply appreciative of the AI Taskforce’s work and bold recommendations for legislation. We urgently need the DEFIANCE Act and the Take It Down Act to prevent these harms and to empower myself, and countless victims across the country, with recourse.” -Omny Miranda Martone, Founder and CEO of the Sexual Violence Prevention Association (SVPA)
The SVPA strongly supports the Task Force’s recommendations. In alignment with these recommendations, we urge Congress to pass the DEFIANCE Act and the Take It Down Act to ensure protections for victims and prevent further AI-enabled harms.