U.S. Senator Cynthia Lummis (R-WY) joined Senator Ted Cruz (R-TX) in introducing the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act to protect and empower victims of non-consensual intimate image abuse, also known as “revenge pornography.” The bill would criminalize the publication of non-consensual intimate imagery (NCII), including AI-generated NCII (or “deepfake pornography”), and require social media and similar websites to have procedures in place to remove such content upon notification from a victim.
“Artificial intelligence is the future, and the way this technology can be used to improve lives around the globe is truly infinite. With any new industry comes the need to ensure it is not being used by bad actors, and AI is no different,” said Lummis. “I am proud to join my colleagues in introducing legislation to protect people against malicious and dangerous deepfakes that also allows innovation to continue in the United States.”
By requiring websites that host user-generated content, including social media sites, to maintain a notice-and-takedown process, the TAKE IT DOWN Act will ensure that victims whose images are published online are protected from being retraumatized again and again.
The TAKE IT DOWN Act would protect and empower victims of real and deepfake NCII while respecting speech by:
- Criminalizing the publication of NCII in interstate commerce. The bill makes it unlawful for a person to knowingly publish NCII on social media and other online platforms. NCII is defined to include realistic, computer-generated pornographic images and videos that depict identifiable, real people. The bill also clarifies that a victim consenting to the creation of an authentic image does not mean that the victim has consented to its publication.
- Protecting good faith efforts to assist victims. The bill permits the good faith disclosure of NCII, such as to law enforcement, in narrow cases.
- Requiring websites to take down NCII upon notice from the victim. Social media and other websites would be required to have procedures in place to remove NCII within 48 hours of a valid request from a victim. Websites must also make reasonable efforts to remove copies of the images. The FTC is charged with enforcing this section.
- Protecting lawful speech. The bill is narrowly tailored to criminalize knowingly publishing NCII without chilling lawful speech. The bill conforms to current First Amendment jurisprudence by requiring that computer-generated NCII meet a “reasonable person” test for appearing to realistically depict an individual.