
Cruz’s ‘Take It Down Act’ would force social media to pull deepfake intimate images

Ray Bogan Political Correspondent

Elliston Berry was a 14-year-old high schooler when another student used AI to create deepfake sexually explicit images of her and other girls, then posted them on Snapchat. It ultimately took a phone call from Sen. Ted Cruz, R-Texas, to get the images removed from the social media app. 

“I dreaded school and was anxious to even step foot on campus,” Elliston said during a press conference on Capitol Hill Tuesday, June 18. “Although the student left school, which allowed me to feel more confident about attending classes, there will always be a fear that these photos will resurface.”


According to Elliston’s mother, the perpetrator acted with malice and intended to “ruin” the girls. He was charged with sale, distribution, and display of harmful material to a minor and given probation. 

“When he turns 18, his record will be expunged,” Anna McAdams, Elliston’s mother, said. “He will walk away unscathed. However, our girls will forever live in fear that when they apply for a job or college these pictures might resurface. There needs to be consequences for what he did.”

Cruz introduced the Take It Down Act to create harsher penalties for perpetrators and for tech companies that fail to remove the content.

Those convicted of creating and posting the intimate material could be sentenced to two years in prison if it’s an image of an adult and three years if it’s an image of a child.

The bill would also require social media companies to remove the images within 48 hours of receiving a request from a victim. If a company fails to make a good-faith effort, that failure can be considered an unfair or deceptive act under the Federal Trade Commission Act, which is enforced by the FTC.

“It can be maddening dealing with big tech and trying to get these images, these fake images of your child, taken down,” Cruz said. “And Big Tech, over and over again, has demonstrated an arrogance, an imperiousness, a lack of accountability.”

Cruz said the bill is partially modeled on U.S. copyright law, because when someone posts copyrighted material on a platform, it is taken down almost immediately and the user can be subject to a lifetime ban.

“If you upload a clip from Disney or a pop song, it’s gone in a heartbeat, risking an immediate ban from the platform,” Dawn Hawkins, CEO of the National Center on Sexual Exploitation, said. “Yet if you upload a rape, or hide a camera in a locker room, or create a deepfake pornographic image, you can do so without scrutiny.”

States have their own laws protecting people from non-consensual intimate imagery, but only 20 states have laws that specifically address deepfake images. This bill would cover deepfakes at the national level and add a requirement that social platforms remove the content.

“Think back to what it’s like to be in junior high and high school, and that insecurity, that discomfort that we all have in our own skin, and then ask you to consider what these girls have been through,” Sen. Cynthia Lummis, R-Wyo., said. 

Multiple bills addressing non-consensual intimate images have been brought forward for consideration. It is unclear which, if any, have enough support to become law. Cruz’s bill has 12 bipartisan co-sponsors. Whichever bill ultimately moves forward will likely incorporate parts of each proposal and become part of a larger package to protect kids online.


[RAY BOGAN]

Elliston Berry was a 14-year-old high schooler when another student used AI to create deepfake sexually explicit images of her and other girls, then posted them on Snapchat.

Elliston Berry, high school student: “I dreaded school and was anxious to even step foot on campus. Although the student left school, which allowed me to feel more confident about attending classes, there will always be a fear that these photos will resurface.”

[RAY BOGAN]

According to Elliston’s mother, the perpetrator acted with malice and intended to “ruin” the girls. He was charged with sale, distribution, and display of harmful material to a minor and given probation. 

Anna McAdams, Elliston’s mother: “When he turns 18, his record will be expunged. He will walk away unscathed. However, our girls will forever live in fear that when they apply for a job or college, these pictures might resurface. There needs to be consequences for what he did.”

[RAY BOGAN]

Senator Ted Cruz just introduced the Take It Down Act to create harsher penalties for perpetrators and for tech companies that fail to remove the content.

Those convicted of creating and posting the intimate material could be sentenced to two years in prison if it’s an image of an adult and three years if it’s an image of a child. The bill would also require social media companies to remove the images within 48 hours of receiving a request from a victim. If a company fails to make a good-faith effort, that failure can be considered an unfair or deceptive act under the Federal Trade Commission Act, which is enforced by the FTC.

Sen. Ted Cruz, R-TX: “It can be maddening dealing with Big Tech and trying to get these images, these fake images of your child, taken down. And Big Tech, over and over again, has demonstrated an arrogance and imperiousness, a lack of accountability.”

[RAY BOGAN]

Cruz says the bill is partially modeled on U.S. copyright law: when someone posts copyrighted material on a platform, it is taken down almost immediately, and the user can be subject to a lifetime ban.

Dawn Hawkins, National Center on Sexual Exploitation: “If you upload a clip from Disney or a pop song, it’s gone in a heartbeat, risking an immediate ban from the platform. Yet if you upload a rape, or hide a camera in a locker room, or create a deepfake pornographic image, you can do so without scrutiny.”

[RAY BOGAN]

States have their own laws protecting people from non-consensual intimate imagery, but only 20 states specifically cover deepfake images. This bill would cover deepfakes at the national level and add a requirement that social platforms remove the content.

Sen. Cynthia Lummis, R-WY: “Think back to what it’s like to be in junior high and high school, and that insecurity, that discomfort that we all have in our own skin, and then ask you to consider what these girls have been through.”

[RAY BOGAN]

Straight Arrow News regularly speaks with members of Congress about their efforts to regulate Big Tech, so download the Straight Arrow News app for coverage straight from the Capitol.