[RAY BOGAN]
Elliston Berry was a 14-year-old high schooler when another student used AI to create deepfake sexually explicit images of her and other girls, then posted them on Snapchat.
Elliston Berry, high school student: “I dreaded school and was anxious to even step foot on campus. Although the student left school, which allowed me to feel more confident about attending classes, there will always be a fear that these photos will resurface.”
[RAY BOGAN]
According to Elliston’s mother, the perpetrator acted with malice and intended to “ruin” the girls. He was charged with sale, distribution, and display of harmful material to a minor and given probation.
Anna McAdams, Elliston’s mother: “When he turns 18, his record will be expunged. He will walk away unscathed. However, our girls will forever live in fear when they apply for a job or college. There needs to be consequences for what he did.”
[RAY BOGAN]
Senator Ted Cruz just introduced the Take It Down Act to create harsher penalties for perpetrators and for tech companies that don’t remove the content.
Those convicted of creating and posting the intimate material could be sentenced to two years in prison if the image depicts an adult, or three years if it depicts a child. The bill would also require social media companies to remove the images within 48 hours of receiving a request from a victim. If a company fails to make a good-faith effort, that failure can be treated as an unfair or deceptive act under the Federal Trade Commission Act, which is enforced by the FTC.
Sen. Ted Cruz, R-TX: “It can be maddening dealing with big tech and trying to get these images, these fake images of your child, taken down. And big tech, over and over again, has demonstrated an arrogance and imperiousness, a lack of accountability.”
[RAY BOGAN]
Cruz says the law is partially modeled on U.S. copyright law: when someone posts copyrighted material on a platform, it is taken down almost immediately, and the user can be subject to a lifetime ban.
Dawn Hawkins, National Center on Sexual Exploitation: “If you upload a clip from Disney or a pop song, it’s gone in a heartbeat, risking an immediate ban from the platform. Yet if you upload a rape, or hide a camera in a locker room, or create a deepfake pornographic image, you can do so without scrutiny.”
[RAY BOGAN]
States have their own laws protecting people from non-consensual intimate imagery, but only 20 states cover deepfake images. This bill would cover deepfakes at the national level and add the requirement that social platforms remove the content.
Sen. Cynthia Lummis, R-WY: “Think back to what it’s like to be in junior high and high school, and that insecurity, that discomfort that we all have in our own skin, and then I ask you to consider what these girls have been through.”
[RAY BOGAN]
Straight Arrow News regularly speaks with members of Congress about their efforts to regulate big tech, so download the Straight Arrow News app for coverage straight from the Capitol.