Last October, Elliston Berry, then 14 years old, woke up to a horrifying discovery. Her phone was flooded with calls and messages telling her that someone had shared fabricated nude images of her on Snapchat and other social media platforms. Berry, who lives in Texas, described her fear and anxiety as the news spread through her entire school. The images were deepfakes: artificially generated media that can look strikingly realistic. Deepfakes have become increasingly common in recent years, often used to impersonate prominent public figures or to create fake explicit content, but they can also inflict serious harm on ordinary people.
Berry is now urging lawmakers to establish legal penalties for those who create and distribute deepfake images, in the hope of protecting future victims. When she discovered what had happened, Berry immediately confided in her parents. Her mother, Anna McAdams, knew right away that the images were fake, yet spent eight months trying to get them removed from Snapchat.
Although the deepfakes of Berry were eventually taken down, McAdams told CNN she was frustrated by the lack of consequences for the classmate who distributed them. The classmate, she said, would receive only minimal probation and have their record expunged at 18, so the full impact of their actions may never be known.

This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar, and several colleagues introduced a bill that would require social media companies to remove deepfake pornography within two days of receiving a report. Known as the Take It Down Act, the legislation would also make distributing such images a felony. Perpetrators targeting adults could face up to two years in prison, while those targeting children could face harsher penalties.