Survivor written testimony for Jan. 31 Senate Judiciary Committee hearing


United States Senate Committee on the Judiciary

WRITTEN TESTIMONY BY Christine Almadjian, End Online Sexual Exploitation and Abuse of Children (OSEAC) Coalition

For a Hearing Entitled: Big Tech and the Online Child Sexual Exploitation Crisis

Wednesday, January 31st, 2024

Dear Chairman Durbin, Ranking Member Graham, and esteemed members of the Senate Judiciary Committee,

My name is Christine Almadjian. I’m a graduate student at Columbia University and a member of the End Online Sexual Exploitation and Abuse of Children Coalition’s Survivors’ Council. It is with great honor that I come before you to share my thoughts and experiences, mindful of the absolute urgency this issue demands of these companies.

I may be 22, but I distinctly remember life at 11. As middle schoolers, many of us were entering a new realm of interaction as access to smartphones grew exponentially. Alongside the group chats with friends and interactive game apps came the hustle and bustle of social media. Snapchat, Twitter (now X), and Meta’s apps were among the most popular. I remember my friends and I sitting around one another at lunch, retweeting posts about One Direction, and taking selfies with filters to send one another on Snapchat.

What we initially considered fun took very dark turns for many of us. As children, we did not have the foresight to recognize dangerous situations the way we do now, or at least the way I do. As we were invited into group chats and private messaging channels on these apps, exposure to predators became an inseparable part of our social media experience.

I can personally recall the predators who forced their way into my online existence, threatening my well-being and exploiting my inability to discern danger from safety. These experiences became incredibly common, almost an unfortunate defining feature of our early years navigating social media. Although we tried to take action to defend ourselves, it was no use. Dangerous figures remained in our vicinity, whether through duplicate accounts or new ones they quickly created. As I grew older, I slowly began to understand what had happened to me and to so many of my peers. As children, we had the right to a safe and fun social media experience. All we wanted to do was enjoy this new interactive life with one another.

When I browse social media today, it is disheartening to see the same, and arguably heightened, behavior directed at other minors on these same platforms. Time and time again, concerns have been raised about the safety of children on these platforms, to no avail. Instances of child sexual abuse material (CSAM) circulating on these very apps, the luring and grooming of children, and sextortion continue to rise exponentially. So how can these companies claim they have taken further measures and action to prevent this?

Discord: Your head of Trust and Safety, John Redgrave, said in an October 2023 interview that the technologies used to detect CSAM on your platform “could be extended to video with enough effort.” I demand that this effort become more than just words. As the spread of CSAM grows, it is imperative that these detection technologies also apply to videos, not just still images.

Meta: An unredacted version of the New Mexico lawsuit gave the public new information about Meta’s knowledge of just how extensive exploitative material on its platforms was. It also acknowledged the popularity of your apps with minors as young as six years old, and how Meta “leveraged that” to make Facebook Messenger the most popular messaging app for those audiences by 2022. Yet Meta employees also flagged “sex talks” as 38 times more common on Facebook Messenger. With this in mind, the platform must consider the popularity of its apps with children and commit to more than just acknowledging that these issues exist.

Snapchat: Snapchat is one of the platforms most frequently named in connection with CSAM. In 2021, Snapchat’s Vice President of Global Public Policy, Jennifer Stout, claimed that “Snapchat was the antidote to social media” because it “focuses on connecting people who already know each other” and “focuses on privacy by making photos and messages delete by default.” This is incredibly flawed logic when it comes to the rampant sexual exploitation of children and predators’ access to children on Snapchat. It is incredibly easy, then, for predators to establish and sustain relationships with children, and to send explicit content covertly, thanks to that very delete-by-default design. So, how will Snapchat respond to this increased risk?

TikTok: TikTok claims to have a “zero tolerance policy” for CSAM on its platform. But according to numerous Forbes investigations, TikTok’s post-in-private accounts make it easy for predators to meet children and send them sexually explicit images. Other weaknesses, like easy workarounds for banned accounts, expose the holes in this zero-tolerance policy. As more minors gain access to TikTok, how will the platform further advance its efforts to safeguard children from predators? TikTok must implement better technology to monitor accounts, photos, and video content effectively.

X (Twitter): X’s own 2022 transparency reporting shows a company unwilling to live up to its own transparency mandates. This coincides with problems like a shrinking trust and safety team. In the fall of 2022, after Elon Musk acquired the company and stated that “removing child exploitation was his number one priority,” the team responsible for reviewing and reporting CSAM on X was cut from 20 people to 10. Musk also reportedly disbanded X’s Trust and Safety Council, a group of volunteers who advised the company on online safety. X’s use of PhotoDNA, the technology that reportedly detected and flagged CSAM, has also proven ineffective, as accounts flagged for CSAM were still up and running. Stanford’s Internet Observatory likewise reported 128 accounts advertising the sale of self-generated CSAM, and although most were taken down, “a reported 22 of the 128 were still active over a month later.” Will X commit to furthering its efforts to overcome these hurdles and make it easier to detect, report, and take down these dangerous accounts?

I ask that these platforms further evaluate the effectiveness of their current technology for detecting CSAM and other predatory behavior. The present safety features are simply not sufficient. I also ask that they listen to survivors and a deeply concerned society. This request can no longer be merely “considered” and once again neglected. Social media will only continue to advance and grow, and it is your absolute responsibility to ensure the safety and security of your users. I do not want another child’s experience on social media to be marked by a lack of safety and a risk of exploitation. You have the capability to make these changes efficiently, and I demand that you do so to ensure a safer, more equitable future for social media users, especially children and survivors.

Thank you,

Christine Victoria Almadjian
