Social media CEOs testify on child safety before the Senate Judiciary Committee: Key Highlights

Lisa Battaglia
5 min read · Feb 1, 2024
CEOs at Discord, Snapchat, TikTok, X, and Meta at the US Senate Judiciary Committee Hearing. Alex Wong/Getty

Today marked the first time a group of big tech CEOs testified before Congress on online child sexual exploitation. Over more than three hours, senators grilled the CEOs of Discord, X, Snap Inc., Meta, and TikTok on their efforts to combat child exploitation on their platforms.

Below, I outline key highlights from the hearing and my predictions on what changes, if any, might come from it.

  1. Senator Durbin opened by calling out Discord, Snap Inc., and X for attending not voluntarily but under subpoena; TikTok and Meta appeared voluntarily. He explained that US Marshals had to deliver the subpoenas personally to X CEO Linda Yaccarino and Discord CEO Jason Citron after the companies declined to accept the summons.
  2. Senators made their opinions on Section 230 clear, advocating for its repeal because it shields social media companies from liability when they host CSAM (child sexual abuse material). While senators asked for the CEOs’ support in repealing Section 230, the executives were reluctant to agree, citing the law’s nuance.
  3. On that same note, senators continued to request support from the CEOs for legislation like the SHIELD Act, STOP CSAM Act, and EARN IT Act, which all aim to hold social media companies liable for hosting and distributing CSAM. The CEOs (with the exception of X’s Linda Yaccarino, who voiced support for the EARN IT Act, and Snap’s Evan Spiegel, who has backed the Kids Online Safety Act) were reluctant to support the legislation, citing the complications of implementing those acts.
  4. Speaking of complications, the senators seemed to have a limited understanding of how these social media companies work. They repeatedly asked the CEOs whether they support the legislation, whom they fired, and why they have yet to compensate victims’ families, without allowing the executives to explain the nuances and ramifications of that type of liability. Senators also kept noting the companies’ lack of action without acknowledging their own.
  5. Each tech CEO noted their personal connection to this issue as a parent in their opening statements.
  6. Senator Durbin called out X for implementing changes in the last week in anticipation of the hearing. Last week, X announced plans to stand up a new content moderation team in Austin, TX. X CEO Linda Yaccarino repeatedly emphasized that the company is 14 months old to justify these rapid changes (although its predecessor, Twitter, has been around for much longer).
  7. Speaking of hiring changes, the executives emphasized that they continue to spend billions of dollars on platform safety, despite massive layoffs across the industry. TikTok intends to invest $2 billion in trust & safety this year and employs 40,000 trust & safety professionals around the globe. Senator Welch asked whether the drastic number of layoffs in tech has affected the companies’ prioritization of trust & safety. All of the CEOs claimed their investment has been consistent (even though layoffs have hit trust & safety teams the hardest).
  8. Many senators characterized social media itself as the tool that kills children (rather than the bad actors who misuse it). They drew comparisons to regulating guns and cigarettes, and to recalling Boeing 737s after a door plug blew off the plane during an Alaska Airlines flight. Senators argued that social media should face the same types of restrictions and regulations as any other industry.
  9. Speaking of fairness, the senators’ questions were primarily directed at Meta CEO Mark Zuckerberg and TikTok CEO Shou Zi Chew. X maintained that it does not have a direct line of business for children, emphasizing that only 1% of its users are under 18. Very few questions were directed at Snap CEO Evan Spiegel, despite Senator Graham stating that Snapchat is the go-to tool for CSAM.
  10. Many senators used this opportunity to raise separate issues like China’s access to US data through TikTok, TikTok’s promotion of pro-Hamas content, and past incidents of children obtaining fentanyl-laced drugs on Snapchat. Snap emphasized its commitment to working with law enforcement and the DEA under the Cooper Davis Act to eliminate drug exchanges on its platform. Senator Cornyn focused his line of questioning on the national security risk of TikTok, Senator Hawley kept asking Zuckerberg whom he fired and whether he apologized to or compensated the victims’ families, and Senator Cruz did not really let anyone answer any of his questions at all.
  11. Senator Tillis acknowledged that the responsibility to follow through on legislation lies with Congress. He also acknowledged that these tech leaders did not intend for this to happen when they built these apps in their college dorm rooms, but he hopes they continue to improve safety every waking minute of their lives. He also worries that if these social media companies are eliminated, bad actors will turn to even worse platforms that are not regulated at all.
  12. Congress urged the tech CEOs to provide clear explanations and data on how effective their safety tools actually are. Senator Padilla asked how parents can be made aware of these tools, and X’s Yaccarino advocated for educating law enforcement; both exchanges demonstrated the complexity of enforcing safety tools.

Overall, Congress and big tech agree online child exploitation is rampant and needs to be managed. They cannot, however, agree on the right actions to take next. Congress voiced simple solutions: repeal Section 230, hold platforms liable for hosting any CSAM, and open the court system to allow families to sue social media companies.

Social media companies, on the other hand, argue those changes would create even more problems and that the solution is more nuanced than what the legislation proposes. They also claimed they were taking as much action as they possibly could; Congress said they clearly are not doing enough.

With X implementing a new child safety center in Texas after being subpoenaed for this testimony, the Senate Judiciary Committee hearing was helpful, if only temporarily, in forcing X to increase its safety efforts. So maybe Congress can be the one to hold big tech accountable, but the question remains: who is going to hold Congress accountable?

Let me know your thoughts on this hearing in the comments below.

For communications consulting and marketing services for trust & safety and tech policy companies, email me at lisa@lisabtag.com

For more tech policy content, follow me here on Medium or subscribe to my podcast The Elevated Podcast wherever you listen to your podcasts.
