Big Tech was in the crosshairs yet again Wednesday as a handful of executives fielded blistering questions in a Senate hearing about the impact of their products on children.
Facing the Senate Judiciary Committee, the chief executives of Meta Platforms Inc., TikTok, Discord, Snap Inc. and X — Mark Zuckerberg, Shou Zi Chew, Jason Citron, Evan Spiegel and Linda Yaccarino, respectively — testified.
“Discord has been used to groom, abduct and abuse children. Meta’s Instagram helped connect and promote a network of pedophiles. Snapchat’s disappearing messages have been co-opted by criminals who financially sextort young victims,” Sen. Dick Durbin, a Democrat from Illinois and the chair of the committee, said in a scorching opening statement. “TikTok has become a ‘platform of choice’ for predators to access, engage and groom children for abuse, and the prevalence of [child sexual abuse material] on X has grown as the company has gutted its trust and safety workforce.”
Durbin, who pointed out that the companies had rolled out several tech fixes only in the days leading up to Wednesday’s hearing, confronted executives about their platforms’ harmful effects.
A handful of senators — led by Amy Klobuchar, D-Minn., and Lindsey Graham, R-S.C. — pushed to expose social-media companies to legal action from victims, allowing them to be sued for damages in court.
Several victims and their families appeared in a brief video and put the blame squarely on the tech companies. The audience was full of families and parents whose children died because of bullying and drug sales on social-media platforms; hundreds of them had sent a letter pressing Congress to act urgently.
In 2013, the National Center for Missing & Exploited Children received approximately 1,380 cyber tips per day related to child sexual abuse material. By 2023, the number had grown to 100,000 reports per day.
“Big Tech executives have been making the same hollow promises for years as kids are suffering horrific harms online,” the senators said in a joint statement. “Without real and enforceable reforms, social-media companies will only continue publicly pretending to care about young people’s safety while privately prioritizing profits.”
At one point, when pressed by Missouri Republican Sen. Josh Hawley, Zuckerberg turned around and apologized to the families of victims.
“I’m sorry for everything you have all been through. No one should go through the things that your families have suffered,” Zuckerberg said, adding that Meta is working on ways to better protect children.
Several bills are circulating in Congress to address the issue. The bipartisan Kids Online Safety Act, co-authored by Sens. Richard Blumenthal, a Connecticut Democrat, and Marsha Blackburn, a Tennessee Republican, is aimed at providing children and parents with better tools to protect themselves online, holding Big Tech accountable for harms and providing transparency into black-box algorithms. The bill has the support of nearly half of the U.S. Senate.
“Mr. Zuckerberg, do you believe you have a constitutional right to lie to Congress?” Blumenthal said at one point.
Sen. Chris Coons, D-Del., described the CEOs’ lack of support for any kids-related safety bills as a “yawning silence.”
The frustration from Coons and others underscores the years in which Congress has passed no substantial legislation protecting Americans from harms caused by digital platforms, amid divisions among lawmakers and resistance from tech companies. The last major law was the Children’s Online Privacy Protection Act of 1998.
In a series of opening statements, the tech CEOs outlined the steps they are taking to protect children and vowed to work with legislators. Zuckerberg testified that Meta has built more than 30 tools, resources and features to help protect teenagers and give parents oversight and control over how their children are using the company’s services, as well as to provide special protections for teen accounts.
Zuckerberg even suggested that the app stores of rivals Apple Inc. and Google parent Alphabet Inc. should make it harder for youths to download potentially harmful apps, leaving the decision to their parents.
In her response, Yaccarino said that the platform formerly known as Twitter is not the platform of choice for children and minors. Users between the ages of 13 and 17 account for less than 1% of U.S. daily users, she said, adding: “We have made it more difficult for bad actors to share or engage with CSE [child sexual exploitation] material on X, while simultaneously making it easier for our users to report CSE content.”
TikTok, Chew said, “is continuously working to provide a safe app experience for our community, and we aim to be a leader in this area.” He added: “We recognize, however, that technology is ever-evolving and that we need to be prepared to address unexpected trends and challenges as they arise.”