The CEOs of Snap, Meta, X, and TikTok have all testified before the United States Senate Judiciary Committee, where they were called to explain in detail how their companies handle child exploitation content, and to answer questions on both their current efforts and new programs designed to strengthen protections for young users. Some of the senators did not hesitate to lash out at the platforms.
Today's hearing was an extension of an earlier session in which the Senate heard from child safety experts about the role social media apps play in harming children. Originally scheduled for late last year, it had to be pushed back so that all of the CEOs could be present.
Today, the company chiefs themselves had the chance to present their side of the story and detail what each is doing to combat child sexual abuse material (CSAM).
First off, each of the CEOs shared a prepared statement providing an overview of their efforts and plans.
Meta CEO Mark Zuckerberg outlined Meta's protective systems, which include 40,000 dedicated staff members working on safety and security, and said that Meta has invested over $20 billion in this element since 2016.
Zuckerberg also responded to criticisms raised in the previous session about the harms caused by social media apps:
"A new report by the National Academies of Sciences looked at data from more than 300 studies and found that the studies 'did not support the conclusion that social media causes changes in adolescent mental health at the population level.' It also found that social media can be highly beneficial when youth use it for self-expression, exploration, and connection."
Zuckerberg also reiterated Meta's recently announced proposal that the app stores be made liable for underage downloads.
"For example, 3 out of 4 parents support app store age verification, and 4 out of 5 parents want legislation requiring app stores to get parental permission before teens download apps.".
While Zuckerberg was willing to take his share of the heat, he also set a tone early on: he believes there are counterpoints to the claims put forth by child safety advocates.
X CEO Linda Yaccarino emphasized her perspective as a mother in outlining X's work on broader protections for its youngest users.
"In the last 14 months X has made material changes to protect minors. Our policy is very clear – X has zero tolerance toward any content that contains or endorses child sexual exploitation."
X has suspended more than 12 million accounts for violating its child sexual exploitation (CSE) policies, while also sending 850,000 reports to the National Center for Missing & Exploited Children (NCMEC) via a new automated reporting system designed to streamline the process.
Yaccarino covered the same ground in a recent post on X. The automated reporting element, specifically, could create more problems in the form of incorrect reports, though it may also lessen the labor burden on X. With 80 percent fewer employees than the old Twitter team, the company has to lean on automated solutions wherever possible.
Yaccarino added that X is standing up a new 100-person moderation team, based in Texas, dedicated solely to reviewing CSAM.
Snapchat CEO Evan Spiegel cited the service's underlying approach to privacy in his response:
Snapchat is private by default, meaning that people have to opt in to add friends and choose who can contact them. When we built Snapchat, we decided that the pictures and videos sent over our network should be deleted by default. Just as previous generations enjoyed the privacy of a phone call that isn't recorded for posterity, our generation has embraced the ability to share fleeting moments on Snapchat that may not be the perfect shot, but capture an emotion without permanence.
Spiegel also cited data from NCMEC, noting that Snap submitted around 690,000 reports to the organization last year.
TikTok CEO Shou Zi Chew, meanwhile, said that the platform's CSAM detection programs will soon be expanded, backed by major investments in new initiatives.
"We now have more than 40,000 trust and safety professionals worldwide working hard to protect our community, and this year alone we'll spend more than two billion dollars on trust and safety-with a big share of that investment in our US business.".
Perhaps the toughest position is that of TikTok, as several senators are already pushing for a ban on the app due to its perceived linkage with the Chinese government. But Chew argued that the platform leads the way on many CSAM detection elements and looks forward to building on them wherever it can.
The session featured a litany of pointed questions from the senators, including this remark from Senator Lindsey Graham:
"Mr. Zuckerberg, you and the companies before us, I know you don't mean it to be so, but you have blood on your hands. You have a product that's killing people."
Much of the anger was directed at Zuckerberg, who, after all, runs the two most used social media platforms in the world.
Senator Josh Hawley pushed Zuckerberg to apologize to families that have been harmed by his company's apps, which, somewhat surprisingly, Zuckerberg did, turning to the gallery to address a group of parents who were in attendance:
"I'm sorry for everything you all have endured. No one should ever have to endure the things that your families have suffered and this is why we invest so much and we are going to continue doing industry wide efforts to make sure no one has to endure the things your families have had to suffer."
At the same time, however, a new report claims that back in 2021, Zuckerberg denied requests from his own employees to expand Meta's safety resources.
The New York Times reported:
"In 90 pages of internal emails from fall 2021, top officials at Meta, which owns Instagram and Facebook, debated the addition of dozens of engineers and other employees to focus on children's well-being and safety. One proposal to Mr. Zuckerberg for 45 new staff members was declined."
Zuckerberg handled the challenges well, but Meta clearly still has much to do on this front.
Several senators took advantage of today's session to push for changes in the law, particularly to Section 230, which protects social apps from lawsuits over the content that users share, in order to further limit the protections afforded to social platforms over harmful content. Attempts to repeal Section 230 have been rebuffed so far; it will be interesting to see if this angle moves the discussion forward.
On platform specifics, Yaccarino was questioned about X's reduced staffing levels and how they are impacting its CSAM detection programs, while Spiegel was pressed on the role Snap has played in facilitating drug deals, fentanyl trading in particular. Both offered polished assurances that more is being done.
It was a contentious session, with senators pushing their case that social platforms need to do much more to protect young users. I'm not convinced that any of the proposed changes to the law will actually come out of today's grilling, but it was interesting to see the various elements at play, and how the major platforms are looking to implement solutions to address these concerns.