A newly unredacted version of the multi-state lawsuit against Meta alleges a pattern of deception and minimization in how the company handles kids under 13 on its platforms. Internal documents appear to show that its practice with regard to this supposedly forbidden demographic is far more laissez-faire than it has publicly claimed.
Filed last month, the complaint alleges pervasive, destructive practices by the company with regard to the health and well-being of its younger users. From body image to bullying, privacy invasion to engagement maximization, all the evils of social media are laid at Meta's door. Perhaps rightly so, but the sheer breadth also gives the complaint the appearance of a lack of focus.
In one respect at least, however, the documentation obtained by the attorneys general of 42 states is quite specific, "and it is damning," as AG Rob Bonta of California put it. That is in paragraphs 642 through 835, which mostly document violations of the Children's Online Privacy Protection Act, or COPPA. This law places very specific constraints on online services that reach young people, limiting data collection and requiring parental consent for various actions, but most tech companies seem to treat it as more of a suggestion than a requirement.
When a company requests pages and pages of redactions, you can safely assume the material doesn't reflect well on it. This recently happened with Amazon, too, when it became apparent that the company was trying to hide the existence of a price-hiking algorithm that skimmed billions off consumers. It is arguably much worse when what's being redacted are COPPA allegations.
"We're very bullish and confident in our COPPA allegations. Meta is knowingly taking steps that harm children, and lying about it," AG Bonta told TechCrunch in an interview. "In the unredacted complaint we see that Meta knows that its social media platforms are used by millions of kids under 13, and they unlawfully collect their personal info. It shows common practice where Meta says one thing in its public-facing comments to Congress and other regulators, while internally it says something else."
Meta does not collect—or even try to collect—verifiable parental consent before gathering the personal data of children on Instagram and Facebook… But Meta's own records show that it has actual knowledge that Instagram and Facebook target and successfully enroll children as users.
In other words, while the problem of identifying kids' accounts created in violation of platform rules is certainly a tough one, Meta allegedly chose to look the other way for years rather than implement stricter rules that would necessarily affect user numbers.
Meta said that the suit "mischaracterises our work using selective quotes and cherry-picked documents," and that "we have mechanisms in place to remove those [i.e., accounts of users under 13] when we identify them. However, verifying one's age online is very difficult industry-wide."
Here are a few of the most striking parts of the suit. While some of these allegations relate to practices from years ago, bear in mind that Meta (then Facebook) has spent a decade publicly saying that it doesn't allow kids on the platform and that it works diligently to detect and expel them.
Meta has internally monitored and reported on under-13s, or U13s, as part of its audience segments for years, according to charts in the filing. It noted, for instance, in 2018 that 20% of 12-year-old Instagram users used the app daily. And this wasn't in a presentation on how to eliminate them; it was about market penetration. Another chart reflects Meta's "knowledge that 20-60% of 11- to 13-year-old users in particular birth cohorts had actively used Instagram on at least a monthly basis."
It is hard to square this with the public position that users this age are not welcome. And it isn't because leadership wasn't aware.
A 2018 report similarly indicated that there were almost 4 million users under the age of 13 on Instagram in 2015, which by its own estimate amounted to roughly a third of all 10- to 12-year-olds living in the United States. Those numbers may be dated, but they are remarkable: Meta has never publicly admitted to under-13 users in anything like those numbers or proportions.
Not externally, at least. Internally, the figures appear to be well documented. For instance, as the complaint alleges:
Meta has data from 2020 showing that, out of 3,989 children surveyed, 31 percent of child respondents aged between 6 and 9 years old and 44 percent of child respondents aged 10 to 12 years old had used Facebook.
It's hard to extrapolate from the 2015 and 2020 figures to today's (which, as the evidence here suggests, Meta is unlikely to volunteer), but Bonta pointed out that the large numbers are presented for effect, not because legality hinges on scale.
"The whole idea remains that their platforms are accessed by millions of children below 13 years old. Be it 30 percent, or 20 percent or 10 percent … it involves any child and is against the law," he said. "If they were engaged in such at any one time, then it went against the law then. And we cannot be assured they have altered their behavior pattern."
A slide titled "2017 Teens Strategic Focus" appears to target children under 13 years old, as it notes that children begin using tablets at 3 or 4 and "Social identity is an Unmet need Ages 5-11." Among the objectives listed in the lawsuit was to "grow [Monthly Active People], [Daily Active People] and time spent among U13 kids."
It should be noted that while Meta doesn't allow accounts run by people under 13, there are legal and safe ways for the company to engage with this demographic. Some kids just want to watch videos from the official SpongeBob account, and that can be fine. But Meta must verify that the child has a parent's consent, and even then the ways it can collect and process the child's data are sharply limited.
But the redactions suggest these under-13 users are not of the lawfully and safely engaged type. Reports of underage accounts are said to be automatically ignored, and Meta "continues collecting the child's personal information if there are no photos associated with the account." Of 402,000 reports of accounts owned by users under 13 in 2021, fewer than 164,000 were disabled. And these actions reportedly don't cross between platforms, meaning an Instagram account being disabled doesn't flag associated or linked Facebook or other accounts.
Testifying before Congress in March 2021, Zuckerberg said that "if we detect someone might be under the age of 13, even if they lied, we kick them off." (And "they lie about it a TON," one research director quipped in another quote.) Yet documents cited by this same lawsuit from the following month report that "Age verification (for under 13) has a big backlog and demand is outpacing supply, because of lack of capacity." How big a backlog? At times, the complaint alleges, on the order of millions of accounts.
A smoking gun may be found in a series of anecdotes from Meta researchers delicately avoiding the possibility of inadvertently confirming an under-13 cohort in their work.
One said in 2018: "We just want to make sure to be sensitive about a couple of Instagram-specific items. For example, will the survey go to under 13 year olds? Since everyone needs to be at least 13 years old before they create an account, we want to be careful about sharing findings that come back and point to under 13 year olds being bullied on the platform."
In 2021, another, who studied "child-adult sexual-related content/behavior/interactions" (!), stated that she "did not includ[e] younger kids (10-12 yos) in this research" even though there "are definitely kids this age on IG," because she was "concerned about risks of disclosure since they aren't supposed to be on IG at all."
Also in 2021, Meta instructed a third-party research company operating a survey of preteens to delete any data that would disclose that a survey participant was on Instagram, so the "company won't be told that under 13 are being surveyed."
In late 2021, independent researchers provided Meta with data demonstrating that "of children ages 9-12, 45% used Facebook and 40% used Instagram daily."
In an internal 2021 study on youth in social media described in the suit, researchers first asked parents whether their kids were on Meta platforms and removed them from the study if so. But one researcher asked, "What happens to kids who slip through the screener and then say they are on IG during the interviews?"
"We're not collecting user names right?" Instagram Head of Public Policy Karina Newton responded.
In other words, nothing happens. As the lawsuit reads:
Even if Meta learns of specific children on Instagram through interviews with the children, Meta takes the position that it still does not have actual knowledge that it is collecting personal information from an under-13 user because it does not collect user names while conducting these interviews.
In this manner, Meta goes to great pains to avoid meaningful compliance with COPPA, looking for loopholes to escape its duty of knowledge and to excuse the continued presence of users under the age of 13 on the Platform.
The other allegations in the complaint have softer edges, including the argument that use of the platforms has contributed to poor body image and that Meta has failed to respond appropriately. Those are arguably less actionable. But the COPPA claims are far more cut and dried.

"We have evidence that parents are writing them notes that their children are on their platform, and they are not receiving any action. I mean, what more do you need? It shouldn't even need to reach that point," Bonta said.

"These social media platforms can do anything they want," he continued. "They can be operated by another algorithm, they can have plastic surgery filters or not have them, they can give you alerts in the middle of the night or during school, or not. They do things that maximize the frequency of use of that platform by children, and the duration of that use. They could stop all of this today if they chose to, and quickly prevent minors under 13 from accessing their platform. But they're not."