Ireland's media and internet watchdog, Coimisiún na Meán, has adopted and published an Online Safety Code that will apply from next month to video-sharing platforms headquartered in the country, including ByteDance's TikTok, Google-owned YouTube, and Meta's Instagram and Facebook Reels.
Under the Code, in-scope platforms must have terms and conditions that ban the uploading or sharing of a range of harmful content types, including cyberbullying, the promotion of self-harm or suicide, and the promotion of eating or feeding disorders. Platforms must also ban incitement to hatred or violence, terrorism, child sexual abuse material (CSAM), racism, and xenophobia.
The Code is intended to cover content types that do not fall within the scope of the pan-EU law known as the Digital Services Act (DSA), Coimisiún na Meán spokesman Adam Hurley confirmed. The DSA, which has applied broadly since mid-February, focuses on the governance of illegal content, such as CSAM, whereas Coimisiún na Meán's Code targets a broader sweep of harms.
"One of the thoughts behind the Online Safety Code is addressing content that's more harmful rather than illegal," Hurley told us, adding: "What we've done is broaden the scope to harmful content that they must prohibit uploading of, and then act on reports against those terms and conditions.
"It's a prohibition on uploading in their terms and conditions. So they have to prohibit the uploading of these types of content in their own terms and conditions, and then they'll have to enforce those terms and conditions," he said.
The Code will apply directly only to video services offered to users in Ireland, which covers several major social media companies because their regional headquarters are located there. However, tech companies may apply the same measures across the rest of the region to simplify compliance and avoid awkward questions about inconsistent content standards.
Notice and takedown
Another key point is that EU law does not permit a general monitoring obligation to be placed on platforms, so the Online Safety Code Ireland will enforce does not require them to employ upload filters, according to Hurley. Rather, he confirmed, it is essentially an extension of the notice-and-takedown approach: users can report harmful content, and platforms are expected to remove it.
The Code, much like the DSA, therefore requires platforms to provide ways for people to report the aforementioned harmful content types so that platforms can act on reports in line with their T&Cs.
Age assurance for porn
The Code requires video-sharing platforms that permit pornographic content or gratuitous violence in their T&Cs to have "appropriate" age assurance in place to prevent minors from accessing such material.
Hurley said there are no pre-approved age assurance technologies per se; rather, the regulator will determine what is appropriate on a case-by-case basis.
The Code also requires video-sharing hosts of that content to have user-friendly content rating systems.
In addition, the platforms should provide parental controls over all content that might "harm the physical, mental or moral development of minors under 16 years of age," according to the Coimisiún na Meán statement.
Recommender algorithms
The Irish media regulator had considered requiring video-sharing platforms to switch off profiling-based content recommendations by default as a safety measure, which could have forced TikTok to disable its algorithm by default.
However, following a consultation last year, the measure did not make it into the final Code, a Coimisiún na Meán spokesman confirmed. "It was considered a supplementary [to the Code] and we've come down on the position that the best way to deal with recommender systems — the potential harm of recommender systems — is through the [EU's] Digital Services Act," he told TechCrunch.
We pressed the regulator on how the Code will therefore mitigate harms driven by algorithmic amplification, another of its stated aims.
The finalised Code is part of Ireland's broader Online Safety Framework, which seeks to make digital services responsible for safeguarding users from harm online, an obligation provided for by Ireland's Online Safety and Media Regulation Act.
The EU's DSA applies across the bloc, so it is also in force in Ireland, with Coimisiún na Meán responsible for enforcing the regulation's general rules on any in-scope, locally headquartered companies, in addition to overseeing the new Online Safety Code.
In a supporting statement, the regulator said: "The adoption of the Online Safety Code ends the era of social media self-regulation. The Code lays down binding rules for video-sharing sites to abide by in order to minimize their capacity to harm users. We will educate people on their rights online, while holding the platforms accountable and acting when platforms don't live up to their obligations."
"With the adoption of the Online Safety Code, all the elements of our Online Safety Framework are now in place. Our focus now is on fully implementing the Framework and driving positive changes in people's lives online," executive chairperson of Coimisiún na Meán Jeremy Godfrey added in another supporting statement.
"Our message to people is clear: if you encounter something that you think is illegal or against a platform's self-prescribed rules for what they permit, you should report it directly to the platform. Our Contact Centre is there to advise and assist people if they need it."
Child safety concerns are driving a new generation of online safety initiatives on both sides of the Atlantic, ranging from the UK's Online Safety Act, enacted just over a year ago, and an Age-Appropriate Design Code that began taking effect in the UK last fall, to an analogous child-focused bill moving through the US, the Kids Online Safety Act (KOSA), which was first introduced back in 2022.