EU opens child safety probes of Facebook and Instagram, citing addictive design concerns
Facebook and Instagram are under formal investigation in the European Union over child protection concerns, the Commission announced Thursday. The proceedings follow a raft of requests for information to parent entity Meta since the bloc’s online governance regime, the Digital Services Act (DSA), started applying last August.
The development could be significant as the formal proceedings unlock additional investigatory powers for EU enforcers, such as the ability to conduct office inspections or apply interim measures. Penalties for any confirmed breaches of the DSA could reach up to 6% of Meta’s global annual turnover.
Meta’s two social networks are designated as very large online platforms (VLOPs) under the DSA. This means the company faces an extra set of rules — overseen by the EU directly — requiring it to assess and mitigate systemic risks on Facebook and Instagram, including in areas like minors’ mental health.
In a briefing with journalists, senior Commission officials said they suspect Meta of failing to properly assess and mitigate risks affecting children.
They particularly highlighted concerns about addictive design on its social networks, and what they referred to as a “rabbit hole effect,” where a minor watching one video may be pushed to view more similar content as a result of the platforms’ algorithmic content recommendation engines.
Commission officials cited content related to depression, or content promoting an unhealthy body image, as examples of material that could harm minors’ mental health.
They are also concerned that the age assurance methods Meta uses may be too easy for kids to circumvent.
“One of the underlying questions of all of these grievances is how can we be sure who accesses the service and how effective are the age gates — particularly for avoiding that underage users access the service,” said a senior Commission official briefing press today on background. “This is part of our investigation now to check the effectiveness of the measures that Meta has put in place in this regard as well.”
In all, the EU suspects Meta of infringing DSA Articles 28, 34, and 35. The Commission will now carry out an in-depth investigation of the two platforms’ approach to child protection.
The EU opened a similar probe into addictive design concerns on video sharing social network TikTok back in February.
It has also already opened DSA proceedings targeting Meta’s two social networks: last month, the Commission said it would investigate separate concerns related to Facebook’s and Instagram’s approach to election integrity.
Reached for a response to the latest EU investigations, a Meta spokesperson emailed us this statement: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
The company also told us that its approach to verifying the age of users on the two social networks relies on a combination of self-declared age and AI assessments to try to detect kids who may be lying about their age. Additionally, it said it allows people to report suspected underage accounts, and that it trains its content reviewers to flag accounts that may belong to underage users.
Users under 18 who attempt to edit their age on its platforms are challenged to choose and complete an age verification test, per Meta, though the company did not specify which age verification checks it offers. It claimed that internal tests of the efficacy of its approach indicate it is working to prevent minors from accessing age-inappropriate experiences. Since introducing the checks, Meta said it has been able to stop 96% of teens who attempted to edit their birthdays from under 18 to over 18 on Instagram from doing so.
This report was updated with comment from Meta.