The European Commission has found, on a preliminary basis, that Meta may be breaching the Digital Services Act (DSA), because its platforms Facebook and Instagram do not take sufficient measures to prevent children under 13 from accessing them.
According to a statement issued on Wednesday, although Meta’s terms of service set the minimum age for access at 13, the Commission considers the enforcement mechanisms inadequate. Existing systems fail to effectively block younger users or to promptly detect and remove accounts created by underage children.
Weak age verification systems
The European Commission noted that during account registration, children can easily bypass age restrictions by entering false dates of birth, with no robust verification measures in place to prevent this.
At the same time, Meta’s tool for reporting under-13 accounts was described as cumbersome and ineffective. Users must go through up to seven steps to reach the reporting form, which is not automatically populated with relevant account details. Even when reports are submitted, the Commission said follow-up checks are often insufficient, allowing underage users to continue using the platforms.
Incomplete risk assessment
The Commission also criticised Meta’s risk assessment process, describing it as incomplete. It does not adequately capture the risks of children under 13 accessing the platforms or their potential exposure to inappropriate content and harmful experiences.
Data cited by the Commission suggests that between 10% and 12% of children under 13 in the European Union use Instagram or Facebook. It also noted that Meta appears not to have sufficiently taken into account available scientific evidence showing that younger children are more vulnerable to online risks.
Call for stronger safeguards
The Commission said both platforms must revise their risk assessment methodologies to better identify and address these risks in the EU. It also called for stronger measures to prevent, detect and remove accounts belonging to under-13 users.
Henna Virkkunen, Executive Vice-President for Technological Sovereignty, Security and Democracy, said that while Meta’s own rules prohibit access for children under 13, the preliminary findings show the platforms are doing too little to enforce them. She stressed that the DSA requires companies to apply their own user protection rules in practice.
Next steps and possible penalties
Meta now has the right to review the case file and respond in writing to the Commission’s findings as part of its defence. The opinion of the European Board for Digital Services will also be sought.
If the preliminary conclusions are confirmed, the Commission may issue a non-compliance decision. This could lead to fines of up to 6% of Meta’s total global annual turnover, as well as periodic penalty payments until compliance is achieved.
The case forms part of formal proceedings launched by the Commission against Meta on 16 May 2024 under the DSA. The investigation is ongoing and also covers other potential breaches related to child protection and the design of digital interfaces.
Source: CNA