Five Key Points on France’s Planned Ban on Social Media for Minors

The scope of the proposed restriction, the technical challenges of age verification, its legal enforcement under EU rules and how other European countries, including Greece, are approaching similar measures

France is preparing to become the second country, after Australia, to bar minors from social media; the French ban would apply to children under the age of 15. The draft law, backed by President Emmanuel Macron, has already passed the lower house of parliament and, if approved by the Senate, is expected to enter into force in 2026. The initiative reflects growing concern over the impact of platforms such as TikTok, Instagram, Snapchat and X on the mental health of minors.

When the ban is expected to take effect

The French government is targeting 1 September 2026 as the start date.

In a post on X, President Macron said that “from 1 September, children and adolescents will finally be protected”.

Before that, the bill must be approved by the Senate. If passed, a joint committee of the National Assembly and the Senate will be set up to finalise the text.

Platforms covered by the legislation

The law does not name specific platforms. The final decision will rest with the French audiovisual and digital regulator Arcom.

The rapporteur of the bill, an MP from the Renaissance party, has said the approach will be similar to Australia's and is likely to cover Snapchat, TikTok, Instagram and X. Restricting specific features, such as WhatsApp Stories and Channels or the chat functions on Roblox, is also under consideration.

Who will oversee enforcement

Although France is the first EU member state to advance such a measure, enforcement will not be handled by Paris alone.

The ban will be implemented within the framework of the Digital Services Act (DSA). France may set a minimum age threshold, but the European Commission will be responsible for supervising platform compliance.

Breaches of the DSA can result in fines of up to 6 percent of a platform’s global annual turnover.

Technical challenges of age verification

A major obstacle remains the method of age verification. Companies such as Meta argue that age checks should take place at operating-system level, while others, such as Apple, say responsibility lies with individual applications.

France maintains that, under European Commission guidance, responsibility lies with service providers, meaning platforms such as TikTok and Instagram.

The state will not impose a specific technology. Companies may use tools such as France's age verification system "Jeprouvemonage", provided the tools meet requirements on accuracy and data protection.

The wider European context

France is not acting in isolation. Denmark has agreed on restrictions for users under 15, with parental consent required from the age of 13. Austria is considering a ban for children under 14, while legislative initiatives are under way in Spain and Italy.

In Greece, Prime Minister Kyriakos Mitsotakis has publicly expressed support for similar measures. At EU level, Commission President Ursula von der Leyen has established a group of experts to assess whether a Europe-wide ban could be implemented.

Source: Politico
