Impact Newswire

EU Charges Meta Over Child Safety Failures on Facebook, Instagram

European regulators have formally accused Meta of breaching key digital rules, escalating pressure on the tech giant to do more to protect children using its platforms.

The charges, brought under the European Union’s Digital Services Act, follow a two-year investigation into how Facebook and Instagram handle underage users. Regulators say the company has failed to effectively prevent children under 13 from accessing its services, despite clear rules prohibiting their use.

At the centre of the case is what officials describe as weak enforcement rather than weak policy. Meta’s platforms already restrict users below 13, but investigators found that these safeguards are not working in practice. Children are still able to sign up, often by entering false birth dates, and systems designed to detect and remove such accounts have proven inadequate.

The scale of the issue is significant. European authorities estimate that between 10 percent and 12 percent of children under 13 in the region are currently using Facebook or Instagram, highlighting the gap between policy and enforcement.

Regulators argue that this failure exposes minors to harmful online environments, including inappropriate content and other digital risks. They also criticised Meta’s risk assessment processes, saying the company has not done enough to fully evaluate or mitigate the dangers faced by younger users.

The European Commission has made it clear that written rules are not enough. Platforms are expected to take concrete action to enforce age limits and actively protect vulnerable users. That includes improving systems to verify age, detect underage accounts and remove them more effectively.

Meta has pushed back against the findings, describing them as preliminary and stressing that age verification remains a complex, industry-wide challenge. The company says it already uses tools to identify and remove accounts belonging to children and plans to introduce additional safety measures in the near future.

Even so, the stakes are high. If the EU concludes that Meta has failed to comply with the Digital Services Act, the company could face fines of up to 6 percent of its global annual revenue, a penalty that could run into billions of dollars.

The case reflects a broader shift in how governments are approaching Big Tech. Across Europe and beyond, regulators are moving from voluntary guidelines to enforceable rules, particularly when it comes to child safety online.

This latest action signals that scrutiny of social media platforms is entering a tougher phase. Protecting minors is no longer treated as a secondary issue but as a core regulatory priority.

For Meta, the message is clear. It is no longer enough to set age limits. The expectation now is to prove they actually work.


