EU raises concerns over Meta’s ability to keep under-13 users off Facebook and Instagram

The European Union has issued a serious warning to Meta, stating that the company is not doing enough to prevent children under the age of 13 from accessing its platforms, Facebook and Instagram. This gap in enforcement could expose young users to inappropriate and potentially harmful content, raising alarm among regulators and parents alike.

The warning comes as part of a broader push by the European Union to strengthen online safety for children. In recent months, several EU member states have even considered banning social media access for users under 16. The EU is also evaluating the possibility of introducing a unified age restriction across the region, following global pressure and policy shifts such as Australia’s move to restrict social media use among minors.

According to preliminary findings, EU regulators believe Meta has failed to properly enforce its own rules, which clearly state that users must be at least 13 years old. Officials argue that simply having policies in place is not enough; they must be backed by effective systems that actively protect vulnerable users.

EU tech commissioner Henna Virkkunen emphasized that platform rules must translate into real-world action, especially when it comes to safeguarding children. The investigation found that underage users can easily bypass restrictions by entering false birth dates, highlighting a lack of robust verification mechanisms.

Additionally, the EU criticized Meta’s reporting tools, describing them as difficult to use and ineffective. In some cases, users reportedly need to go through multiple steps just to report underage accounts, discouraging timely action.

The findings also suggest that Meta has not fully acknowledged the risks associated with children using its platforms. Evidence across EU countries indicates that approximately 10 to 12 percent of children under 13 are already accessing these services, despite the rules.

Meta, however, has pushed back against the claims. A company spokesperson stated that both Facebook and Instagram are designed strictly for users aged 13 and above, and that systems are in place to detect and remove underage accounts. The company also confirmed it will continue to cooperate with EU authorities as the investigation progresses.

This probe, launched in May 2024 under the Digital Services Act, is part of a wider effort by the EU to regulate major tech companies and ensure safer digital environments. If the EU’s concerns are confirmed, Meta could face fines of up to six percent of its global annual revenue.

The investigation also touches on broader issues, including the potential “addictive” design of social media platforms and their impact on users’ mental and physical well-being. Meanwhile, the EU is developing new tools, such as an age-verification app, to strengthen online protections and replace outdated self-declaration systems.

This situation highlights a growing global challenge: balancing technological innovation with user safety, especially for younger audiences. As governments tighten regulations and scrutiny increases, tech companies are under mounting pressure not just to set rules, but to enforce them effectively and responsibly.
