The European Union has launched an investigation into Meta’s child safety measures on Facebook and Instagram.
The investigation, announced on Thursday, centers on concerns that the platforms may not be doing enough to protect the mental and physical well-being of young users.
The probe specifically examines whether Facebook and Instagram’s design and algorithms contribute to “behavioural addictions” and “rabbit hole effects” in children.
These terms describe how users, especially minors, can get sucked into endless loops of content, potentially leading to harmful behaviour or negative mental health impacts.
The EU is also questioning the effectiveness of Meta’s age verification tools, suspecting they can be too easily bypassed.
The investigation will also assess whether Meta’s content moderation and recommendation systems adequately safeguard children from inappropriate content.
Officials are particularly concerned about content promoting depression or unrealistic body image. Furthermore, the EU will investigate Meta’s default privacy settings for minors, aiming to ensure a high level of protection for young users.
If the EU finds Meta in violation of the Digital Services Act (DSA), the company could face fines of up to 6% of its global annual revenue.
The opening of formal proceedings authorizes the EU to conduct inspections, gather further evidence, and potentially impose interim measures while the probe continues. There is no set deadline for the proceedings, but the EU has made clear that protecting the well-being of young users online is a priority.
This investigation comes amid similar EU probes into Meta’s advertising practices and its approach to election integrity on Facebook and Instagram.
The EU is sending a clear message that it expects major online platforms to prioritize user safety, particularly for their most vulnerable users: children.