The European Union has initiated a formal investigation into Meta, the parent company of Facebook and Instagram, amid concerns that the company is not sufficiently protecting children on its platforms. This investigation may lead to substantial fines if Meta is found in violation of EU regulations.
This move is part of a broader regulatory focus on the harmful effects of social media on young users, including the promotion of addictive behavior and exposure to inappropriate content.
The European Commission, which acts as the EU’s executive body, is assessing whether Meta has adhered to its obligations under the Digital Services Act (DSA). This comprehensive legislation mandates that online platforms implement measures to protect children, such as preventing access to unsuitable content and ensuring high levels of privacy and safety. Failure to comply with these regulations could result in fines of up to 6% of a company’s global revenue or enforced changes to its operations.
In a statement released on Thursday, the European Commission expressed concerns that Facebook and Instagram might be exploiting the vulnerabilities and inexperience of minors, potentially leading to addictive behaviors. "The Commission is also concerned about age assurance and verification methods put in place by Meta," the statement added, suggesting those safeguards may be ineffective.
Responding to the investigation, a Meta spokesperson stated, “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
Despite Meta’s stated efforts, a report the company submitted to the European Commission last September, detailing how its platforms protect minors, did not alleviate regulators’ concerns. Commissioner Thierry Breton emphasized, “We are not convinced that Meta has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans. We are sparing no effort to protect our children.”
Meta has been under growing scrutiny regarding the impact of its platforms on young users. In the United States, the company faces lawsuits from various school districts and state attorneys general related to youth mental health, child safety, and privacy.
Furthermore, an investigation earlier this month by the New Mexico attorney general into the potential dangers of Meta’s platforms led to the arrest of three men on charges of attempted sexual abuse of children.
Meta’s challenges are not limited to child protection. The company has frequently faced regulatory actions from the EU over various issues, including its handling of advertisements by scammers, foreign election interference ahead of upcoming EU elections, and the spread of disinformation and illegal content related to the war in Gaza.
The European Commission’s investigation highlights the increasing regulatory pressures on social media companies to safeguard young users. As the EU continues to enforce stringent protections for children online, the outcomes of this probe could have significant implications for Meta and the broader tech industry.