The European Commission has launched a formal investigation into Meta Platforms Inc., the parent company of Facebook and Instagram, over concerns that its algorithms and recommender systems may contribute to the spread of harmful content, particularly to minors. The probe, announced on May 6, 2026, focuses on whether Meta's systems violate the Digital Services Act (DSA), a landmark EU regulation aimed at curbing illegal and harmful online content.
Regulators are specifically examining how Meta's recommender systems—which use behavioral and profile data to suggest posts, videos, and accounts—may amplify content that could foster addictive use, cyberbullying, or children's exposure to inappropriate material. The investigation follows a preliminary analysis by the Commission, which identified potential risks in Meta's platform design, including the adequacy of its age-verification methods and the default privacy settings applied to minors' accounts.
Meta has stated it will cooperate fully with the investigation, pointing to its ongoing efforts to improve safety features such as parental controls and content filters. Campaign groups, meanwhile, have welcomed the probe, arguing that algorithmic amplification of sensational or extreme content has long been a concern. Under the DSA, non-compliance can draw fines of up to 6% of a company's global annual turnover.
This is not the first EU action against Meta: the company has previously faced fines and orders over data privacy and antitrust issues. The outcome of this investigation could set a precedent for how social media platforms are required to manage algorithmic recommendations across Europe.