A growing legal and political battle between state authorities and one of the world’s largest technology companies intensified this week as Raúl Torrez, Attorney General of the U.S. state of New Mexico, issued a strong statement condemning Meta Platforms for what he described as its refusal to adequately protect children on its social media platforms.
In his remarks, Torrez accused Meta—the parent company of Facebook, Instagram, and WhatsApp—of deliberately choosing not to implement safeguards that could reduce harm to minors. He argued that the company has the technological capability to make its platforms safer but has declined to do so because of business considerations.
The statement comes at a critical moment in an ongoing legal case in which New Mexico officials have taken action against Meta over concerns related to child safety, online exploitation, and exposure to harmful content. According to the Attorney General’s office, the case reflects broader concerns about how large technology companies design and manage digital environments that are widely used by children and teenagers.

Torrez was especially pointed in criticizing Meta’s response to the proposed reforms, describing the company’s stance as dismissive of both legal obligations and public safety concerns. In his view, Meta has consistently prioritized user engagement and advertising revenue over the well-being of younger users. He emphasized that while the company has shown it can adapt its platforms quickly in other contexts, it has been unwilling to take similar steps to protect children.
A key point of contention involves a set of measures that New Mexico officials have proposed to make social media platforms safer for minors. These include stronger systems for verifying users’ ages, tighter controls to prevent adults from impersonating children, and changes to recommendation algorithms that currently promote content based on engagement rather than safety. Additional proposals call for clearer warnings about potential risks and stricter enforcement against accounts involved in harmful or exploitative behavior.
Meta, however, has pushed back against these proposals, arguing that some of the requested changes would be difficult to implement and could have unintended consequences for user privacy and platform functionality. The company has also raised concerns about the feasibility of tailoring its services to meet the specific legal requirements of a single state, suggesting that such demands could disrupt the uniform operation of its global platforms.
Perhaps the most controversial element of Meta’s response has been its suggestion that it could restrict or withdraw certain services in New Mexico rather than comply with the proposed changes. Torrez sharply criticized that possibility, characterizing it as an attempt to pressure regulators and sway public opinion rather than a genuine technical limitation, and framing it as a strategic move designed to avoid accountability while preserving the company’s broader business model.
The dispute reflects a larger and increasingly urgent debate about the role of social media in the lives of young people. Over the past decade, platforms like Facebook and Instagram have become deeply embedded in daily communication, entertainment, and self-expression. At the same time, researchers, policymakers, and advocacy groups have raised concerns about issues such as cyberbullying, mental health impacts, addictive design features, and the risk of exposure to inappropriate or dangerous content.
In this context, the legal battle in New Mexico is being closely watched by other states and jurisdictions. Observers note that the outcome could set an important precedent for how governments regulate technology companies and enforce standards related to user safety. If New Mexico succeeds in compelling Meta to implement new safeguards, it could encourage similar actions elsewhere, potentially leading to broader changes across the industry.
Torrez has framed the case as part of a broader effort to hold powerful corporations accountable and ensure that technological innovation does not come at the expense of public welfare. He stressed that protecting children is a fundamental responsibility that should not be compromised for the sake of profit or convenience. In his statement, he reiterated his commitment to pursuing legal and regulatory measures that prioritize safety and transparency.
For its part, Meta has maintained that it is committed to user safety and has pointed to existing tools and policies aimed at protecting younger users. These include content moderation systems, parental controls, and initiatives designed to limit harmful interactions. However, critics argue that these measures are insufficient and that more comprehensive changes are needed to address systemic issues in how the platforms operate.
As the case moves forward, both sides appear firmly entrenched in their positions. The confrontation highlights the challenges of balancing innovation, regulation, and public safety in a rapidly evolving digital landscape. It also underscores the growing pressure on technology companies to demonstrate that they can responsibly manage the immense influence they wield over global communication networks.
With legal proceedings continuing and public scrutiny intensifying, the outcome of this dispute could have lasting implications—not only for Meta and New Mexico, but for the broader relationship between governments and the technology industry in the years to come.