In a case that could redefine the legal liabilities of social media giants, jury selection began today in Santa Fe for a landmark trial against Meta Platforms, Inc. The lawsuit, filed by New Mexico Attorney General Raúl Torrez, alleges that Meta’s platforms, Facebook, Instagram, and WhatsApp, have become a “marketplace for predators” and a “breeding ground” for child exploitation. The trial is the first of its kind to reach a jury, representing a significant shift from previous legal battles that were often dismissed under federal immunity laws.
The state’s case is built on a 2023 undercover investigation dubbed “Operation MetaPhile,” in which investigators posing as children under the age of 14 created decoy accounts on Instagram and Facebook. Within hours, these accounts were allegedly bombarded with sexually explicit material and solicitations from adult predators. Prosecutors argue that Meta does not merely host this content; its recommendation algorithms actively “push” it toward vulnerable minors to maximize engagement and profit.
New evidence set to be presented during the trial includes internal company emails regarding Meta’s AI chatbots. Documents recently unsealed suggest that Meta executives, including CEO Mark Zuckerberg, were warned by their own “integrity staff” that AI companions were capable of engaging in romantic or sexual conversations with minors. The filings allege that leadership rejected recommendations for stricter guardrails, opting instead to prioritize a “non-censorship” approach for adult users that inadvertently left children exposed to inappropriate interactions.
Meta has vehemently denied the allegations, calling the Attorney General’s claims “sensationalist” and based on “cherry-picked documents.” The company maintains that it has invested billions of dollars in safety features, including “Teen Accounts” with built-in protections and advanced age-verification tools. Legal experts note that Meta will likely lean on Section 230 of the Communications Decency Act, which has traditionally shielded tech companies from liability for content posted by third-party users. New Mexico, however, is attempting to sidestep that shield by targeting the platform’s design and algorithmic recommendations rather than the content itself.
The outcome of this trial is being watched closely around the world, including by African regulators and diaspora communities. As internet access expands across the African continent, the safety of the “next billion users,” many of whom are children, remains a top priority for policymakers. If New Mexico secures a victory, it could create a “beachhead” for other states and nations seeking to hold social media companies accountable for the real-world harms caused by their platform architecture.
The trial is expected to last seven to eight weeks, with opening statements scheduled for February 9.