Meta Threatens to Exit New Mexico Over Child Safety Rules
Meta is warning that it might pull Facebook and Instagram out of New Mexico if the state enforces strict new rules aimed at protecting young users. The state has been pushing for changes to how these platforms handle children’s safety, and Meta’s response signals a potential showdown. The case highlights the ongoing fight between tech companies and regulators over how to keep kids safe online.
What New Mexico Is Asking For
New Mexico wants Meta to make several significant changes to its platforms. The state is asking the company to verify users’ ages more strictly, redesign its recommendation algorithms so they don’t target minors for engagement, and turn off autoplay and infinite scroll for users under 18. The state also wants push notifications disabled during school hours and late at night, plus a cap of 90 hours a month on children’s screen time.
Additionally, New Mexico is demanding a $3.7 billion fund to improve teen mental health services across the state, on top of the $375 million already awarded for past violations. Many of these measures aren’t new ideas for Meta, which has tested some of them in other markets. But none have been imposed by a U.S. court before, and a judge’s order adopting them could change how the company designs its platforms globally.
The Legal Backstory and Meta’s Response
The case started in late 2023, when New Mexico Attorney General Raúl Torrez filed a lawsuit. His team created a fake Instagram profile of a 13-year-old girl and found that it was flooded with harmful content and solicitations. The lawsuit argued that this was no accident but a product of the platform’s recommendation system, which is designed to maximize engagement from young users.
During the trial, prosecutors presented internal Meta communications discussing the impact of Facebook’s 2019 encryption change. The documents showed that making Messenger end-to-end encrypted would sharply reduce the company’s ability to detect child abuse and report it to law enforcement, a consequence Meta acknowledged internally could lead to more harm. Meta has responded by saying that some of the requested changes are technically impossible and that others would force it to withdraw from the state altogether.
Meta’s threat to pull out of New Mexico is widely read as a negotiation tactic, but it also signals how hard the company will fight any precedent. New Mexico’s population is small compared to Meta’s global user base, yet more than 40 other states are pursuing similar suits, so a ruling here could shape regulation nationwide.
Overall, this legal battle highlights the tension between protecting children online and the company’s desire to operate freely. While Meta has introduced some teen safety features like AI monitoring, parental controls, and usage limits, it has resisted more invasive regulation. The outcome could determine how much control states have over social media platforms and how companies balance safety with business interests.
What do you think?
We’d like to hear your opinion. Leave a comment.