While New Mexico works to combat human trafficking, the state's momentum may be slowed by companies like Meta, formerly known as Facebook and owner of the widely used social media platform of the same name, which recently announced internally that solicitation of human smuggling will remain allowed on all of its platforms.
The Washington Free Beacon reported on Feb. 1 that Meta had announced internally that it would allow the solicitation of human smuggling on all of its platforms moving forward, while offering human smuggling services remains prohibited. Meta claims the move ensures that people can continue "to seek safety or exercise their human rights," but it may conflict with state efforts to ensure that no one's safety or human rights are threatened in the first place.
"Any decision to allow vulnerable populations to be exposed to potential exploitation on social media platforms presents additional challenges to the efforts to combat human trafficking," New Mexico Office of the Attorney General (NMOAG) Director of Communications Jerri Mares said to the New Mexico Sun.
New Mexico ranks in the bottom 20 of all states for human trafficking, with a rate of 2.46 persons per 100,000 residents. Nevertheless, the state has allocated resources to human trafficking prevention. NMOAG recently partnered with Santa Fe-based nonprofit The Life Link to establish the New Mexico Human Trafficking Task Force, a "state-wide coordinated effort to combat sex trafficking and labor trafficking," Mares said. The task force draws on the efforts of 62 agencies across New Mexico, including local, county, state, and federal law enforcement, district attorney offices, service providers, social workers, and others.
The task force, reportedly focused on the three areas of trafficking protection, prevention, and prosecution, also includes subcommittees on Domestic Minor Sex Trafficking, Medical Professionals, Law Enforcement, Service Providers, Diversion Programs, and Outreach/Awareness. NMOAG employs a full-time four-person Human Trafficking Unit "committed to investigating human trafficking across New Mexico," Mares said, adding that "many resources are available for victims of human trafficking."
Human trafficking has long been an issue on Facebook. According to CNN, internal Facebook documents reviewed by the network show that the company has known since at least 2018 that human traffickers were using its platform. In 2019, the problem was serious enough that Apple threatened to remove Facebook and Instagram from its App Store.
Meta spokesman Drew Pusateri confirmed that the company would continue to allow solicitations for human smuggling after consulting with outside experts.
"We regularly engage with outside experts to help us craft policies that strike the right balance between supporting people fleeing violence and religious persecution while not allowing human smuggling to take place through our platforms," Pusateri said. "At this time, we have no policy changes to announce."
According to Doctors Without Borders, victims of human trafficking often experience horrific crimes including torture, rape, and extortion.
"Migrants and refugees are preyed upon by criminal organizations, sometimes with the tacit approval or complicity of national authorities, and subjected to violence and other abuses — abduction, theft, extortion, torture, and rape — that can leave them injured and traumatized," the organization said in a 2017 report. The same report also found that 31.4 percent of female migrants who traveled through Mexico into the United States had been sexually abused.
Mares said that the federal laws protecting companies like Meta, including Section 230 of the federal Communications Decency Act, "are outdated and do not reflect current reality." Section 230 protects platforms like Facebook from liability for how criminals decide to use them; the provision was amended in 2018 to allow platforms to be held liable if they allow users to post advertisements promoting child sexual exploitation or sex trafficking, but Mares argues that those amendments are too vaguely written.
"Essentially, Meta/Facebook created a massive online entity but is not responsible for policing its users," Mares said. "The 2018 amendments ... may not even punish the conduct Meta/Facebook has decided to allow (solicitation of illegal services as opposed to advertising those services for sale)."
Meta said that it debated the policy for five months, consulting with a variety of groups that provided the company with "global perspectives and a broad range of expertise." To help mitigate the risks associated with human smuggling, Meta "proposed interventions such as sending resources to users soliciting smuggling services." It also said it would allow "sharing information related to illegal border crossing."
"We observed that a slight majority of stakeholders favored allowing solicitations of smuggling services for reasons associated with asylum seekers," Meta said. "We decided that this was indeed the best option since the risks could be mitigated by sending resources, whereas the risks of removing such content could not be mitigated."
In a memo, Meta said that it accepts that the decision is one of "tradeoffs." Allowing the solicitation of smuggling services "can make it easier for bad actors to identify and connect with vulnerable people." It also added that "law enforcement and government bodies … raised concerns that permitting this type of content on our platforms facilitates illegal activity and puts migrants at serious risk of exploitation or death."
Internal documents reviewed by The Wall Street Journal showed that when Meta employees flagged problems with how the platform was being used, the response was often "inadequate or nothing at all." Mares said that, in a broad sense, allowing Meta to flout antitrust, consumer protection, and data privacy laws for decades "has contributed to a culture at these companies that the rules simply do not apply to them."
"The NMOAG has been a national leader in efforts to protect children’s privacy online, to punish tech companies that lie to consumers about the safety of their personal information, and to prevent these companies from gobbling up their competitors in a bid to make themselves the only game in town," Mares said. "If Congress continues to refuse to rein in these companies by updating Section 230, state attorneys general will have to make up the difference."