In 2014, three victims of online sex trafficking, all minors, filed a lawsuit against Backpage, a classified advertising website similar to Craigslist. The plaintiffs argued that the website, which had hosted advertisements for sex posted by their trafficker, had “tailored its website to make sex trafficking easier.” The victims’ stories were shocking; one claimed she had been raped 1,000 times, beginning at the age of 15. The judge in the case conceded that the plaintiffs had made a “persuasive case.” In spite of that, he dismissed the lawsuit—not, he said, because he wanted to, but because Section 230, a federal law that shields digital platforms from liability for their third-party content, “requires that we . . . deny relief to plaintiffs whose circumstances evoke outrage.”
Backpage has long since gone offline. The Justice Department seized and shuttered it in 2018 in connection with a criminal case. But the judge’s ruling in Doe v. Backpage continues to haunt victims in many civil lawsuits for other federal crimes—including child pornography.
Sex trafficking exploded online in the early 2010s. Testifying before the Senate Permanent Subcommittee on Investigations in November 2015, Yiota G. Souras, a representative from the National Center for Missing and Exploited Children, reported, “Over the past five years, NCMEC has seen an 846% increase in reports of suspected child sex trafficking to the CyberTipline.” While Backpage was not the only culprit, it was the worst of the lot. “Of all the child sex trafficking reports submitted by members of the public to the CyberTipline, more than seventy-one percent (71%) relate to Backpage ads,” said Souras.
In January 2017, following that hearing, the subcommittee published a damning report on Backpage’s knowing facilitation of online sex trafficking. Multiple states had already conducted their own investigations. The Washington Post published an exposé in July 2017. The Justice Department delivered the final blow in April 2018, seizing the website and filing criminal charges against seven former executives and owners in a 93-count indictment.
Both states and individual victims had tried to rein in Backpage as far back as 2011, but lawsuits kept hitting the same obstacle: Section 230, originally a small part of a larger law known as the Communications Decency Act of 1996. The CDA was intended to protect children, but Section 230 wound up endangering them. Congress eventually addressed the problem with the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA). Signed into law in April 2018, FOSTA amended Section 230 so that it would not apply to federal sex trafficking laws, though it did not extend the exception to other criminal offenses.
Today, for example, law enforcement is dealing with an enormous increase in online child pornography. In 2019, the New York Times reported that the FBI was prioritizing child-porn cases involving infants and toddlers because it could not respond to the sheer volume of reports involving older children (unless they were in immediate danger). That year, NCMEC’s CyberTipline received 16.9 million reports of child porn. That number rose to 21.6 million in 2020 and 29.3 million in 2021.
What happens when victims sue digital platforms for violations of federal child porn laws? In Doe v. Twitter (N.D. Cal. 2021), two minors were communicating on Snapchat with a perpetrator who pretended to be a 16-year-old girl. The perpetrator enticed the victims to share nude photos and videos and then blackmailed them to obtain more explicit content. When the victims eventually refused to cooperate, the perpetrator compiled the child porn he had already received into a video and published it on Twitter. The victims later learned about the video from their classmates.
Despite at least three reports from a victim, the victim’s mother, and a concerned citizen, Twitter didn’t remove the content and acted only after a federal agent intervened. By then, the video had accumulated more than 167,000 views and 2,223 retweets.
After the victims sued, Twitter filed a motion to dismiss the lawsuit. Notably, Twitter did not argue that the plaintiffs had failed to allege a violation of federal child-porn laws; instead, the company argued that the civil claim was barred by Section 230. The court agreed. (However, the court allowed a separate civil claim for sex trafficking to proceed, thanks to FOSTA.) Victims in other child-porn lawsuits have received similar rulings, including one in 2021 involving Reddit and one in 2011 involving Backpage.
Under Section 230, digital platforms such as Facebook and Twitter are generally not liable for the third-party content generated by their users. If Jack makes a defamatory comment about Jill on Facebook, then Jill can sue Jack for that comment, but Facebook itself has no liability for Jack’s defamatory comment.
Defenders of Section 230’s broad protections have argued that some digital platforms have millions or even billions of pieces of content, and that content moderation at scale is too difficult. Indeed, many—let’s call them “techno-libertarians”—call Section 230 the Internet’s First Amendment. In their view, the ruling against the plaintiffs in Doe v. Backpage was a “big win for free speech.”
The techno-libertarians do raise a valid point that content moderation at scale is hard, but that point only proves the existence of a problem. It does not prove that the government is the solution. And Section 230 is the government; it is a special immunity provided to the tech industry by the government.
Of course, the question of what to do about Section 230 does not have to be an all-or-nothing one. If we do not want to overhaul Section 230, but we believe that its application to federal child-porn laws is disastrous, then we can carve those laws out of Section 230, just as FOSTA did for sex trafficking.
The techno-libertarians claim that reforming Section 230 could lead to more frivolous lawsuits against digital platforms, and for problems like defamation, they could have a point. For problems like sex trafficking or child porn, though, not only is it much harder to envision frivolous lawsuits, but Section 230 also often blocks meritorious ones, such as Doe v. Backpage or Doe v. Twitter.
When Craigslist blamed its decision to shut down its personals section on FOSTA, the techno-libertarians inevitably complained about FOSTA’s “unintended consequences” (ignoring the fact that Craigslist’s hands were not entirely clean). But if we’re talking about unintended consequences, then what about those of Section 230 itself? Wouldn’t Doe v. Backpage be an unintended consequence of Section 230? Since FOSTA’s enactment, meanwhile, victims of sex trafficking have invoked it to get past Section 230 in lawsuits against Twitter, Facebook, and Pornhub.
When they are confronted with problems such as sex trafficking and child pornography, techno-libertarians often resort to a talking point: Section 230 does not stop the enforcement of federal criminal law. This is true, but it is not the whole truth. When analyzing the law, we should look at not just what the law forbids but also who has access to the courthouse when the law is broken.
By default, only federal law enforcement can take perpetrators to court when they commit a federal crime. If Congress wants to let the victims of that crime sue, it needs to add civil remedies to the law. Many (but not all) federal criminal laws include civil remedies for victims, including both federal sex-trafficking and child-porn laws.
When victims in cases like Doe v. Backpage or Doe v. Twitter sue, their fate is determined not by clever talking points but by statutory interpretation—in layman’s terms, the task of figuring out what a law says. Here is the text of Section 230(e)(1): “Nothing in this section shall be construed to impair the enforcement of . . . any . . . Federal criminal statute.”
What does the word “enforcement” mean here? To answer that question, we can play a game of “spot the differences.” Here is the text of Section 230(e)(4): “Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.”
Did you spot the differences? Section 230(e)(1) says “impair the enforcement”; Section 230(e)(4) says “limit the application.” And Section 230(e)(4) includes the phrase “or any similar State law,” while Section 230(e)(1) does not. The first difference may seem subtle, but it has profound legal consequences. (The second difference is also worth exploring, though it is a more complex policy issue.)
Here’s why. The meaning of the word “enforcement” comes down to a practical question: Can victims enforce a federal criminal law? According to the courts, the answer is no. For example, let’s say that Twitter violates federal child-porn laws. If the FBI brings a criminal charge against Twitter, then the courts would say that the FBI is “enforcing” federal child-porn laws. If a victim brings a civil claim against Twitter for the same conduct, however, then the courts would not say that the victim is “enforcing” federal child-porn laws; one might say instead that the victim is “applying” federal child-porn laws—a much broader term.
Section 230(e)(1) is an “enforcement” carveout. As such, this exception applies to federal law enforcement but does not apply to victims. If the victim of a federal crime files a civil lawsuit, then Section 230 can interfere with that lawsuit.
The difference between “enforcement” and “application” is subtle but important if you are the victim of a federal crime. As far back as 2011, Section 230 blocked victims’ attempts to hold Backpage accountable by “applying” federal sex trafficking laws. Without the ability to “apply” the law, the problem continued to grow until the Department of Justice stepped in and “enforced” the law against Backpage in 2018. But between 2011 and 2018, how many child victims of Backpage were raped more than 1,000 times?
So what is the solution here? One solution would be to allow the “application” of any federal criminal law. Let victims use any civil remedies that can be found in those laws. This option does not create any new civil remedies; it simply restores full access to existing civil remedies, remedies already fully applicable to businesses in the offline world.
The techno-libertarians may object that this change is overly broad, but maybe it is Section 230 that is overly broad. An argument can be made against a blanket carveout that allows civil remedies for any crime—but an even stronger one can be made against a blanket immunity that bars civil remedies for crimes like sex trafficking and child porn.
Given a choice between two extremes—a blanket immunity and a blanket carveout—it is perhaps inevitable that a middle road emerged: carve out the “application” of certain federal criminal laws.
A range of options, both narrow and broad, lies along that path. FOSTA took a narrow one, carving out only sex trafficking. If your daughter is a victim of sex trafficking on Facebook, and Facebook violates federal sex trafficking laws, then Section 230 cannot block a lawsuit against Facebook. But if your son is a victim of labor trafficking on Facebook, and Facebook violates federal labor trafficking laws, then Section 230 can block a lawsuit against Facebook. And if your child is a victim of child porn on Twitter, and Twitter violates federal child porn laws, then Section 230 can block a lawsuit against Twitter.
Evidence certainly exists to support broader carveouts. In the original Backpage lawsuit, M.A. v. Village Voice Media (E.D. Mo. 2011), the victims used the civil remedies for many federal crimes, not just sex trafficking. According to Save the Children, boys are also victims of human trafficking, though it tends more often to be labor trafficking. Moreover, the Wall Street Journal’s investigation of Facebook revealed a problem with human trafficking more broadly—not just sex trafficking.
And while techno-libertarians may deride FOSTA as a blight on Section 230, FOSTA did not set the precedent for carving out the “application” of certain federal criminal laws. That precedent was set, in fact, by the original, “pure” version of Section 230—namely, Section 230(e)(4), which carved out the “application” of federal criminal law for communications privacy. If we can fully carve out federal crimes for communications privacy, then why can’t we do the same for child pornography?