Second Circuit Issues New §230 Ruling as Law’s Future Remains Uncertain

April 19, 2021 | Privacy, Data & Cyber Law

For much of the past few years, Section 230 of the Communications Decency Act of 1996 (CDA) has been in the news. Section 230 generally immunizes providers of interactive computer services (such as internet service providers or ISPs, social media companies, and website hosting entities, among others) from liability in connection with third-party content appearing on their platforms unless the provider would otherwise be liable under certain established intellectual property or criminal laws.

Politicians from both major parties, critics (including competitors) of big tech companies such as Facebook, Twitter, and Google, and various other businesses and individuals speaking out on social media and elsewhere have attacked the law. Others, such as the Electronic Frontier Foundation, consider Section 230 one of the most valuable tools for protecting freedom of expression, one that has been essential to the internet’s ability to evolve and thrive since 1996. See, e.g., https://www.eff.org/issues/cda230.

As the hearings on Section 230 held on Capitol Hill in March indicated, proposed amendments range from a complete repeal (apparently quite unlikely) to limited carve-outs or relatively minor changes. How all of this will play out remains to be seen.

In the meantime, federal courts in New York and around the country continue to issue rulings in disputes involving Section 230. In fact, just last month, in Domen v. Vimeo, Inc., 991 F.3d 66 (2d Cir. 2021), the U.S. Court of Appeals for the Second Circuit issued a rare decision involving Section 230(c)(2) of the CDA, rather than the provision of the CDA that is more commonly the subject of litigation, Section 230(c)(1).

After briefly describing the state of Section 230 law in the Second Circuit, focusing on Section 230(c)(1), this column will explore the circuit court’s Domen decision and its implications.

Background

Section 230(c)(1) of the CDA states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Subject to certain delineated exceptions, Section 230(c)(1) effectively immunizes a defendant when (1) it is a “provider or user of an interactive computer service,” as defined by Section 230(f)(2) of the CDA; (2) the plaintiff’s claims would treat the defendant as the “publisher or speaker” of information; and (3) the information is “provided by” an “information content provider” other than the defendant.

The circuit courts are in general agreement that the text of Section 230(c)(1) should be construed broadly in favor of immunity. See, e.g., FTC v. LeadClick Media, LLC, 838 F.3d 158, 173 (2d Cir. 2016) (collecting cases).

Circuit Caselaw

The Second Circuit’s decision in Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019), illustrates one of the typical situations in which Section 230 immunity can apply.

In that case, the plaintiffs – including U.S. citizen victims of certain terrorist attacks committed by Hamas in Israel, as well as relatives and representatives of the victims’ estates – contended that Facebook unlawfully provided Hamas with a communications platform that enabled those attacks. In particular, they alleged not only that Facebook had failed to remove the “openly maintained” pages and associated content of certain Hamas leaders, spokesmen, and other members, but also that Facebook’s famous algorithm actually directed such content to the personalized newsfeeds of the individuals who harmed the plaintiffs. Thus, the plaintiffs claimed, Facebook’s algorithm provided “material support” to a terrorist organization when it enabled Hamas “to disseminate its messages directly to its intended audiences” and to “carry out the essential communication components of [its] terror attacks.”

The U.S. District Court for the Southern District of New York granted Facebook’s motion to dismiss the plaintiffs’ complaint on the basis of Section 230(c)(1) immunity, and the plaintiffs appealed to the Second Circuit. 

The Second Circuit reasoned that the alleged conduct for which the plaintiffs sought to hold Facebook liable – that is, “giving Hamas a forum with which to communicate and for actively bringing Hamas’ message to interested parties” – fell within the “heartland” of what it means to be the “publisher” of information under Section 230(c)(1).

The circuit court then ruled that the plaintiffs did not plausibly allege that Facebook itself is an “information content provider” because Facebook did not “develop” the content of the postings by Hamas. Because the parties agreed that Facebook is a provider of an “interactive computer service,” satisfying the first prong of the three-part immunity test, the Second Circuit affirmed the district court’s dismissal of the plaintiffs’ federal claims. The Second Circuit explicitly rejected the plaintiffs’ argument that Facebook’s use of its algorithm to connect users to content rendered it a “non-publisher.” Notably, the U.S. Supreme Court denied certiorari despite the alleged circuit split concerning immunity under Section 230.

Similar results have been reached in other recent cases in the Second Circuit. See, e.g., Herrick v. Grindr LLC, 765 Fed. Appx. 586 (2d Cir. 2019); Mosha v. Facebook Inc., No. 20-cv-2608 (JGK) (S.D.N.Y. Jan. 22, 2021); Brikman v. Twitter, Inc., No. 19-cv-5143 (RPK) (CLP) (E.D.N.Y. Sept. 17, 2020). Cf. Tang v. Guo, No. 17 Civ. 9031 (JFK) (S.D.N.Y. Nov. 2, 2020) (denying a defendant’s Section 230(c)(1) motion to dismiss upon finding that the plaintiffs plausibly alleged the defendant’s publication of the false statements and not merely its publication of a third party’s statements).

Of course, not every defendant is able to succeed on a Section 230 defense in every instance – the statute’s requirements must be met. Consider, for example, the Second Circuit’s recent decision in La Liberte v. Reid, 966 F.3d 79 (2d Cir. 2020). 

In La Liberte, the plaintiff spoke at a city council meeting in California to oppose a proposed sanctuary-state law. Soon after, a photo was posted on social media showing the plaintiff with her mouth open in front of a minority teenager; the caption claimed that unnamed persons had yelled specific racist remarks at the young man in the photo. Thereafter, the defendant, a cable television personality, retweeted that post and followed it with two posts of her own.

The plaintiff sued the defendant for defamation and the defendant claimed immunity under Section 230. The U.S. District Court for the Eastern District of New York rejected the defense, and the Second Circuit agreed with the district court that the defendant could not claim immunity under Section 230. The circuit court explained that the plaintiff’s lawsuit did not treat the defendant as the publisher or speaker of any information “provided by another information content provider.” Rather, the Second Circuit said, the defendant was “the sole author of both allegedly defamatory posts.”

The Domen Case

Unlike most Section 230 cases, which involve Section 230(c)(1), the Second Circuit’s holding in Domen was based on Section 230(c)(2), which governs civil liability and which states that no provider or user of an interactive computer service shall be held liable for:

any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected. . . . 

The plaintiffs in the Domen case – Church United, a non-profit organization, and James Domen, Church United’s president and founder – sued Vimeo, Inc., which provides an online forum that allows users to upload, view, and comment on videos. The plaintiffs alleged that Vimeo discriminated against them on the basis of their religion and sexual orientation by deleting Church United’s account from Vimeo’s online video hosting platform.

Specifically, the plaintiffs alleged that Domen “was a homosexual” for three years, that “because of his desire to pursue his faith in Christianity, he began to identify as a former homosexual,” and that he shared his story through Church United to connect with others in California who had similar experiences.

In November 2018, Vimeo e-mailed Domen, informing him that a moderator had marked the Church United account for review. The e-mail explained that “Vimeo does not allow” videos that promote sexual orientation change efforts (SOCE). It instructed Church United to remove the videos and warned that if Church United did not do so within 24 hours, Vimeo might remove the videos or its entire account. It also instructed Church United to download the videos as soon as possible to ensure that it could keep them in the event Vimeo deleted the account. 

In December 2018, Vimeo deleted Church United’s account, explaining, “Vimeo does not allow videos that harass, incite hatred, or include discriminatory or defamatory speech.” The plaintiffs alleged that this was “censorship” insofar as it barred Domen from speaking about his preferred sexual orientation and religious beliefs. They also alleged that Vimeo allowed similar videos to remain on its website.

The Southern District of New York granted Vimeo’s motion to dismiss on the ground that Section 230 immunized Vimeo from this suit. The district court concluded that Vimeo deleted Church United’s account because of Church United’s violation of one of Vimeo’s content policies barring the promotion of SOCE on its platform. This policy, in turn, fell within the confines of the good-faith content policing immunity that the CDA provides to interactive computer services. The plaintiffs appealed to the Second Circuit.

The Second Circuit affirmed. It upheld the use of Section 230, typically an affirmative defense, in a motion to dismiss, declaring that the plaintiffs’ “conclusory allegations” of bad faith did “not survive the pleadings stage, especially when examined in the context of Section 230(c)(2).” 

Moreover, the Second Circuit said, Section 230(c)(2) does not require the use of a particular method of content restriction and does not mandate “perfect enforcement” of a platform’s content policies. Section 230(c)(2) provides protection for restricting access to content that providers “consider” objectionable, the Second Circuit explained, even if the material would otherwise be constitutionally protected, granting significant subjective discretion to providers.

Indeed, the circuit court continued, the “fundamental purpose” of Section 230(c)(2) is to provide platforms such as Vimeo with the discretion to identify and remove what they consider objectionable content from their platforms without incurring liability for each decision.

Therefore, the circuit court concluded, Vimeo was statutorily entitled to consider SOCE content objectionable and could restrict access to that content “as it sees fit” without incurring liability.

Conclusion

Despite the debate over Section 230 that continues in public and in Washington’s hearing rooms and backrooms, courts continue to issue decisions supporting the freedom of providers to monitor, include, and delete third-party content posted on their platforms. For now, at least, internet service providers, social networking sites, and other online publishers can continue to operate with the security and protection of both Section 230(c)(1) and Section 230(c)(2). How long that will be true is not clear. Stay tuned.

Reprinted with permission from the April 20, 2021 issue of the New York Law Journal. © ALM Media Properties, LLC. Further duplication without permission is prohibited. All rights reserved.
