Less than one week after issuing an order vacating its own March 2021 opinion in an important Communications Decency Act (“CDA”) case and granting a petition for rehearing, the Second Circuit issued a new opinion reaffirming “protection” under Section 230 of the CDA for video-sharing site Vimeo, Inc. (“Vimeo”) (Domen v. Vimeo, Inc., No. 20-616 (2d Cir. July 21, 2021) (amended opinion)).

It’s not completely clear why the Second Circuit decided to grant a rehearing and amend its original opinion only to reach essentially the same holding. It is possible that, given the attention surrounding the CDA, the court thought it best to narrow the language of its original holding to insulate its ruling from possible Supreme Court review (recall that Justice Thomas, following the denial of certiorari in a prior CDA case, issued a statement that “in an appropriate case,” the Court should consider whether the text of the CDA “aligns with the current state of immunity enjoyed by Internet platforms”). The Second Circuit’s second decision arguably watered down some of the stronger statements in its earlier opinion enunciating broad CDA immunity (e.g., even swapping out the word “immunity” for “protection” when discussing the CDA). The court even mused in dicta near the end of the opinion about the types of claims that might fall outside of CDA protection, as if to intimate that CDA Section 230 immunity is broad, but not as broad as its detractors suggest.

Yet, despite the narrowing of its original opinion, the court reached the same result under the same reasoning. As in the original (now vacated) opinion from March 2021, the Second Circuit’s amended decision relied on Section 230(c)(2), the Good Samaritan provision, which allows online providers to self-regulate by moderating third-party content in good faith without fear of liability. Unlike the original opinion, in the second go-round the appeals court also knocked out the plaintiff’s claims on the merits, finding that the allegations of discrimination, which were based on the presence of similar videos uploaded by other users that remained on the site, were “vanishingly thin” (thereby further reducing the chance of Supreme Court review).

The case involved a user challenging Vimeo’s decision to terminate his “Church United” organization account for posting objectionable videos that violated Vimeo’s terms. Vimeo’s platform terms prohibit, among other things, content that “[c]ontains hateful, defamatory, or discriminatory content or incites hatred against any individual or group.” The terms also reference Vimeo’s Guidelines, which state that moderators will generally remove, among other things, videos that promote sexual orientation change efforts (“SOCE”). Vimeo flagged the videos at issue as promoting SOCE and advised the plaintiff to take them down within 24 hours, warning that it might otherwise remove the videos or terminate his account. When the plaintiff did not remove the videos, he received an email stating that his Vimeo account had been terminated. The plaintiff challenged the termination and advanced various state discrimination claims against Vimeo.

In dismissing the complaint, the lower court had found that Vimeo was immune from plaintiff’s claims under two aspects of CDA immunity: the most commonly pleaded provision, § 230(c)(1), which provides immunity for “online publishers” of third-party content, and § 230(c)(2), the “Good Samaritan” screening provision, which immunizes providers for good faith actions to police objectionable content. (Domen v. Vimeo, Inc., 433 F.Supp.3d 592 (S.D.N.Y. 2020)). On appeal, the Second Circuit panel, in its original decision from March 2021, decided the case solely on “Good Samaritan” blocking grounds and ruled that Vimeo was free to restrict access to material that it, in good faith, finds objectionable, even if its moderation is imperfect. The court also summarily rejected plaintiff’s contentions that Vimeo terminated his account in “bad faith.”

The Second Circuit’s second opinion in the case treads over the same ground.

The court found this to be an easy case of an online provider moderating and taking down content that expressly violated its content guidelines, all with the protection of the CDA:

“Vimeo’s deletion of Appellants’ account was not anti-competitive or self-serving behavior done in the name of content regulation. Instead, it was a straightforward consequence of Vimeo’s content policies, which Vimeo communicated to Church United prior to deleting its account. Indeed, the policy was communicated to Church United before it even joined the platform.”

The court also delved further into plaintiff’s argument that, because other similarly objectionable videos allegedly remained available on the site, Vimeo’s actions were not taken in good faith. The court noted the difficulty of managing the breadth of content hosted on a large platform. The appeals court portrayed the CDA as protecting providers’ good faith blocking under Section 230(c)(2) even if not every bit of objectionable content is flagged and taken down, reiterating Congress’s purpose that the CDA remove disincentives for the development and use of blocking technologies.

“[T]he mere fact that Appellants’ account was deleted while other videos and accounts discussing sexual orientation remained available does not imply bad faith. One purpose of Section 230 is to provide interactive computer services with protection from suit for removing ‘some—but not all—offensive material from their websites,’ as Vimeo has done here. Given the massive amount of user-generated content available on interactive platforms, imperfect exercise of content-policing discretion does not, without more, suggest that enforcement of content policies was not done in good faith.” [citations omitted].

In the end, the affirmance under the Good Samaritan provision is likely to discourage protracted litigation in this case and to further empower other online platforms to restrict access to harassing or objectionable content that violates site terms.

Practical Implications and Lessons Learned

A flood of CDA reform bills is piling up in Congress, including a new bill introduced last week that would carve out an exception to CDA protection for misinformation during a public health emergency. Even with a new Administration, there remains a bipartisan appetite to amend the CDA (even if the justifications for such reform differ along party lines). Many CDA reform bills contain provisions that aim to increase transparency and hold providers to their stated terms and content guidelines, on pain of losing CDA immunity for filtering decisions. The Vimeo case is a great example of a provider that posted site terms and content policies governing user content, followed those procedures, gave notice of the possible consequences of non-compliance and then took an editorial action that was immunized under the CDA. As the court summed up:

“Section 230(c)(2) protects from liability providers and users of interactive computer service who voluntarily make good faith efforts to restrict access to material they consider to be objectionable . . . . Here, Vimeo did just that: it removed Appellants’ account for expressing pro-SOCE views which it in good faith considers objectionable. […] [Plaintiff] ignored Vimeo’s notice of their violation, and as a result, Vimeo deleted their account. By suing Vimeo, Appellants run headfirst into Section 230, which ‘allows computer service providers to establish standards of decency without risking liability for doing so.’”  [citations omitted].

Of course, not every moderation decision will involve content that is expressly prohibited by a site’s content policy (and some moderation may be time-sensitive, necessitating immediate action). Still, sites should take a moment to ensure they have broadly worded terms designed to give users notice of what is and is not allowed and to give the service ample latitude to filter out a wide variety of offensive material. Here, Vimeo’s content guidelines expressly listed the content at issue as prohibited on the site. Obviously, there is no way to list every type of objectionable content, but outlining in some detail the types of content that are harmful or harassing and against site terms may help in future litigation by giving courts an easier path to applying CDA protection and terminating a case as early as possible.

In keeping with the appeals court’s narrowing of its original broad holding, one of the new additions to the amended opinion was dicta near the end in which the court suggests certain limitations on the CDA defense:

“Our decision should not be read to confer immunity on providers acting in circumstances far afield from the facts of this case. Courts have rejected Section 230 defenses against claims for false advertising, deceptive trade practices, and tortious interference. Judges, commentators, and the executive branch alike have expressed concern about Section 230’s potential to protect companies engaging in anti-competitive conduct. Certain claims sounding in contract or tort may be beyond the reach of Section 230(c)(2)’s protection from suit. Our decision applies to the limited circumstances of this case and analogous claims.”

The court’s dicta about claims that potentially fall outside the CDA are quite general (and not exhaustive), and one could find decisions in which online providers eventually prevailed on CDA grounds against some versions of these claims. Still, online providers should review this laundry list for clues as to how litigants might seek to bypass CDA immunity in future cases and avoid an early dismissal of their claims.

Ultimately, however, Vimeo is an important decision, as it further cements a developing body of Second Circuit case law interpreting broad immunity under the CDA. The holding was based on Good Samaritan § 230(c)(2) immunity and thus is likely to be widely cited in future cases, given the relative dearth of precedent in the area. Meanwhile, on the legislative front, we will be watching closely the developments surrounding the push for CDA reform.

Jeffrey Neuburger is a partner, co-head of the Technology, Media & Telecommunications Group, a member of the Privacy & Cybersecurity Group and editor of the firm’s New Media and Technology Law blog.

Jeff’s practice focuses on technology, media and advertising-related business transactions and counseling, including the utilization of emerging technology and distribution methods in business. For example, Jeff represents clients in online strategies associated with advertising, products, services and content commercialized on the Internet through broadband channels, mobile platforms, broadcast and cable television distribution and print publishing. He also represents many organizations in large infrastructure-related projects, such as outsourcing, technology acquisitions, cloud computing initiatives and related services agreements.

Serving as a collaborative business partner through our clients’ biggest challenges, Jeff is part of the Firm’s cross-disciplinary, cross-jurisdictional Coronavirus Response Team helping to shape the guidance and next steps for clients impacted by the pandemic.