The FSC Guide to the EARN IT Act

What Does the EARN IT Act Do?

The EARN IT Act allows federal and state governments to prosecute, and victims to sue, a site or service if CSAM (child sexual abuse material) is uploaded to the platform. While the prevention of CSAM is a worthy goal, under the EARN IT Act, a platform may become legally liable for any upload of illegal content, no matter how aggressively it works to block it.

Someone posts CSAM on Twitter? Twitter can now be prosecuted in any state, even if they remove it immediately. Someone shares CSAM in a private chat on Facebook? Facebook identifies and removes it, but they can now be sued. Someone sends a link to CSAM using WhatsApp? WhatsApp can now be sued.

What Does This Mean for Adult?

If the EARN IT Act passes, platforms featuring user-generated content could be prosecuted or sued for any CSAM uploaded to their sites. Most non-adult platforms would likely remove adult content and/or ban adult creators and businesses.

Social Media Would Likely Block All Sex Content

Reddit and Twitter have no interest in reviewing, verifying and storing model IDs for every single piece of content. They’re not primarily adult sites and don’t want to be. It’s far easier to block any adult content and ban those likely to upload it. The risk of litigation is too great.

This would affect creators and adult businesses the most, but we know that aggressive filtering and banning regimes affect sexual communities, including BDSM/kink, LGBTQ+ and sex educators, even when there’s no explicit content.

An End to End-to-End Encryption 

Because the EARN IT Act makes no distinction between content shared publicly and content shared privately, platforms would likely need to monitor private communications for possible violations, or risk expensive litigation if those channels were misused. That means more aggressive patrolling of drives, chats, and links, and zero-tolerance bans on adult content.

Products like WhatsApp or ProtonMail, which provide encrypted messaging, could be required to build backdoors that allow law enforcement to monitor exchanges and content.

Adult Sites Delisted from Google

The EARN IT Act makes no exception for search engines, and, as with social media sites, it's easier for them to block (or, in Google's case, delist) adult sites than to verify each linked image or video themselves. No Google image search, no search results for adult content.

Loss of Webhosts

Proponents of the EARN IT Act have specifically called out Amazon, whose businesses include Amazon Web Services (AWS), for not patrolling the content on its platform. As with Google and social media sites, this could lead AWS and other hosts to stop working with adult sites altogether, especially those built around user-generated content.

What Could The Adult Industry Look Like Post-EARN IT?

Imagine a world where you couldn't reach an audience through social media. A world where fans can't find you with a Google search, and where you can't sell content through a link. The EARN IT Act takes away the tools that creators and small businesses have used to build their fan bases, and forces them to work with large networks and platforms that have the capability to do that type of verification.

Consolidation

Though adult platforms are far more aggressive about monitoring and blocking illegal content than mainstream platforms, adult companies will always be more heavily targeted and patrolled. With increased scrutiny comes increased restrictions on who can upload and what can be uploaded.

  • Margins for most creator platforms are already extremely tight. We expect that the increased demands for surveillance and monitoring would lead smaller companies to drop out of the market entirely. 
  • We would expect more stringent verification processes for the remaining companies, which would lead to more aggressive bans for legal content that might appear controversial. 
  • We expect these costs would be borne by creators, either through reduced payouts, or through delays in uploads.

Lawsuits

Following the passage of FOSTA-SESTA, we saw numerous anti-porn groups bring lawsuits against adult platforms for alleged sex trafficking violations. While there appears to be little merit to the arguments, these groups understand that such lawsuits are expensive and damaging, and can provide the means to hobble or destroy a site. We would expect a dramatic increase in such suits should the EARN IT Act pass.

What About Section 230?

Section 230 protects platforms from prosecution and lawsuits so long as they do not knowingly distribute illegal content like CSAM. The authors of Section 230 understood that the amount of information shared on the internet is vast, and no platform could be expected to track or be responsible for it all. 

The EARN IT Act creates an exception to Section 230 when it comes to CSAM. If CSAM is uploaded to the platform, the platform now becomes potentially liable.

So like FOSTA-SESTA?

Yes. FOSTA-SESTA created an exemption in Section 230 for sex trafficking. Because many social media platforms did not have the capacity (or interest) to tell the difference between sex work and sex trafficking, they simply removed all discussions of sex work. After FOSTA-SESTA was signed, Craigslist, Reddit and others blocked forums about sex work. We still see the effects today: LinkTree, for example, recently banned sex workers from its service.

What If I Report the Content?

Traditionally, there was a “safe harbor” exemption covering CSAM that you reported, online or off. However, that protection does not apply the moment you reproduce or distribute CSAM. And by their nature, digital uploads automatically reproduce content, even if that just means converting a file into a new format after upload. Even if the content is then blocked or moderated, that reproduction may invalidate the safe harbor exemption.

Do You Have to Know That It’s CSAM?

In theory, the standard is that you must “knowingly” reproduce or distribute CSAM. However, there’s absolutely no guidance as to when “knowing” begins. In previous CSAM prosecutions, distributors have been charged with reproducing or distributing content that they did not know was CSAM.

Our concern is that platforms might knowingly distribute content that they believe is legal, only later to have it identified as CSAM. Or that, if a moderator sends something they believe could be CSAM to a support team for further review, that act might constitute “knowing” distribution. At the very least, this vagueness opens the door to thousands of potential lawsuits and overzealous prosecutors.