New Law Targets Nonconsensual Explicit Content Amid Free Speech Concerns

By Kevin Lee

President Donald Trump delivered his first joint address to Congress of the term on March 4. In the speech, he articulated his strong support for the recently passed Take It Down Act, legislation that would make it a crime to publish nonconsensual explicit images, whether produced through traditional means or with AI. Trump has indicated that he will sign the bill into law as soon as it reaches his desk, pairing it with sweeping promises to safeguard Americans from the online exploitation of children.

The Take It Down Act would impose significant obligations on online platforms: they must respond to takedown requests within 48 hours or risk legal liability. Supporters frame the law as a balanced approach to addressing revenge porn and explicit deepfakes without interfering with protected free speech and art on the internet. With Senator Marsha Blackburn as a co-sponsor, the legislation gained bipartisan support in recognition of the devastating impact of nonconsensual intimate imagery (NCII).

Key Provisions of the Take It Down Act

The Take It Down Act requires online platforms to establish clear mechanisms for removing NCII, and to implement them no later than one year after the law’s enactment. By shifting the burden onto platforms to take action, the law marks a victory for victims of nonconsensual content, aiming to hold accountable the companies whose services host these images.

Special rules apply once a takedown request comes in: failing to act within the allotted time can lead to costly penalties, which adds urgency to the need for compliance. The law also seeks to stem the spread of harmful material by making revenge porn and explicit deepfakes illegal.

“There’s nobody who gets treated worse than I do online.” – Donald Trump

The law’s implications go well beyond its stated intentions. Industry experts are already worried about its effects on free speech and content moderation efforts across the internet.

Concerns Over Free Speech and Content Moderation

India McKinney, director of federal affairs at the Electronic Frontier Foundation, has sounded a key alarm. She worries that the Take It Down Act would lead to over-moderation that stifles legitimate expression. McKinney pointed out that “content moderation at scale is widely problematic and always ends up with important and necessary speech being censored.”

This fear is especially pertinent in a climate where debates over what counts as acceptable online content are already heated. McKinney warned of a chilling effect on decentralized platforms such as Mastodon, Bluesky, and Pixelfed, which would face immense pressure to preemptively censor content to shield themselves from liability under the new law.

“The default is going to be that they just take it down without doing any investigation to see if this actually is NCII or if it’s another type of protected speech.” – India McKinney

McKinney also highlighted concerns about potential overreach in monitoring, suggesting that platforms may begin scrutinizing encrypted messages in their efforts to comply with the law. That would further erode privacy rights and complicate an already tangled landscape of digital communication.

Industry Response and Future Implications

In light of the challenges posed by the Take It Down Act, technology companies are seeking solutions that balance compliance with user protection. Hive, a startup specializing in AI-generated content detection, offers tools that help platforms identify deepfakes and prevent CSAM. Platforms can integrate Hive’s API into their upload pipelines, allowing content to be screened in real time before it is published.

Kevin Guo, CEO and co-founder of Hive, commented on the potential benefits of such technologies: “It’ll help solve some pretty important problems and compel these platforms to adopt solutions more proactively.” Still, doubts persist about whether and how well these new measures will protect users without infringing upon their rights.

The Take It Down Act also ties into other legislative efforts in progress, most notably the Kids Online Safety Act. Both laws prioritize user safety online, but as they seek to address these challenges, they may inadvertently harm free speech and content distribution.

“I really want to be wrong about this, but I think there are going to be more requests to take down images depicting queer and trans people in relationships.” – India McKinney
