Privacy

Nonconsensual Intimate Imagery on Your Platform? New Federal Law Says to “Take It Down.”

Published: Jun. 26, 2025

On May 19, 2025, President Trump signed the TAKE IT DOWN Act into law. The Act creates new obligations for online platforms to remove certain nonconsensual intimate visual depictions of both adults and minors within 48 hours of receiving a request. It builds on the momentum of state-level laws and private initiatives aimed at bolstering child safety and privacy online, such as the “Take It Down” tool administered by the National Center for Missing & Exploited Children (NCMEC), but goes significantly further by adding parallel protections for adults and codifying removal obligations under federal law.

The TAKE IT DOWN Act received broad bipartisan support, passing the Senate unanimously and the House of Representatives by a vote of 409-2. The law also received praise from civil society groups such as the National Organization for Women and NCMEC, which applauded it for closing “a dangerous gap by targeting the distribution of both real and digitally altered exploitative content involving children – content that may fall outside existing CSAM [child sexual abuse material] definitions.”

Real and Manipulated Intimate Imagery

The TAKE IT DOWN Act applies to a broad range of intimate photos and videos and contains criminal prohibitions for individuals and removal requirements for covered platforms (detailed below). The law applies to both (1) authentic photos and videos and (2) “digital forgeries”—photos and videos created or manipulated by technological means, such as deepfakes and photoshopped images.  

Covered Platforms 

The law will impose new obligations on a wide range of platforms that host user-generated content. It applies to any “covered platform,” which is a broadly defined category that includes any website, online service, online application, or mobile application that:

  1. serves the public; and
  2. primarily provides a forum for user-generated content (e.g., messages, videos, images, games, and audio files) or publishes, curates, hosts, or otherwise makes available nonconsensual intimate visual depictions during the regular course of its business. 

Exclusions are limited to broadband internet access providers, email providers, and platforms that do not consist primarily of user-generated content and for which chat and similar features are incidental to the platform’s main service. The law contains no exceptions for nonprofit organizations (see Enforcement below), small businesses, educational institutions, or end-to-end encrypted platforms.

Request, Verify, & Remove

At its core, the TAKE IT DOWN Act establishes a request-verification-removal regime, which takes effect for covered platforms on May 19, 2026.

  • Requests: A covered platform must establish a “clear and conspicuous” mechanism for individuals to request removal of nonconsensual intimate visual depictions; the mechanism must (a) be easy to read and written in plain language and (b) explain the platform’s responsibilities under the law.
  • Verification: Individuals making removal requests must provide platforms with certain information to verify their identity. A request must include the individual’s signature, an identification of the nonconsensual intimate visual depiction, a statement of the individual’s good-faith belief that the depiction is nonconsensual (along with other relevant information to aid the platform in its investigation), and the individual’s contact information.
  • Removal: A platform must remove the content, including any copies it can identify with reasonable efforts, as soon as possible and no later than 48 hours after receiving the request.

The law also limits platforms’ liability: a platform that implements and complies with this process is shielded from claims arising from its removal of apparent nonconsensual intimate visual depictions, even if the removed material is ultimately determined not to be unlawful.
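
For illustration only, the request elements above can be modeled as a simple record tracked against the 48-hour removal deadline. The minimal Python sketch below uses assumed names (RemovalRequest, has_required_elements, removal_deadline) that are not statutory terms; the Act does not prescribe any particular technical design, and this is not a compliance implementation.

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta, timezone

    # Illustrative sketch only; class and field names are assumptions, not statutory terms.
    REMOVAL_WINDOW = timedelta(hours=48)

    @dataclass
    class RemovalRequest:
        signature: str              # requester's physical or electronic signature
        depiction_identifier: str   # information sufficient to locate the depiction
        good_faith_statement: str   # statement of good-faith belief that the depiction is nonconsensual
        contact_info: str           # how the platform can reach the requester
        received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

        def has_required_elements(self) -> bool:
            """Check that each element the law requires of a request is present."""
            return all([self.signature, self.depiction_identifier,
                        self.good_faith_statement, self.contact_info])

        def removal_deadline(self) -> datetime:
            """Latest time to remove the depiction and identifiable copies (48 hours after receipt)."""
            return self.received_at + REMOVAL_WINDOW

In practice, a platform would pair a record like this with its existing content moderation tooling so that intake, verification, and removal are all tracked against the 48-hour deadline.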

Enforcement

As with other consumer protection, child safety, and privacy-focused laws, the Federal Trade Commission (FTC) will enforce the TAKE IT DOWN Act. The FTC will treat a platform’s “failure to reasonably comply with the [law’s] notice and takedown obligations” as an unfair or deceptive act or practice in violation of the FTC Act. Notably, the law expressly states that the FTC’s enforcement authority extends to nonprofit organizations. Early FTC enforcement actions are likely to provide more guidance as to what constitutes “reasonable compliance.” For example, the FTC might look closely at internal content moderation practices, whether a covered platform employs industry-standard safety technologies (e.g., PhotoDNA), and how a platform decides to verify, accept, or deny requests.
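
For illustration only, a re-upload check against fingerprints of previously removed content might look like the short sketch below. The function names are assumptions, and the exact-match SHA-256 fingerprint is a simplification; tools like PhotoDNA rely on perceptual hashes that tolerate resizing and other minor alterations, and their actual interfaces differ from this example.

    import hashlib

    # Illustrative sketch only; this is not PhotoDNA's API. An exact cryptographic
    # hash stands in for the perceptual hashing that real safety tools use.

    def image_fingerprint(image_bytes: bytes) -> str:
        """Fingerprint an uploaded image (exact-match stand-in for a perceptual hash)."""
        return hashlib.sha256(image_bytes).hexdigest()

    def matches_removed_content(image_bytes: bytes, removed_fingerprints: set) -> bool:
        """Check an upload against fingerprints of content removed under prior requests."""
        return image_fingerprint(image_bytes) in removed_fingerprints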