On May 19, 2025, President Trump signed into law the Take It Down Act. The new law imposes strict takedown obligations and creates new civil and criminal liabilities for individuals and platforms that distribute nonconsensual intimate images (NCII).

The Act makes it a federal crime for individual posters to knowingly share or threaten to share NCII, including AI-generated images that depict real people, and clarifies that consent to create an image does not mean consent to share it.

“Covered Platforms” must now provide a notice-and-takedown notification process allowing affected persons to request the removal of intimate visual depictions of an identifiable individual posted without consent. Covered Platforms include any website, online service, application, or mobile app that (1) serves the public and (2) either (a) provides a forum for user-generated content (e.g., videos, images, messages, games, or audio), or (b) in the ordinary course of business, regularly publishes, curates, hosts, or makes available nonconsensual intimate visual depictions. Covered Platforms do not include the following entities: (1) broadband internet access providers (ISPs); (2) email services; or (3) online services or sites with primarily preselected content, where the content is not user-generated but curated by the provider, and interactive features are merely incidental or directly related to the preselected content.

Under the statute, the FTC will enforce a Covered Platform’s obligations under the Act; a failure to meet the Act’s takedown obligations subjects a platform to civil penalties, investigation, and injunctive relief. Note that the Act also provides “safe harbor” protection for good-faith removal efforts where a Covered Platform takes prompt action, prevents re-uploads, and complies with the Act’s recordkeeping and reporting requirements.

Compliance Recommendation – By May 19, 2026, a Covered Platform should provide on its platform a clear and conspicuous notice of its removal process, including an online submission process with identity verification. The Covered Platform must then put in place a procedure so that, when it receives a valid request to remove content, it will: (1) remove the reported content within 48 hours; (2) make reasonable efforts to locate and remove identical copies of the same image or video; and (3) detect and prevent attempts to re-upload the offending content. Finally, the Covered Platform should implement an internal system to log and document all takedown actions, as well as other efforts to demonstrate good-faith compliance and to meet the Act’s safe harbor.
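For platforms translating these obligations into engineering requirements, the following is a minimal, hypothetical sketch (in Python, standard library only) of how an intake-and-audit workflow might track the 48-hour removal window, hash removed content to catch identical re-uploads, and record each action for safe-harbor documentation. The class and field names are illustrative assumptions on our part; nothing in the Act or FTC guidance prescribes this design.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

REMOVAL_DEADLINE = timedelta(hours=48)  # removal window after a valid request


@dataclass
class TakedownRequest:
    """One valid removal request from an identifiable individual (hypothetical model)."""
    request_id: str
    content_url: str
    received_at: datetime
    requester_verified: bool            # identity verification completed at intake
    removed_at: datetime | None = None

    @property
    def deadline(self) -> datetime:
        return self.received_at + REMOVAL_DEADLINE

    def is_overdue(self, now: datetime) -> bool:
        return self.removed_at is None and now > self.deadline


class TakedownLog:
    """Tracks requests, hashes of removed content, and an audit trail (illustrative only)."""

    def __init__(self) -> None:
        self.requests: dict[str, TakedownRequest] = {}
        self.blocked_hashes: set[str] = set()   # hashes of removed content, to catch re-uploads
        self.audit_trail: list[str] = []        # recordkeeping to support safe-harbor documentation

    def receive(self, request: TakedownRequest) -> None:
        self.requests[request.request_id] = request
        self._log(f"received request {request.request_id} for {request.content_url}")

    def remove_content(self, request_id: str, content_bytes: bytes, now: datetime) -> None:
        req = self.requests[request_id]
        req.removed_at = now
        digest = hashlib.sha256(content_bytes).hexdigest()
        self.blocked_hashes.add(digest)          # identical copies and re-uploads match this hash
        self._log(f"removed content for request {request_id} at {now.isoformat()}")

    def is_blocked_upload(self, content_bytes: bytes) -> bool:
        """Check a new upload against previously removed content (exact-match only)."""
        return hashlib.sha256(content_bytes).hexdigest() in self.blocked_hashes

    def _log(self, message: str) -> None:
        self.audit_trail.append(f"{datetime.now(timezone.utc).isoformat()} {message}")
```

Note that exact hashing only catches byte-identical files; in practice, platforms typically layer on perceptual hashing or dedicated matching services to find altered copies, and the audit trail here stands in for whatever recordkeeping system the platform already maintains.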

Craig A. Gilley

Craig Gilley provides a broad range of services for regulated communications entities, as well as information technology, education technology, investment, and private equity companies. Craig’s primary practice involves counseling cable operators, broadband providers, internet service providers, video programmers, satellite providers, and wireless/wireline telecommunications providers on a broad range of legal, regulatory, operational, and transactional issues. He also regularly provides transactional, operational, compliance, and strategic advice to information and educational technology firms. Craig also represents investment and private equity companies, providing transactional and compliance support to ensure that both their acquisitions and ongoing investments fully comply with regulatory requirements.