How will the Take it Down Act affect web developers?
The bill requires platforms to provide a means for victims to request the removal of non-consensual intimate imagery (NCII), and to remove it within 48 hours of receiving a valid request. What does that mean?
Non-consensual Intimate Imagery
As per the bill:
The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, that, when viewed as a whole by a reasonable person, is indistinguishable from an authentic visual depiction of the individual.
Platforms
Platforms, as defined by the bill, are any website, online service, online application, or mobile application that serves the public and primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files; or for which it is in the regular course of trade or business of the website, online service, online application, or mobile application to publish, curate, host, or make available content of nonconsensual intimate visual depictions.
Exclusions
There are some exclusions under this definition. Paraphrasing the bill: broadband providers, email, and any online service, application, or website that consists primarily of content that is not user generated but is preselected by the provider of such online service, application, or website, and for which any chat, comment, or interactive functionality is incidental to, directly related to, or dependent on the provision of that content.
What Does That Mean?
I’m not a legal scholar, and I’m not offering any legal advice, but by my interpretation it’s social media, messaging apps, forums, and any tools built around user communication. Why not email? I’m not sure if this exclusion is due to the technical limitation of email being a decentralized protocol—not a platform—or if politicians are just confused about how tech works.
Where I’m confused is in the definition itself.
[A]ny website, online service, online application, or mobile application that serves the public and primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files […]
What are messages? Let’s use the example of a forum. If I post a comment on a thread, is that a message? I don’t think reasonable people would say so; messages are typically targeted at an individual or a group. If I run a text-only forum, do I have to invest in the architecture to support this? What about links? Do I have any legal responsibility if people are using my site to share links to this content? What about private messaging, like texts or group chats? And what counts as publishing?
Provide a Means
This part of the bill is a bit clearer:
(a) In general.—
(1) NOTICE AND REMOVAL PROCESS.—
(A) ESTABLISHMENT.—Not later than 1 year after the date of enactment of this Act, a covered platform shall establish a process whereby an identifiable individual (or an authorized person acting on behalf of such individual) may—
(i) notify the covered platform of an intimate visual depiction published on the covered platform that—
(I) includes a depiction of the identifiable individual; and
(II) was published without the consent of the identifiable individual; and
(ii) submit a request for the covered platform to remove such intimate visual depiction.
(B) REQUIREMENTS.—A notification and request for removal of an intimate visual depiction submitted under the process established under subparagraph (A) shall include, in writing—
(i) a physical or electronic signature of the identifiable individual (or an authorized person acting on behalf of such individual);
(ii) an identification of, and information reasonably sufficient for the covered platform to locate, the intimate visual depiction of the identifiable individual;
(iii) a brief statement that the identifiable individual has a good faith belief that any intimate visual depiction identified under clause (ii) is not consensual, including any relevant information for the covered platform to determine the intimate visual depiction was published without the consent of the identifiable individual; and
(iv) information sufficient to enable the covered platform to contact the identifiable individual (or an authorized person acting on behalf of such individual).
(2) NOTICE OF PROCESS.—A covered platform shall provide on the platform a clear and conspicuous notice, which may be provided through a clear and conspicuous link to another web page or disclosure, of the notice and removal process established under paragraph (1)(A) that—
(A) is easy to read and in plain language; and
(B) provides information regarding the responsibilities of the covered platform under this section, including a description of how an individual can submit a notification and request for removal.
(3) REMOVAL OF NONCONSENSUAL INTIMATE VISUAL DEPICTIONS.—Upon receiving a valid removal request from an identifiable individual (or an authorized person acting on behalf of such individual) using the process described in paragraph (1)(A)(ii), a covered platform shall, as soon as possible, but not later than 48 hours after receiving such request—
(A) remove the intimate visual depiction; and
(B) make reasonable efforts to identify and remove any known identical copies of such depiction.
(4) LIMITATION ON LIABILITY.—A covered platform shall not be liable for any claim based on the covered platform’s good faith disabling of access to, or removal of, material claimed to be a nonconsensual intimate visual depiction based on facts or circumstances from which the unlawful publishing of an intimate visual depiction is apparent, regardless of whether the intimate visual depiction is ultimately determined to be unlawful or not.
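Clause (3)(B) is the one with real architectural consequences: once you take down the reported upload, you’re expected to make reasonable efforts to find and remove any “known identical copies.” “Identical” reads to me as byte-for-byte duplicates, which a content hash computed at upload time can catch. Here’s a minimal sketch in TypeScript; the store and function names are mine, since the bill prescribes no mechanism:

```typescript
import { createHash } from "node:crypto";

// Hypothetical index: maps a SHA-256 digest to every upload with that exact content.
const uploadsByDigest = new Map<string, Set<string>>();

function digestOf(bytes: Buffer): string {
  return createHash("sha256").update(bytes).digest("hex");
}

// Run on every upload, so identical copies are "known" before a takedown ever arrives.
function indexUpload(mediaId: string, bytes: Buffer): void {
  const digest = digestOf(bytes);
  const ids = uploadsByDigest.get(digest) ?? new Set<string>();
  ids.add(mediaId);
  uploadsByDigest.set(digest, ids);
}

// Given the bytes of a reported upload, return it and every byte-identical copy.
function identicalCopies(bytes: Buffer): string[] {
  return [...(uploadsByDigest.get(digestOf(bytes)) ?? [])];
}
```

Note that an exact hash only catches files that match byte for byte. A re-encoded, resized, or cropped copy gets a different digest, and whether chasing those with perceptual hashing counts as a “reasonable effort” is exactly the kind of question the bill leaves open.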
No later than 1 year
Since this isn’t law yet, we developers will have a year to comply if/when Trump signs this into law. You can track when that happens on the bill’s webpage.
Requirements for Platforms
This is the juicy part, which gives me hope that this bill won’t be abused. Platforms must provide a disclaimer regarding their responsibilities under this section and information about how a user can submit a notification and request for removal.
There is no instruction on how to provide this notice, beyond “clear and conspicuous” and plain language.
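For what it’s worth, the notice-of-process paragraph quoted above explicitly allows the notice to live behind a link, so a persistent, plainly worded link to a policy page would seem to qualify. A minimal sketch, assuming a React front end; the component name, route, and wording are mine:

```tsx
import React from "react";

// A "clear and conspicuous" link to a plain-language notice page, per the
// notice-of-process requirement. Rendered in a site-wide footer so it
// appears on every page; the route and copy here are hypothetical.
export function RemovalNoticeLink() {
  return (
    <footer>
      <a href="/removal-requests">
        Report a non-consensual intimate image and request its removal
      </a>
    </footer>
  );
}
```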
A valid request must contain the following (see the sketch after this list):
- A physical or electronic signature of the affected person or the authorized person acting on their behalf
- An identification of, and information reasonably sufficient to locate, the media
- A statement that the affected person has a good faith belief that the media is nonconsensual, including any relevant information for the platform to make that determination
- Contact information for the affected person or the authorized person acting on their behalf
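Mapped onto a request payload, those four items are about all the structure the bill gives you. A minimal TypeScript sketch; the field names and the presence check are mine, since the bill specifies what a request contains, not what format it takes:

```typescript
// The four statutory fields of a removal request, per the bill text above.
interface RemovalRequest {
  signature: string;           // (i) physical or electronic signature
  mediaIdentification: string; // (ii) info sufficient to locate the depiction, e.g. a URL
  goodFaithStatement: string;  // (iii) good faith statement that the depiction is nonconsensual
  contactInfo: string;         // (iv) how to reach the individual or their representative
}

// The bill says what a request must contain, not how to judge it, so
// "valid" here is nothing more than "every required field is present".
function isFaciallyValid(req: RemovalRequest): boolean {
  return [
    req.signature,
    req.mediaIdentification,
    req.goodFaithStatement,
    req.contactInfo,
  ].every((field) => field.trim().length > 0);
}
```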
It looks like determining if a request is valid is up to the platform. Say I run a forum and I don’t really want to deal with this.
I can make it so you have to send me a request via physical mail with photos of yourself, or the victim, attached, proof the image wasn’t consensual, and identifying information about the image so I can find it. At which point I can take my time to validate whether the request is in good faith. What if I run a site with adult content and need to make sure the image isn’t machine generated?

Sure, you can always appeal my decision to the FTC, but they’re a little busy. And, as a platform, I have no obligation to snitch on anyone uploading this material. Hell, after 30 days, they’re basically a ghost if I clear my logs.