Deepfakes that hurt people are already illegal, so let's not rush out ill-considered new legislation

Deepfakes (videos with incredibly realistic face-swapping, created with machine-learning techniques) are creepy as hell, except when they're not: then they're a powerfully expressive form of creativity with implications for both storytelling and political speech.

But when deepfakes are used to create counterfeit pornography that nonconsensually conscripts people into sex acts they never participated in, they can be incredibly harmful: used for blackmail, deployed as hoaxes to discredit, or wielded to harass and cause distress.

The sensational, visual nature of deepfakes, combined with the hot-button subject of sex and porn and the hype around machine learning, has prompted many people to call for new laws specifically targeting deepfaking. The thing is, we already have a large toolbox of legal remedies for people who are victimized by deepfaking: laws against extortion and harassment, false-light invasion of privacy, defamation, intentional infliction of emotional distress, the right of publicity and, of course, copyright infringement.

These are laws with well-developed jurisprudence, which have been found to pass constitutional muster and are well understood by lawyers. The real problems of deepfakes deserve real solutions; thankfully, that's what we already have.

The tort of Intentional Infliction of Emotional Distress (IIED) will also be available in many situations. A plaintiff can win an IIED lawsuit by proving that a defendant (again, for example, a deepfake creator and uploader) intended to cause the plaintiff severe emotional distress through extreme and outrageous conduct, and that the plaintiff actually suffered severe emotional distress as a result. The Supreme Court has found that where the extreme and outrageous conduct is the publication of a false statement about either a matter of public interest or a public figure, the plaintiff must also prove that the defendant intended the audience to believe the statement was true, an analog to defamation law's "actual malice" requirement. The Court has further extended the actual-malice requirement to all statements pertaining to matters of public interest.

And to the extent deepfakes are sold, or the creator receives some other benefit from them, they also raise the possibility of right-of-publicity claims by those whose images are used without their consent.

Lastly, anyone whose copyrighted material (either the facial image or the source material into which the facial image is embedded) is used in a deepfake may have a claim for copyright infringement, subject of course to fair use and other defenses.

We Don't Need New Laws for Faked Videos, We Already Have Them
[David Greene/EFF]