An urgent proposal to make AI-generated fake people illegal

Philosopher Daniel C. Dennett argued in The Atlantic that fake AI-generated people pose a threat to economies, freedom, and civilization so grave that their use should be made illegal immediately. Link here (paywall).

Today, for the first time in history, thanks to artificial intelligence, it is possible for anybody to make counterfeit people who can pass for real in many of the new digital environments we have created. These counterfeit people are the most dangerous artifacts in human history, capable of destroying not just economies but human freedom itself. Before it's too late (it may well be too late already) we must outlaw both the creation of counterfeit people and the "passing along" of counterfeit people. The penalties for either offense should be extremely severe, given that civilization itself is at risk.

It is easy to imagine a near future in which absolutely any person we see on a screen could be fabricated or counterfeited in ways designed to manipulate us for criminal or nefarious political purposes.

Dennett proposes that we borrow the technical and legal structures already in place for counterfeit money, such as the EURion constellation, a pattern that protects the world's currencies by helping imaging software detect banknotes in a digital image. Any counterfeit people must be clearly labelled as such with watermarks, and the criminal and civil penalties for failing to do so, or for disabling the watermarks, should be severe.
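To make the labelling idea concrete, here is a minimal, hypothetical sketch in Python. It tags generated text with an invisible zero-width-character marker and checks for its presence. The marker scheme and function names are illustrative inventions, not any real standard (the EURion constellation itself is a visual pattern printed on banknotes, not a text watermark):

```python
# Hypothetical illustration of watermark labelling for AI-generated text.
# The marker scheme below is invented for this sketch, not a real standard.

# A sequence of zero-width characters (invisible when rendered) used as a tag.
ZW_MARK = "\u200b\u200c\u200b"

def label_as_counterfeit(text: str) -> str:
    """Embed the invisible marker at the start of the text."""
    return ZW_MARK + text

def is_labelled_counterfeit(text: str) -> bool:
    """Detect whether the invisible marker is present."""
    return ZW_MARK in text

sample = label_as_counterfeit("Hello, I'm a perfectly ordinary person.")
print(is_labelled_counterfeit(sample))          # True
print(is_labelled_counterfeit("Hello there."))  # False
```

Of course, such a marker is trivially stripped by anyone who knows it is there, which is precisely why Dennett pairs the technical measure with severe legal penalties for removing or disabling watermarks.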

[I]t would be reassuring to know that major executives, as well as their technicians, were in jeopardy of spending the rest of their life in prison in addition to paying billions in restitution for any violations or any harms done. And strict liability laws, removing the need to prove either negligence or evil intent, would keep them on their toes. The economic rewards of AI are great, and the price of sharing in them should be taking on the risk of both condemnation and bankruptcy for failing to meet ethical obligations for its use.