NO FAKES Act would limit replicating the voice or likeness of individuals

The NO FAKES Act [pdf] would prohibit the use of AI to replicate the image, voice, and visual likeness of individuals without permission, with exceptions for parodies, news, documentaries, and other fair uses. Emilia David at The Verge reports that it amounts to a federal right of publicity law.

Individuals, as well as entities like a deceased person's estate or a record label, could bring a civil action under the proposed rules. The bill also explicitly states that a disclaimer noting the digital replica was unauthorized won't be considered an effective defense.

The NO FAKES Act essentially federalizes likeness laws, which vary from state to state. (Some states have no ground rules around the right of publicity at all.) New York is one of the few states that explicitly mentions digital replicas, prohibiting the use of a deceased person's computer-generated replica in scripted works or live performances without prior authorization.

The text's emphasis on "sound recordings" seems to hint at where this one comes from.