Some clips played for laughs, such as a video of “Mr. Rogers’ Neighborhood” host Fred Rogers writing a rap song with hip-hop artist Tupac Shakur. Others leaned into darker themes. One video showed police body-camera footage of Whitney Houston looking intoxicated. In other clips, Martin Luther King Jr. makes monkey noises during his “I Have a Dream” speech, basketball player Kobe Bryant flies aboard a helicopter mirroring the crash that killed him and his daughter in 2020, and John F. Kennedy jokes about the recent killing of right-wing influencer Charlie Kirk.
OpenAI said the text-to-video tool would depict real people only with their consent. But it exempted “historical figures” from these limits during its launch last week, allowing anyone to make fake videos resurrecting public figures, including activists, celebrities and political leaders, leaving some of their relatives horrified.
“It is deeply disrespectful and hurtful to see my father’s image used in such a cavalier and insensitive manner when he dedicated his life to truth,” Shabazz, whose father was assassinated in front of her in 1965 when she was 2, told The Washington Post. She questioned why the developers were not acting “with the same morality, conscience, and care … that they’d want for their own families.”
Sora’s videos have sparked agitation and disgust from many of the depicted celebrities’ loved ones, including actor Robin Williams’s daughter, Zelda Williams, who recently pleaded in an Instagram post for people to “stop sending me AI videos of dad.”
“To watch the legacies of real people be condensed down to … horrible, TikTok slop puppeteering them is maddening,” she said.
As AI’s rapid development gives everyday people the power to create realistic-feeling images and chatbots, it also challenges age-old notions about who controls a person’s memories, identity and legacy after they die. Most companies can’t use Robin Williams’s likeness for commercial gain without permission from his family. But on Sora, strangers have rendered Williams, who died in 2014, as they choose.
“Commercially, if you create meme-able content of famous people who are recognizable, that’s going to get more clicks,” said Henry Ajder, an AI expert who studies deepfakes and coined the term “synthetic resurrection” to describe creating digital copies of the dead. “With deceased individuals, this opens up such a huge question about ownership of likeness, and really fundamentally changes the social contract around what it means to be you online.”
As technology advances, the kind of synthetic resurrection Ajder describes is becoming more common. The prospect of digitally cloning the dead is already sparking uncomfortable questions about families’ inability to control how their loved ones are portrayed.
“The amount and the volume of this kind of synthetic resurrection content is just huge now,” he said. “And it’s not being done by creative agencies in partnership with the estate … or by a Hollywood studio as a tribute to a much-loved actor or actress, with consent from their family. It’s being done by s—posters, memesters, racists and all the rest.”
OpenAI said its policy was based on “strong free speech interests in depicting historical figures.” But after backlash, the company said Wednesday it would begin allowing the representatives of “recently deceased” public figures to request that their likeness be blocked from Sora videos.
“We believe that public figures and their families should ultimately have control over how their likeness is used,” an OpenAI spokeswoman said. She declined to define “recently deceased.”