
President Nixon Never Actually Gave This Apollo 11 Disaster Speech. MIT Brought It To Life To Illustrate Power Of Deepfakes

The heart of MIT's "In Event of Moon Disaster" film installation uses a deepfake video to depict an alternate version of the 1969 Apollo 11 moon landing. (Courtesy Francesca Panetta)

Imagine a past in which the crew of Apollo 11 landed on the moon in 1969 — but then became stranded there, leading then-President Nixon to give a speech memorializing the astronauts.

A new MIT film installation uses that exact premise to shed light on so-called deepfake videos and how they are used to spread misinformation. Deepfakes use artificial intelligence technologies to create or alter video so that it convincingly depicts something that never happened.

MIT's Center for Advanced Virtuality created a video of Nixon giving a speech that was actually written for him — but that he never ended up delivering. The video is the centerpiece of "In Event of Moon Disaster," opening Friday at the International Documentary Film Festival Amsterdam (IDFA). The installation was supported by the Mozilla Foundation and the MIT Open Documentary Lab.

Film co-director Francesca Panetta says doctored videos like these have been a concern in current affairs and politics — but this effort points to another danger.

"Deepfakes can be used for many of the things we already know," Panetta says, "but also to create kind of alternative histories or have the potential to kind of rewrite history as well."

So how did MIT create a 50-year-old video of something that never happened?

Panetta says she and co-director Halsey Burgund selected actor Lewis D. Wheeler to record three hours' worth of Nixon's speeches. The goal was not to impersonate Nixon, but to capture characteristics like his intonation and cadence, she says. Once those recordings were done, they were given to Respeecher, a company that specializes in synthetic voice work. Respeecher then used clips of those recordings to create a synthetic version of Nixon's voice.

At that point, anything Wheeler recorded would come out in the voice of the 37th president of the United States.

Part two of the project consisted of working with video dialogue replacement firm Canny AI to create the actual deepfake video. The company took a Nixon speech — in this case, it was part of Nixon's resignation speech — and then put footage of Wheeler delivering the moon speech on top of it.

"They can take just the elements of the actual lip movement and chin movement and any other parts of the face that are sort of associated with talking," Burgund says, "and they overlay that onto the source video."

The end result is a rather unnerving, yet realistic video.

"I had one person say, 'Oh, so you got an impersonator to impersonate Nixon.' They didn't think it was a synthetic voice," Panetta says. "I had someone else say, 'Oh, so Nixon actually did record this? They filmed him saying this as a contingency speech then in case it happened?' And I was like, 'Wow! OK — you thought they filmed this before. I guess it worked.'"

At the IDFA, the video is displayed in a mock 1960s living room. Viewers of the project watch a video presentation of Apollo 11's journey, displayed on a vintage television. The presentation then switches to the moment Apollo 11 should have been safely touching down — but the flight goes wrong, stranding the astronauts. President Nixon finally appears on screen to tell the American people about the disaster, delivering the speech, courtesy of MIT.

Panetta says the selection of the moon landing as the basis for this project was a conscious one, stemming from its significance in history.

"This is one of the most historic events, and if we are trying to make a piece that is about interrogating whether deepfakes have the potential to rewrite history, then the moon landing seemed like a really good one to pick," she says.

Burgund says he hopes viewers of "In Event of Moon Disaster" come away with an appreciation for the power that deepfake videos can hold, and an awareness of how they can be used to form false narratives.

"We are trying to communicate that deepfakes are kind of a continuation, or an extension, if you will, of a continuum of misinformation that we all should be aware of and should have our ears tuned to, if we can," Burgund says.

The MIT team is planning on turning the installation into an interactive website next year.
