By Justin Sherman. October 12, 2018.
Surprise: fake video is here to stay, and its adverse impacts on society are only getting worse. Last week, that reality was on display more starkly than ever, when White House press secretary Sarah Sanders used a doctored video from the right-wing conspiracy site InfoWars to justify banning CNN reporter Jim Acosta from the White House. As Paul Scharre, Director of the Technology and National Security Program at the Center for a New American Security (CNAS), tweeted, “It was only a matter of time before fake videos were used to influence American politics, but I didn’t think the first major outing would be from the official account of the White House press secretary.” Well, America, think again.
This event comes amid a growing volume of work on so-called “deepfakes”—that is, AI-generated fake photos, video, and audio. (While the video cited by Sanders was not generated using artificially intelligent software, the video and its subsequent virality did provide a sneak peek of what is to come.) The Guardian released a story just this morning on deepfake technology, zeroing in on a fake video of Donald Trump that made the rounds in Belgium this past May. Also today, The New Yorker ran a featured article on how “advances in digital imagery could deepen the fake-news crisis—or help get us out of it.”
Yesterday, my latest piece was published with the think tank Technology for Global Security on countering the proliferation of AI-generated fake video, which will only exacerbate existing problems with fake news and online manipulation as the technology becomes more accessible, easier to use, and capable of producing more convincing content. Even a handful of links from my research make clear that fake video is drawing more attention. Within the Duke Program in American Grand Strategy alone, the incident at the White House comes on the heels of recent events with Peter W. Singer on his new book “LikeWar: The Weaponization of Social Media” and with Monika Bickert and Tom Bossert on how social media companies and governments fight contemporary digital threats. Tonight, Jeffrey Ritter from the University of Oxford will speak to our Cyber Club about ‘digital trust’ and how we can conceptually approach building trust in the 1s and 0s that run our modern world.
The overlap in press coverage, scholarship, and events on fake video and related concerns (e.g., the role of corporations in 21st-century national security) might look like coincidence. It is not. The world is coming to grips—at varying speeds—with the problems presented by current and future technologies for creating fake video, which means that we as citizens must also become involved.
Educational institutions, from primary schools to colleges and universities, must educate their students about how information spreads online, how people consume it, and how malicious actors can fraudulently manipulate (or entirely fabricate) videos. News organizations must train their employees on this threat and how it implicates their responsibility to keep the public informed. Governments must take strong, deterrent actions to dissuade any number of actors from using this technology for malicious purposes. And so on.
Fake video is here to stay, but for anyone who has been reading the news of late, that is not even an announcement, let alone a revelation. We need to get past that point—accepting that fake video is here and growing more problematic, not least because of the entities who might distribute it—and focus instead on building individual, technical, and societal resistance to this threat.
Justin Sherman is a junior double-majoring in computer science and political science and the Co-Founder and President of the Duke Cyber Team. He is a Fellow at Interact; the Co-Founder and Vice President of Ethical Tech; and a Cyber Policy Researcher at the Laboratory for Analytic Sciences. He has written extensively on cyber policy and technology ethics, including for Journal of Cyber Policy, Defense One, The Strategy Bridge, and the Council on Foreign Relations.