The Danger of Deepfakes (VIDEO)
Can you believe everything you see online?

Technality host Jacqueline Swan.
This article is part of Narcity Media's Technality series. Subscribe to Technality on YouTube for all things related to the future, tech and humanity.
Pop quiz: which face was created by AI?
Fake faces created by an AI face generator. This Person Does Not Exist
If you said all of them, you'd be right! These faces were all generated on This Person Does Not Exist, a site that uses AI to generate frighteningly accurate portraits of people who...don't exist.
If you thought some were real and some were fake, you're not alone. The site explains that it's almost impossible for the average person to spot a fake face, and there are no services available to detect them. Occasionally, the AI messes up, leaving artifacts like a strange pattern or an odd hair colour in the photo, but you have to be paying close attention to spot those.
Photos, of course, are the easier task for an AI. Video is a different story: some deepfakes are far more convincing than others.
For example, a fake video circulating online that appeared to show Ukrainian President Volodymyr Zelenskyy didn't seem to fool many people.
There were telltale signs that the video had been digitally created.
It's not clear who posted the clip of Zelenskyy, although Ukraine's government has issued warnings about the possibility of Russia spreading manipulated videos.
On the other end of the spectrum, there are incredibly realistic deepfakes that could pass for real people. South Korea, for example, has an AI reporter, AI Kim, a digital replica of reporter Kim Ju-Ha that can mimic her reporting style almost exactly.
But the mere existence of the technology used in deepfake videos may be dangerous.
The mere possibility that something could be a deepfake is often enough to make people doubt what they're seeing. A little skepticism is healthy, but too much of it leads to confusion over what to believe.
In one case, it was enough to help trigger an attempted coup.
A 2018 New Year's address from Gabonese President Ali Bongo Ondimba shows how even the suspicion of a deepfake can interfere with politics. Many believed the video was falsified, but there is no evidence that it was.
In 2018, Bongo was hospitalized outside of the country. Beyond what the government told people and the occasional photo, he all but disappeared from public view. Eventually, people grew suspicious, and speculation spread that he had died or was incapacitated.
In December 2018, the government released Bongo's annual New Year's address. Now, if you watched the video and felt something was off, you're not alone.
Bongo's critics weren't convinced it was really him. The following week, the military launched a coup attempt, the country's first since 1964, believing that all was not right with the president. The attempt failed, but it demonstrates the power of deepfakes, real or merely suspected.
The existence of deepfakes blurs the line between real and fake. Whether a video is genuine or not, the ability to digitally impersonate anyone is enough to cast doubt. This technology isn't going anywhere, and it's only going to get better. But are we going to get better at spotting the fakes?
If you're interested in learning more, check out Technality's recent video on deepfakes over on their YouTube channel.