Artificial intelligence has enabled the damaging practice known as deepfakes—extremely realistic but fabricated images and videos. But that same AI technology also has the power to separate fact from fiction.
Naked Security explains:
Who are the victims of deepfakes?
Is it the women who’ve been blackmailed with nonconsensual and completely fabricated revenge porn videos, their faces stitched onto porn stars’ bodies via artificial intelligence (AI)?
Is it actor Nicolas Cage? For whatever reason, deepfake creators love to splice his likeness into movies.
It’s broader than either, the US Department of Defense says. Rather, it’s all of us who are exposed to fake news and run the risk of getting riled up by garbage.
Researchers in the Media Forensics (MediFor) program run by the US Defense Advanced Research Projects Agency (DARPA) think that beyond blackmail and fun, fake images could be used by the country’s adversaries in propaganda or misinformation campaigns. Think fake news, but deeper and more convincing: video footage that is extremely hard to identify as fake.