Deepfake Technology: Seeing Is Believing, or Is It?

We live in a world where, in some instances, technology has replaced reality. This is certainly the case for an emerging technology called the "deepfake," which can fabricate images and videos so convincingly that you won't be able to tell what's real and what's not.


Deepfakes are hyper-realistic images, videos, and sometimes audio, powered by artificial intelligence, that mimic real people. They are often used to damage reputations and steal identities. If you've ever seen a viral video on social media of a famous politician or celebrity saying something highly unusual or antithetical to who they are, it may have been a deepfake, created primarily to grab the public's attention.

Deepfakes can be incredibly dangerous. Given their power to completely warp reality and pass as the truth, this isn't too surprising. U.S. Senator Marco Rubio once called them a threat to national security nearly equivalent to nuclear weapons. Part of the threat is the ease of making a very realistic fake video: basically anyone can learn to use the AI behind deepfakes, make one, and release it on the internet for all to see. An additional problem is that once a video is out there, it is quite difficult, if not impossible, to tell whether it is a deepfake at all. The technology is both incredibly advanced and easily accessible, meaning that you can swap anyone's face into any video you choose. This has terrifying implications, particularly for women.

Deepfakes are often weaponized against women. The majority of deepfakes on the internet are pornographic, with female celebrities' faces superimposed onto these videos. In many instances, deepfakes of this nature have been created as revenge against female journalists, such as those exposing corruption. Even though the videos are fake, they ruin reputations almost instantly. A deepfake does its damage the moment people believe it and spread it, regardless of whether it is ever proven real or fake. This means that victims can rarely find justice.

Despite this potential for abuse, deepfakes can also be used for good. Synthetic voices have been used to help patients with ALS (Amyotrophic Lateral Sclerosis), who often lose some or all of their ability to speak because of the disease. With this technology, they can talk with their loved ones in a voice that sounds just like their own. In education, deepfakes can be used to imitate teachers or bring historical figures back to life. In film, the accessibility of deepfake tools means that independent creators without huge budgets can still make their art. Despite all the downsides, it's ultimately up to us to make deepfakes a positive for society rather than a negative. As with all technology, you have to stay hyper-aware of the content you consume online and how it may negatively affect others.



ABOUT THE AUTHOR

Maya Sobchuk is a third-year at Macalester College in Minnesota studying international law and international relations theory. She is from Kyiv, but grew up in Los Angeles, California. She has previously worked for the Kyiv Post, is the Web Editor for her college newspaper, and is involved with Ukrainian-American causes in Minnesota. Maya is particularly interested in the post-colonial space, multilateral diplomacy, and disinformation.

