Trolling in the Deep
Deepfakes Are the Latest Innovation in Online Shaming

This article was published in Power Issue #88 | Fall 2020

Indian investigative journalist Rana Ayyub was with a friend when she got a message from one of her sources: “Something is circulating around WhatsApp. I’m going to send it to you, but promise me you won’t feel upset.” Ayyub has been an outspoken critic of Prime Minister Narendra Modi and his Bharatiya Janata Party (BJP). It was 2018, and Ayyub had just been pilloried on Twitter by far-right Hindu trolls, who remain platformed by Modi and other high-ranking officials to this day. People Photoshopped tweets to make her appear to say things like, “I hate India and Indians!”—a shorthand way to humiliate her for challenging the BJP. The WhatsApp message included a video. “I could not watch it beyond three or four frames because it was a porn video and there was my image morphed into it,” Ayyub told Public Radio International (PRI) in 2019. The video was a deepfake: a video digitally altered through artificial intelligence, typically by overlaying one person’s face onto another person’s body.

Immediately after Ayyub saw the video, she felt nauseated; her body’s reaction to the stress was so extreme that she was hospitalized. “It felt like, there I was, out there in the public domain naked, and I just froze,” she told PRI. There were clear indications that the video was fake: Ayyub has curly hair, while the woman in the video did not, and that woman is much younger than she is. But it didn’t matter. She still doesn’t know who made the video, but lower-level operatives within the party, far-right trolls, and others shared it at least 40,000 times through WhatsApp and social media. Ayyub’s harassers even doxxed her, revealing her phone number online, and she got messages from trolls asking how much she was charging for sex. Ayyub said she was deeply affected by the video, and she’s not the only woman journalist to face extreme sexual harassment and violence: Gauri Lankesh, another journalist critical of the Hindu right, was murdered in her home in 2017.

According to Ayyub’s 2018 article in HuffPost, she started self-censoring on social media. She questioned herself. Had she been too outspoken? Should she not have posted images of her face online? Nevertheless, Ayyub defended herself, posting a response asserting that the video was fake. But once the video was out in the world, fake or not, it was nearly impossible to pull it back. “I get called Jihadi Jane, Isis Sex Slave, ridiculous abuse laced with religious misogyny. They will Photoshop me in front of a minister’s house to claim I’ve been sleeping with him,” Ayyub wrote in HuffPost. “But this has changed me.”


Collecting the Feathers

An estimated 95 percent of all deepfakes are “involuntary porn,” in which someone’s face is superimposed onto a porn performer’s body. The term deepfake itself is a portmanteau of “deep learning”—a type of machine learning—and “fake.” In more traditional machine learning, humans supply the computer with labeled inputs, categories it uses to differentiate between images. Deep learning instead mimics human learning: the machine is fed a large volume of data and builds its own categories of difference. It’s the building block for any machine that “learns,” processing and interpreting ever more complex inputs. Redditors used the subreddit r/Deepfakes, along with others like it, to create and share deepfake pornographic videos of famous actors; in February 2018, Reddit took r/Deepfakes down for violating the site’s policies on “involuntary pornography” and sexual content involving minors.

In the subreddit, users crowdsourced data sets to teach the AI how to make more convincing videos. Two networks operate simultaneously: one superimposes the target’s face onto the porn performer’s body, and the other learns to detect the imperfections. The result is a loop that teaches each network to do its job better. In a thread about r/Deepfakes being taken down, one Reddit user observed, “While it was impressive tech, it was still really, really obvious that it was fake.” The edges of the faces were blurred. The video and audio weren’t perfectly synced. But the early subreddit—along with similar ones like r/CelebFakes—was very popular; it didn’t seem to matter that the videos were fake. When Ayyub tried to get the video of her taken down, the police minimized it, smirking as she showed them the evidence. When she stated publicly that the video was a fake, that didn’t seem to matter either.
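For readers curious about the mechanics, what that paragraph describes is an adversarial training loop. Below is a minimal sketch of that loop, assuming a toy PyTorch setup; every particular, from the network shapes to the stand-in “real” images, is a placeholder for illustration rather than the architecture of any actual deepfake software.

```python
# A minimal, hypothetical sketch of the two-network loop described above,
# written in PyTorch. Nothing here comes from an actual deepfake tool:
# the layer sizes, the stand-in "real" data, and the training schedule
# are all placeholders chosen for brevity.
import torch
import torch.nn as nn

# Network 1 ("generator"): produces a fake image from random noise.
generator = nn.Sequential(
    nn.Linear(64, 256), nn.ReLU(),
    nn.Linear(256, 784), nn.Tanh(),  # a 28x28 image, flattened
)

# Network 2 ("detector"): scores how real an image looks.
detector = nn.Sequential(
    nn.Linear(784, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # one logit: higher means "looks real"
)

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(detector.parameters(), lr=2e-4)

for step in range(1_000):
    real = torch.rand(32, 784) * 2 - 1   # placeholder batch of "real" faces
    fake = generator(torch.randn(32, 64))

    # Step 1: train the detector to tell real images from fakes.
    d_opt.zero_grad()
    d_loss = (loss_fn(detector(real), torch.ones(32, 1)) +
              loss_fn(detector(fake.detach()), torch.zeros(32, 1)))
    d_loss.backward()
    d_opt.step()

    # Step 2: train the generator to fool the (now slightly better) detector.
    g_opt.zero_grad()
    g_loss = loss_fn(detector(fake), torch.ones(32, 1))
    g_loss.backward()
    g_opt.step()
```

The face-swap tools circulating in 2018 were often built on autoencoders rather than this exact generator-detector pairing, but the feedback dynamic the Redditors were exploiting, each network forcing the other to improve, is the one sketched here.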

Now every time Ayyub posts on social media, someone mentions the video in the comments. People don’t care to parse the facts—they’re more enticed by the fantasy. “There’s a Jewish teaching about evil speech,” tech and sexuality writer Lux Alptraum told Bitch. “A woman who tells gossip feels bad and goes to the rabbi. He tells her to come back with a feather pillow. He tells her, ‘Take this knife and cut the pillow open. Now go collect all those feathers.’” The gossip, the information that excites or titillates, spreads more aggressively than the correction. And by that point, the rumor itself is the joke. The fact that a woman’s behavior can be fabricated and shared for the purpose of humiliation is expressed in every smirk, in every male impulse to disregard facts for a more exciting story. Ayyub’s story is one of explicit political harassment. The moderator of a far-right Hindu Facebook page posted the video with a telling threat: “See, Rana, what we spread about you; this is what happens when you write lies about Modi and Hindus in India.”

But all online harassment—and its in-person equivalents, from the high-school rumor mill to the catcall—has a political goal. The video of Ayyub validated a certain public’s grotesque imagination. The deepfake exists on its own plane—above truth—and functions to humiliate. In physical space, behaviors like catcalling and workplace harassment serve the same function. They are means of discipline, upholding the idea that a woman’s body is public property; when male fantasy is projected onto her, it strips her of agency, will, and desire. This extends to online spaces: If a woman takes a critical stand, rape threats are leveraged against her to humiliate and, ultimately, discipline her. This doesn’t change in the realm of the image. The presumption that the image is absolute truth is simply layered over an older method of control; it doesn’t matter whether it’s “real.” The deepfake is a form of public shaming. It projects sexuality onto a woman, immediately recasting her as property.


Conflicting Truths in Focus

In Simulacra and Simulation (1981), sociologist Jean Baudrillard wrote about the simulacrum: a replica without an original. New Orleans’s Bourbon Street is an example of this phenomenon. The street has come to represent a mythologized New Orleans of yore; while there may be similarities between the actual history and the represented one, our views of that space are shaped far more by the representation. We see examples of this everywhere, from Disney’s Epcot to the National Geographic image of Sharbat Gula, the “Afghan Girl” who was living in a refugee camp in Pakistan. The photograph was taken by Steve McCurry in 1984, after her family had been displaced from Afghanistan by the Soviet occupation. The circumstances around the photograph are ethically fraught, as are the uses of the image. It was widely circulated after 9/11, when the rights of women and girls served as a justification for the invasion of Afghanistan. But the fear in her eyes—so commonly attributed to her refugee status—was prompted because McCurry forced her to pose for him.

According to Gula herself, when she saw the image for the first time, she felt “nervous and very sad.” McCurry snatched the 12-year-old Gula out of her school, without her parents’ consent, because of her “striking green eyes.” Choosing a young girl to photograph, as opposed to a young man, also carries implications. A photograph of an Afghan boy might imply unfulfilled potential, boundless energy that’s been wasted, individual agency that’s been stolen. A photograph of an Afghan girl implies that she simply needs protection, an opening for another dominant power (in this case, a highly militarized Western country) to assume the role of savior. Gula’s true emotional landscape is not what people see when they look at the photo. In the image we can read a fantasy: the benevolent West saving innocent girls from the ravages of Islam and an untamed East. In this case, the image doesn’t solely serve to discipline the subject. It’s targeted at a Western audience and triggers an association with American saviorism.

Although McCurry probably didn’t explicitly intend this when he found himself captivated by those green eyes, his actions indicate deep assumptions about the goodness of the West. The photograph upholds U.S. hegemony, casting the destitute, oppressed Afghan girl as someone to be saved from the ravages of Islam, an idea that later served to justify the invasion of Afghanistan. Instead of facing consequences for his decisions, McCurry received money and accolades for the image, and has even accused those who spoke about the circumstances around it of slander. It’s precisely because this image projects a narrative of power that the fantasy a public seeks to attach to the photo—one of power, of white saviorism—is more compelling than the facts of Gula’s experience. McCurry projected a fantasy onto Gula and, as a result, perpetuated it. American exceptionalism, the imposition of so-called democratic values abroad, does not square with the facts. The simulacrum operates on its own plane of existence. Images and videos are encoded with these stories.


We have to contend with the fact that the truth isn’t uniform, that facts are so often manipulated to support the stories power wants to tell. American hawks wanted us to believe that the Afghan beauty needed to be saved. Hindu right-wingers needed everyone to believe that Ayyub was a discredited whore. These stories are baked into the photographs and videos we encounter. Such forms of media are widely considered to be absolute truths—alterable by programs like Photoshop, but not subjective to begin with. The belief that an image or a video holds “the truth” must be examined. Collectively, we want to believe that pictures don’t lie—they’re a safe haven in a world of shifting and hard-to-pin-down facts. But anyone who takes pictures knows that, in practice, a photograph is the culmination of a series of choices. What is included in the frame, and what is excluded? Is the camera angled downward, making the subject look small, or upward, making them loom?

The way an image is taken, the ways it’s cropped or framed, and the caption it’s paired with are all forms of manipulation. And so looking at images becomes a process of reading. Videos add cuts and edits, things in and out of frame, shots juxtaposed with one another, narration laid over the film. We’re making meaning—or “truth”—when we watch a video. Yet we’re accustomed to thinking of a photograph or video as an objective truth rather than a subjective compilation of facts to be read. So when new technologies are introduced—Photoshop, or its newer iteration, the deepfake—it’s not so much an objective truth that’s violated as our collective assumption that the photograph or video contained an objective truth to begin with. The deepfake makes literal the subjectivity of the video. That violation unsettles something deep within us: the fear that we might no longer be able to verify the facts of an image or video. But perhaps verification is the wrong question to ask.

Even when the “facts” presented are untrue, it’s easy to see how people may willingly buy into the story being told. These old narratives of power are mapped onto the bodies depicted in images and video, fake or not. There is no single truth; there never really was. “Men have a hard time comprehending that their fantasy is not the reality,” Alptraum said. “No need to talk to me or involve me as a person with an opinion or an ability to discern. And no real conception of the possibility of conflicting perspectives.” What old stories are grafted onto deepfake pornography? And is it really such a departure from the long subjection of women’s bodies to male fantasy and power? Facts certainly matter, but a video tells a story, and those who hold power are rarely contested in their interpretations. Perhaps that’s what needs to be contested. Laws and regulations governing these new technologies will be useful, but the technology is already here, which means it will always be a step ahead of the law. And once a deepfake circulates, even if it’s legally required to be taken down, the harm has been done. Women still experience the shame of that violation and its disciplining effects. All that’s left to do is throw out the stories power tells—and write our own.


by Padmini Parthasarathy

Padmini Parthasarathy is a Bay Area-based journalist who mostly explores the intersections of gender, identity, labor, and space, most recently in a piece about the New Orleans stripper struggle. Her head is turned by any story that involves bravery, vulnerability, and a radical reimagining of the future.