When Vauhini Vara first gained access to GPT-3—an artificial intelligence program able to mimic human writing to a remarkably convincing degree—she briefly entertained the idea that it could be telepathic. “There was this ideal version of it where I fed it a sentence and then it spit out exactly what was in my mind,” the writer told This American Life in December 2021. The loss of Vara’s older sister, who was diagnosed with an aggressive cancer as a teenager, had always been ineffable; when GPT-3 offered the possibility of its articulation, she quickly took its algorithmic hand.
The result was “Ghosts,” a nine-part essay published in the Believer in August 2021. In each section, Vara writes a version of the same story, elaborating a bit more in every one. Her writing, marked as bold text, is followed by GPT-3’s expansion on each version of the story. Bitch recently spoke with the author about the emotional impact of the experiment, A.I.’s unconscious biases, and her explosive new novel, due out in May 2022.
[I]t was almost as if I was communing with all grieving people who had put their experiences to paper in the past.
When I read “Ghosts,” the first question that came to mind was: Did any of what the program wrote ring true for you?
Oh, absolutely. The most fascinating and moving thing was getting to see the A.I. evolve as I evolved in what I was willing to offer it. At the beginning, when I was providing only very basic bits of information, the A.I. sort of [met] me at that basic level. When I started being more expressive and more open, [it] had more to work with. So as the essay progressed, it felt like I got more and more from the interaction.
Were there any egregious untruths?
In the first one, the thing that is poignantly and upsettingly inaccurate is that it ends with my sister still being alive—the most inaccurate response possible. And, of course, there’s the one where it’s inexplicably the 1970s, and I’m a runner. I think I’ve maybe tried to run, like, three times in my life. I’m very much not a runner.
I did note that this A.I. seems weirdly obsessed with exercise. In that first story, it has you playing lacrosse almost immediately.
There’s also a theme of thinness running throughout: GPT-3 seems quite sure that you’re emaciated by your grief. In the fourth story, you’re in grad school and “skinny and pale and quiet,” and you later befriend a girl who’s “thin and pale and quiet, too.” Why do you think that happened?
I learned this as I was writing the essay, but the way GPT-3 works is that it takes [a] huge body of existing written material to learn how humans talk—or more specifically, how humans write. There are a lot of inspirational texts about cancer specifically: Somebody’s doing some kind of run to raise money for the American Cancer Society after a loved one is diagnosed, or playing lacrosse in their sister’s memory, or whatever. So the A.I. follows [that] template.
That’s an interesting observation about thinness and paleness. Again, one imagines that there are a lot of people who have written about grief in association with words like that, which then informs the language you see in the piece. The thing I found so fascinating about this exercise was that it was almost as if I was communing with all grieving people who had put their experiences to paper in the past.
Did you try feeding it anything else?
I experimented with some fiction. I was making shit up and letting the A.I. continue the story, then adding my own part and letting the A.I. continue that, and it kind of worked. There was one—I called it “A Magic Trick”—where I was like, “This is a story. This could be published.”
It ended up being about a parent with their child at a playground. The narrator’s son used to go on playdates there with another child, who has since died. That child’s father is now this awkward, ghostly presence: he still shows up at the playground and hangs around with the narrator and her son. It’s very surreal. Now that I’m looking at it again, I’m like, “I want to do something with that.”
So you haven’t tried to get it published?
No, because at that point I was just playing around. I also don’t necessarily want my personal brand to be “that lady who writes essays using A.I.” Though I can imagine a future in which A.I. text completion is a tool, just like spell-check.
Let’s talk about some of your non-algorithmic writing. Your new novel, The Immortal King Rao, sounds like a real epic for the ages. You’ve said that it took you 12 years to write.
I started it in the winter of 2009, writing about [an] Indian American CEO of a giant tech company. And, at the time, a lot of that was just not a thing. I remember really needing to revise one character in particular. When Donald Trump was this very fringe candidate for president—it must have been 2014 or 2015—I had pictured this dystopian future in which, many years from the present, someone like that becomes president. And then, of course, he did become president, and we had four years of that. So I had to create this new president character, who was even more outlandish than the Trump-like one I had imagined.
Because people had seen it all by then.
Exactly. I was setting the book in like the 2020s, 2030s, 2040s, 2050s. And as time went on, the parts of it that felt like they belonged in a dystopian future actually happened. It really felt like reality caught up to this world I had imagined.