“IMPOSTER SYNDROME” is often cited as a barrier to women and people of color being promoted into positions of leadership in tech, despite an ever-growing body of evidence showing that the reasons behind the persistent gender and racial leadership gap are multifaceted. Coined by psychologists Pauline R. Clance and Suzanne A. Imes in 1978, the term describes a psychological pattern of doubting one’s own accomplishments or competence despite high levels of achievement. Used uncritically, imposter syndrome burdens those who experience it with the task of fixing it, fomenting an ecosystem that sidelines women and people of color before they even attempt to climb the rigged corporate ladder. Kathy Wang’s second novel, Impostor Syndrome, uses an entertaining cat-and-mouse game to satirize Silicon Valley’s scapegoating of the idea.
Julia Lerner is the chief operating officer of the story’s Facebook stand-in, Tangerine, and is exactly the kind of executive who would give a TED Talk about imposter syndrome. As COO, Lerner walks a tightrope, balancing being a wife, mother, and public advocate for women in tech. Here’s the twist: Lerner is also a Russian spy, so while she’s publicly “railing against gender inequality,” she’s also “quietly torch[ing] the path of any rising female at Tangerine.” Lerner’s appropriation of progressive language and weaponization of white femininity deftly critique the smoke-and-mirrors show of tech diversity initiatives. Leading a double life, Lerner is flanked by the similarly sexist strictures that govern Silicon Valley and the Russian agency she reports to, the latter evocatively underscoring these problems in the former.
Though the espionage plot provides the novel’s most compelling tensions, it also externalizes the ethical issues confronting tech in a way that minimizes the homegrown risks they pose to its users. For instance, many of the tech industry’s privacy-related issues can be attributed to business models that rely on gathering and selling information about their users and that prioritize engagement over accuracy. These missteps have led to the proliferation of misinformation about issues of public concern, such as the novel coronavirus, which poses a more immediate risk than Russian spies infiltrating Silicon Valley. Alice Lu, a 35-year-old MIT graduate and the daughter of working-class Chinese immigrants, is a junior engineer at Tangerine working in technical support. As one of the few women of color engineers at the company, Lu walks her own tightrope as she clumsily tries to defy the racist and sexist stereotypes that hamper her career.
The rope narrows after Lu catches Lerner using God Mode—a function that gives Tangerine’s execs access to all app users’ information and that Tangerine lies about dismantling to curb abuses—and wrestles with the moral dilemma of reporting Lerner or keeping her job. Despite a compelling plot, Lu’s character remains disappointingly flat. While Wang carefully outlines the stakes for her characters, she doesn’t interrogate their motivations. Anna Wiener’s 2020 memoir, Uncanny Valley, is a spiritual predecessor to Wang’s novel that also offers a searing portrait of Silicon Valley excess. Like Lu and Lerner, Wiener struggles to find her place in the tech industry, and she writes frankly about being a nontechnical but well-compensated tech worker. She also delineates how her whiteness, self-delusions, and contrived justifications enabled her to continue to work in and benefit from the sector despite having humiliating and exploitative experiences, eschewing the narrative of individual transformation that underpins how corporations have redefined imposter syndrome.
In 2020, Timnit Gebru, the technical colead of Google’s Ethical AI team, coauthored the paper “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” The paper critiques large-scale artificial-intelligence systems—an important source of revenue for Google—and how these systems indiscriminately cull as much textual data from the internet as possible, including racist language and hate speech. Programs that rely on this technology, the paper concludes, risk perpetuating the same biases. With AI being integrated into education, health, and policing, these programs may have an enormously negative impact on the lives of women and people of color. Jeff Dean, Google’s head of AI, asked Gebru to remove her name from the paper, arguing that it “didn’t meet [their] bar for publication.” When she refused, Google fired her. Gebru’s groundbreaking work tackles one of tech’s most pressing issues while also implicitly understanding the stakes of exclusion. Her desire to make companies such as Google more accountable to their most vulnerable users is far more paradigm shifting than the motivations of Wang’s protagonists, who fiercely desire to hold on to the lives they have built for themselves. While the reader may empathize with Wang’s characters, their desires still reflect the industry’s myopic solipsism, a pathology that tech has yet to fully reckon with.