Safiya Noble Knew the Algorithm Was Oppressive – Vogue

When Frances Haugen blew the whistle on Facebook, she thrust algorithm—a clunky Silicon Valley buzzword—into mainstream conversation. Facebook’s algorithm, the mathematical map for how the platform works, was intentionally insidious, Haugen said: It boosted divisive content to the top of users’ timelines to ensnare them for as long as possible. Some people understood the social giant’s many ethical compromises, but they weren’t necessarily aware of this component of its influence—that its algorithms were predatory by design. Safiya Noble knew.
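The engagement-driven ranking Haugen described can be pictured with a toy sketch. This is an illustrative assumption, not Facebook's actual code, weights, or data: the idea is simply that a feed sorted by predicted engagement, with comments weighted heavily, tends to float provocative posts to the top.

```python
# Toy sketch of an engagement-ranked feed (hypothetical values and
# weights -- NOT Facebook's real system). Posts predicted to provoke
# lots of comments, often the divisive ones, outrank gentler content.

posts = [
    {"text": "cute dog photo", "predicted_reactions": 110, "predicted_comments": 4},
    {"text": "inflammatory political take", "predicted_reactions": 300, "predicted_comments": 250},
    {"text": "family vacation album", "predicted_reactions": 90, "predicted_comments": 10},
]

def engagement_score(post):
    # Comments (arguments included) are weighted more heavily than
    # passive reactions in this toy model.
    return post["predicted_reactions"] + 5 * post["predicted_comments"]

timeline = sorted(posts, key=engagement_score, reverse=True)
print([p["text"] for p in timeline])
# → ['inflammatory political take', 'family vacation album', 'cute dog photo']
```

Nothing in the scoring function mentions divisiveness; the divisive post wins purely because conflict generates predicted engagement, which is the dynamic Haugen's documents described.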

She knew from the day in 2011 when she googled “Black girls” to find activities for her daughter and young nieces and the search engine spat back racialized porn. (The disturbing first hit? HotBlackPussy.com.) “I was overtaken by the results,” wrote Noble, then an assistant professor of information studies at the University of Southern California, in the academic manifesto she’d go on to author, 2018’s Algorithms of Oppression: How Search Engines Reinforce Racism. “Hit indeed.”

At the heart of Noble’s work is the assertion that racism and sexism are baked into algorithms, from H.R. software that screens out women and candidates of color for jobs to Facebook’s advertising platform, which allegedly enabled landlords to exclude women, people with disabilities, people of color, and other underrepresented communities. (A lawsuit was settled on the matter.)
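The proxy bias Noble describes in hiring software can be sketched in a few lines. This is a hypothetical toy, not any real vendor's product: a screener "trained" on a historically skewed workforce ends up rewarding résumé terms that correlate with demographics rather than with skill.

```python
# Toy sketch of proxy discrimination in a résumé screener
# (hypothetical keywords and weights -- not a real vendor's model).
# If past hires skewed male, terms common on their résumés get learned
# as positive signals, even though they say nothing about ability.

learned_keywords = {"football": 2, "fraternity": 2, "captain": 1}

def score_resume(text):
    # Simple substring match against the "learned" proxy terms.
    lowered = text.lower()
    return sum(weight for kw, weight in learned_keywords.items() if kw in lowered)

resume_a = "software engineer, chess club captain, fraternity treasurer"
resume_b = "software engineer, robotics team lead, community volunteer"

# Identical engineering credentials, different scores, purely from proxies.
print(score_resume(resume_a), score_resume(resume_b))
# → 3 0
```

No rule in the code says "screen out women"; the bias arrives indirectly, through what the historical data rewarded — which is why Noble argues such systems are discriminatory by design rather than by glitch.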

“The world is becoming more unequal,” Noble, now a professor at UCLA and director of its Center for Critical Internet Inquiry, told me by phone, “and these technologies are implicated in that.”

Noble argues that algorithms are not nameless, faceless bots and results like the ones she was served about Black girls are not glitches. Rather, behind every algorithm are real people who bring their own biases to the inner workings of the web. In the past, Noble writes, Google’s photo application automatically tagged African Americans as apes and animals, and Google Maps searches for the N-word directed users to the White House during the Obama presidency.

From the early days of the internet, “the computing industry came to be dominated and controlled by white men,” Noble said, and “they reconsolidated and reinscribed their power” in it, shaping public opinion and what information is seen as legitimate through technologies like Google search. (She cites Google engineer James Damore, who notoriously went viral in 2017 for his screed arguing that, in her words, “women are psychologically inferior and incapable of being as good at software engineering as men.”)

“People who have very little to lose and everything to gain in terms of profits are the people who are so cavalier with the rest of our lives,” Noble said.

A newly minted MacArthur “genius” grantee, Noble has become a thought leader on the internet’s harms to vulnerable populations, especially Black women and girls. (Meghan Markle has cited Algorithms of Oppression as key to understanding the online vitriol spewed about her and has partnered with Noble through her and Prince Harry’s Archewell foundation.) Who is most likely to be targeted by unregulated revenge porn, Noble asks, somewhat rhetorically. Who is least likely to be able to recover from it? “We already see women being stripped of their jobs and of their power based on deepfakes or non-consensual pornography,” she says. “That’s going to change who can be a leader, who can participate, who can have a voice.”

It isn’t lost on Noble that the message she and many other scholars and journalists of color have been laboring to make plain for the last decade finally landed in national headlines thanks to Haugen, raising questions about which messengers are believed. It certainly helped that, as a white woman and Facebook employee, Haugen “came in a package…that was sympathetic,” Noble notes.

Noble was met with resistance when she first wrote the dissertation that became Algorithms of Oppression as a Ph.D. student at the University of Illinois at Urbana-Champaign in 2012. “There were faculty along the way who said things like, ‘This research isn’t real. It’s impossible for algorithms to discriminate because algorithms are just math and math can’t be racist,’” she remembers. Noble pushed back (“they were value systems that were getting encoded mathematically”), accepting every invite she received to talk about her work and naming every one of those talks “Algorithms of Oppression,” branding her message and, in a meta twist, making it googleable.

“To make an idea exist in the world, you have to speak to it,” Noble said. “You have to be relentless and focused on what I really believed was true and still do to this day: These algorithms are touching our lives in so many different ways and will determine what’s possible. That’s what keeps me up at night.” She points to Florida’s Pasco County sheriff’s office buying what Noble calls “snake oil software” intended to predict who would commit crime in the community, then harassing families and children based on its results, or the algorithm employed by Stanford University’s hospital to guide its COVID-19 vaccine distribution that left out frontline workers. A.I., Noble emphasizes, should be no substitute for humanity.

The consequences of algorithmic oppression have already been grave, including the spread of misinformation ahead of the 2016 election and platforms like 4chan and Reddit giving rise to right-wing radicalism. “Can we stop being enamored with iPhone 13 or iPhone 28?” Noble sighs. “I’m like, Okay, but also liberal democracies are collapsing around the world.” Tech is a trillion-dollar industry, but “the largesse of all those profits don’t go back into the public,” she says, missing chances to invest in cancer research or slow climate change.

The likes of Facebook, Instagram, and Google can feel omnipotent, inextricably linked to modern social life and business. But Noble believes a more equitable internet is possible. “We need to not feel powerless in this,” she says.

She understands Haugen’s impact as a whistleblower in the context of past social movements. “There’s often decades of struggle and organizing and educating,” Noble says, “and then there is a tipping point where it becomes understood in the mainstream.” In another decade, “we may look at this era of social media and Big Tech like we looked at the era of Big Tobacco,” she adds, with unregulated corporate titans suppressing data about the dangers of their product while marketing it as “cool and sexy and glamorous.”

Noble looks to the abolitionist movement for historical precedent on paradigm shifting. “All the discourses were the same: We can’t do away with the institution of slavery because the whole American economy is reliant upon it. Doesn’t that sound like a familiar argument?” Noble asks. Abolitionists moved the zeitgeist on slavery, she said. Noble wants to do the same with “algorithmic oppression.” She advocates for the “breaking up of the monopoly” of Facebook—which also owns Instagram and WhatsApp, the texting app widely used for communication outside the U.S. The company’s recent outage “underscores that we should not be reliant upon one company for being connected to each other.”

More than quitting social media or deleting any one app, Noble seeks a more modulated, less all-or-nothing approach to technology: an internet culture where users can “share pictures with the grandparents without being targeted with anti-vaxx, anti-science disinformation and propaganda.” There are people working on “public interest” search engines and social media, she says, platforms that aren’t built on “extractive models of profit at all costs” and don’t leave users “incredibly vulnerable to having every dimension of what we share and communicate and think about…mined and sold without our knowledge to thousands of companies who then use it to attempt to modify our behavior.”

Noble states her goals boldly yet matter-of-factly, but she feels “the intensity of what it means to take on the largest, most powerful companies on the planet.” Her path has been nonlinear and paved with setbacks. After college at Fresno State in California, she began pursuing a master’s degree. But as she neared the end of her program, first her father and then her mother became disabled. A first-generation college student, “I just felt like I had to go to work,” Noble said.

She went into corporate marketing and advertising for more than a decade, but the pull to academia never left her. When she tried to return, she learned she had fallen short of receiving her master’s because she hadn’t filed an intent-to-graduate form, and she had to go back and do it all over again. “I got my Ph.D. when I was 41,” Noble told me. At times, she felt humiliated and offtrack. Neither of her parents lived to see her finish grad school.

The MacArthur grant has given Noble the space to think about how Black women and women of color often exist in a “defensive posture,” holding up themselves and others while other women have “time, space, and runway to go imagine the world they want to live in.” She wants to begin to dismantle some of those barriers, including launching a new nonprofit, The Equity Engine, offering “time, space, and resources for Black women and women of color to imagine and create.”

“We think we live in a culture that’s waiting for a messiah,” Noble said. But, “we really are the leaders we’ve been waiting for.”