Coded Bias film poster

Spring Science Week Featured Discussion on Documentary, 'Coded Bias,' Citing Social and Political Impacts of AI

Director/producer Shalini Kantayya led a virtual conversation on findings in her film

May 5, 2021

When it comes to changing the inherent biases coded into artificial intelligence—biases that can ultimately harm many marginalized communities—Shalini Kantayya believes that AI literacy could be the answer. 

 

“I really think that literacy is the spark that is going to make change,” she said during a virtual discussion of her film. “That’s what I’ve been in favor of, and that’s why I’m so grateful for conversations like this one.” AI literacy is important, not just for computer science students at elite universities, but for everyone, including 10-year-olds just starting to use smartphones, she said. 

 

WPI made her film available for screening and hosted a virtual discussion and Q&A as part of A&S Spring Science Week in April. In addition to Kantayya, WPI panelists included Crystal Brown, assistant professor of social science and policy studies, and Gillian Smith, associate professor of computer science and the Interactive Media and Game Development program. 

 

The panel was moderated by Laureen Elgert, associate professor of social science and policy studies, and sponsored by the Public Interest Technology University Network and WPI’s School of Arts and Sciences. 

 

“This film gives an account of the social and political impacts of the powerful, elite-dominated, ubiquitous yet unregulated technology, such as facial recognition technology and more broadly AI,” Elgert said.

 

The documentary, which premiered at the 2020 Sundance Film Festival and is available on Netflix, “explores the fallout of MIT Media Lab researcher Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all,” according to a description of the film. 

I realized that these same systems we’re trusting so implicitly have not been vetted for racial bias or gender bias.
Shalini Kantayya, director and producer of the documentary film, Coded Bias

Among the storylines in the film is the use of facial recognition AI by law enforcement agencies. One particularly powerful scene shows a person having their face scanned by facial recognition technology just to buy a soda from a vending machine in China. 

 

How bias is coded 

 

Part of the discussion focused on how AI technology tends to favor the creators, who are often white men, and thus leaves marginalized groups behind. 

 

“The same way that we see institutionalized bias in admissions to higher education institutes, job placement, immigration policies, and access to a host of things, we see the same bias played out in algorithms,” Brown said. “[The] documentary does a stellar job of highlighting how technology is biased” and arguing that the government might need to step in and regulate AI in some way. “We often forget that the person doing the coding has their own assumptions, stereotypes, and bias, which carries into code.” 

 

When asked by Brown what inspired her to create the film, Kantayya said she read Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil, which sent her down a rabbit hole. 

 

“I didn’t realize the extent to which we as human beings are already outsourcing our autonomy to machines,” she said. “We’re trusting them to make such important decisions … I realized that these same systems we’re trusting so implicitly have not been vetted for racial bias or gender bias.” 

 

One of the main messages of the film, according to Kantayya, is that bias in AI can leave historically vulnerable communities open to further discrimination and brutality, as in the case of police surveillance. “Harm is what’s at stake,” she said. “These things infringe on our civil rights.” 

 

“It’s such an important conversation for us to be having at all scales—locally, nationally, globally—about the level of impact that algorithms and AI can have on humans and society,” Smith said. 

 

Noting a storyline in the film about residents of an apartment building in Brooklyn fighting back against facial recognition software and teaching themselves about AI, Smith said she often thinks about the imbalance in public understanding of computer science and algorithms. 

 

Action items 

 

Kantayya said that data scientists carry almost too much responsibility, having to question what is ethical and fair and what might affect someone’s civil rights. What needs to happen, she said, is a radical inclusion of more people, including ethicists and policy makers. “Inclusion has to be a part of the picture and the process,” she said. “[Diverse] voices need to be in the rooms where these decisions are being made.” 

 

When it comes to training the next generation of computer scientists, Kantayya suggested integrating more liberal arts education into a computer science education and making it more of an interdisciplinary study to “change the pipeline of how these subjects are being taught.” 

As the computer scientist on the panel, Smith agreed, pointing out that she doesn’t remember having any such interdisciplinary education as an undergraduate student. 

 

Another way to diversify the field, Kantayya added, is a massive investment and outreach campaign to bring more women and people of color into STEM. The more the field speaks to low-income high school students and shows them how AI can be relevant to their lives, “the more competitive these industries will be,” she said. 

 

When WPI students go on to work at big tech companies, she implores them to speak up and be voices of dissent. Listening to people who are radically different from us “makes technology more fair, ethical, and competitive,” she said. 

 

Closing the discussion, Jean King, Peterson Family Dean of the School of Arts & Sciences, thanked Kantayya for combining arts and sciences into one conversation and for giving everyone a lens with which to look at these topics. “All professors and students have a role to play,” she said. 

 

Responding to a question from King about the next steps in this journey, Kantayya reiterated the importance of national literacy around algorithms and AI. “Maybe your institution could be a part of spreading that curriculum,” she said.

 

Aside from increasing literacy, “I’ve seen how everyday people can make a difference,” Kantayya said. “I think if we each do one thing, everyday people can make a difference when we care enough to make a difference.” 

 

—Melanie Thibeault