
It’s about Power

Questioning the tech that shapes our lives

In Sepehr Vakil’s ideal world, computer science and engineering education would offer more than technical skills and “coding for all,” and it wouldn’t be limited to students in those fields.

Instead, English majors might explore the connection between surveillance systems and immigration. Drama students could act out how social media platforms and search engines can reinforce racism and sexism. And even young children would be able to understand, analyze, and thoughtfully question the technologies that profoundly shape their lives.

“Students would have the space to imagine the socially transformative possibilities of tech,” says Vakil, assistant professor of learning sciences at SESP. “They’d be cultivating a sense of moral responsibility to each other and to the world around them.”

New technology has created many social benefits – think advances in healthcare and expanded access to education. But the same facial recognition technology that unlocks an iPhone also has dark consequences, such as law enforcement’s misidentification of innocent individuals.

As schools around the nation increasingly incorporate computer science and engineering education into their curricula, Vakil is urging educators to prioritize an often overlooked aspect of technology: its intersections with ethics, culture, and power.

Without that savvy, a generation of students may not recognize or understand the damage new technologies can inflict – from biased facial recognition software used to identify suspects to deeply flawed policing databases that target communities of color.

To date, much of the conversation over equity in science, technology, engineering, and math (STEM) education has focused on inclusivity: increasing opportunities for students of color and women to enter the field. What’s often missing – and it’s a glaring hole, Vakil says – is any discussion of STEM equity as it relates to ethics, power, and civic democracy.

“Diversifying the field isn’t just to get more diverse faces and different kinds of people working for tech companies, but to get their ideas, identities, and experiences to shift the possibilities of what can be created,” Vakil said in conversation with the Harvard EdCast. “That’s where the power comes in.”

Without a more diverse workforce, advances in technology will continue to be developed from a narrow perspective, something currently playing out in the field of facial recognition. Research on facial analysis algorithms conducted by Joy Buolamwini from MIT Media Lab and Timnit Gebru from Microsoft Research showed that “darker-skinned females are the most misclassified group with error rates of up to 34.7 percent.”

Lighter-skinned males were consistently identified with almost complete accuracy, because the facial analysis datasets used as benchmarks were overwhelmingly composed of white subjects. Predominantly white male faces were used to test the accuracy of tech that would then apply to everyone.
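The mechanism behind that disparity can be illustrated with a toy calculation – the numbers below are hypothetical, not drawn from the Buolamwini–Gebru benchmark. When a test set is dominated by one group, overall accuracy can look reassuring even while the underrepresented group’s error rate is many times worse:

```python
# Toy illustration: per-group vs. overall error rates on an imbalanced
# benchmark. Numbers are invented for illustration only.

# (group, prediction_correct) pairs: 90 lighter-skinned subjects with 1 error,
# 10 darker-skinned subjects with 3 errors.
results = [("lighter", True)] * 89 + [("lighter", False)] * 1 \
        + [("darker", True)] * 7 + [("darker", False)] * 3

def error_rate(rows):
    """Fraction of rows where the prediction was wrong."""
    return sum(1 for _, ok in rows if not ok) / len(rows)

overall = error_rate(results)
by_group = {g: error_rate([r for r in results if r[0] == g])
            for g in {g for g, _ in results}}

print(f"overall error: {overall:.0%}")   # 4% -- looks acceptable
for g, e in sorted(by_group.items()):
    print(f"{g:>8} error: {e:.0%}")      # darker-skinned group fares far worse
```

Because the larger group dominates the average, the headline accuracy figure hides the disparity – which is why per-group evaluation, not aggregate accuracy, is the meaningful benchmark.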

Vakil’s Technology, Race, and Ethics in Education (TREE) Lab, codirected with Sarah Van Wart, assistant professor of instruction at Northwestern’s McCormick School of Engineering, moves beyond the classroom and into the community – a hallmark of SESP’s approach. Northwestern students, along with Chicago and Evanston youths and community members, investigate how new technologies affect young people.

“While there’s a lot of new work from scholars and activists on the ethics of tech and computing, there’s a need for more youth voices,” Vakil says.

One of the lab’s main efforts, the Young People’s Race, Power, and Technology (YPRPT) project, is an afterschool program codesigned with partners including Evanston Township High School and three community-based organizations in Chicago – the Lucy Parsons Lab, Family Matters, and Endangered Peace.

In the pilot program, students from ETHS, Family Matters, and Northwestern created three documentary films exploring how new technologies such as artificial intelligence shape the experiences of communities of color.

As part of the project, the students interviewed local activists, computer scientists, and city council members. Chicago filmmaker and lead instructor Raphael Nash introduced students to the craft of documentary filmmaking. PhD students Natalie Melo, Alisa Reith, Jessica Marshall, Charles Logan, Shai Moore, and others played key roles.

The project produced multiple documentaries, including Targeted, which chronicles the Chicago gang database’s targeting of young Black and Brown men; Racial Recognition, examining the threat that facial recognition bias poses to marginalized communities; and Melting Ice, which explores US Immigration and Customs Enforcement’s use of social media and other technology tools in its deportation efforts.

“Students learned about storytelling, technology, and their own communities,” says Vakil, who believes educators can shift their curriculum by getting to know their local communities. “They then told their stories through film, which itself was a form of tech learning.”

In each of the last two years, the YPRPT project culminated in a film screening hosted by Northwestern’s Mary and Leigh Block Museum of Art.

After Princeton University professor Ruha Benjamin saw the 2020 screening, she joined Vakil’s advisory board. Founding director of the Ida B. Wells Just Data Lab, Benjamin is now supporting Vakil’s team as they explore how the TREE Lab can spark educational innovation focused on social justice and technology.

The lab currently hosts all student-produced films for free on its website. “These short documentaries become resources,” Vakil says. “They become data points in our consideration of how young people make sense of these technologies.”

For alumna Bijal Mehta (BS21), an AI and cyber researcher for Tortoise Media in London, the TREE Lab shaped both her interests and career path. Mehta, who cofounded Northwestern’s Responsible Artificial Intelligence Student Organization and worked as a student researcher, credits the TREE Lab with helping her explore the social implications of tech. “The lab helped me develop a critical lens for understanding power dynamics in the acceleration of new technology,” says Mehta.

Because new technologies are profoundly shaping democratic processes, policing, education, healthcare, and cultural life, Vakil’s vision for the future is that “citizens understand these dynamics and have a say in how they play out.” How we approach computer science education – how we look beyond the technical – will shift where the power lies.

“Who is Siri?”

Sepehr Vakil was born in Iran and immigrated to the US with his family when he was three. In Iran, his mother, Maryam Hazeghazam, volunteered at a hospital to care for wounded soldiers – many the victims of advanced military technology. His father, Roozbeh Vakil, taught math at a time when “education itself could be seen as an act of resistance,” he says. Their stories were Vakil’s first glimpse into the complicated relationship between power, technology, and education.

Vakil met his wife, kihana miraya ross, assistant professor of African American Studies at Northwestern, while they were students at the University of California, Berkeley. The two have five children: one-year-old twins Azad and Hafez; a four-year-old son, Sasan; and two older daughters – Simone, who attends Evanston Township High School, and Sage, a premed student at Howard University.

Reflecting on how young children interact with technology, Vakil mentions that Sasan likes to say that Siri “knows everything.”

“Who is Siri?” Sepehr asks Sasan. “The woman inside the phone,” he replies. Their exchange led them to talk about robots, knowledge, and what it means to know something.

“The potential to talk with young children about technology is amazing,” says Vakil, who may be teaming up with researchers at Berkeley and the University of California, Davis, on a new project that explores how teachers can support elementary and middle schoolers in understanding the ethics of tech.

Story by David Johnson
Illustrations by Özge Samanci