In the aftermath of the killing of George Floyd and other recent incidents of police brutality, a coalition of Black computer scientists drafted an open letter calling for action in the computing community to address systemic and structural inequities.
Chad Jenkins, a roboticist and associate professor of computer science and engineering at the University of Michigan, is among the signers. He discussed some of the effects of racism in artificial intelligence that are invisible to most, as well as other ways these inequities are harmful. He also described a sweeping solution.
How have you been affected by the events of the past several weeks?
Every Black person I know sees themselves in what happened with George Floyd, Ahmaud Arbery, Breonna Taylor and too many others. And we can’t help but connect it to our professional experiences as well. All you have to do is look at the field of computer science and engineering and see that we have very low representation. We’re only about 2% or 3% of CSE majors. And that’s a problem given how broad computing is, the impact it’s having on the world and the differential in salary.
Increasingly, computing is the path up the wealth ladder in this country. Undergraduates who come out of computer science make about $30,000 more than graduates of any other engineering major. That’s a huge economic opportunity that underrepresented minorities feel like they’re being weeded out of right now.
Do you think that lack of representation has a cost to those who are not members of an underrepresented group or who are not in the computing community?
Yes, because computing technology touches everyone and everything. Data is being collected, processed through artificial intelligence and being used to build profiles that increasingly drive our world. Artificial intelligence is being used for policing, automatic sensing, mortgage, admissions and hiring decisions, cancer screening and so many other things that have a major impact on all of us.
All that technology is coming from small bubbles, sort of these cloistered communities of technologists that don’t necessarily represent society. And because we don’t have a diverse workforce, we’re seeing biased outcomes in some of those algorithmic decision-making tools and we’re seeing technology that’s not as good as it could be. If it continues, I think we’re going to have discrimination and inequality on a level that’s unimaginable today.
What’s an example of how that lack of representation has harmed us?
This might sound drastic, but I think the results of the 2016 election are the result of not having diversity in computing. I think information companies like Facebook and others skewed the results. And I’d make the argument that if you had had more women and underrepresented minorities in the room when they were developing those technologies, someone would have had the sense to say, “What you’re doing with Cambridge Analytica and third-party apps is very wrong.”
Another example is what’s happening in facial recognition technology. Some of my colleagues are working on this, and it’s being used in many different areas where it probably shouldn’t be used, to do things it probably shouldn’t be doing.
Those systems are often biased against people with darker skin and others who aren’t represented in the technology workforce. If you had more people working in the technology field who truly represented society, I think you’d get better, and potentially fairer, facial recognition systems and AI systems.
As a roboticist, are you concerned about inclusivity in the robotics field specifically?
The thing that I wrestle with is how do we use robotic technology in a way that results in the maximum benefit to society?
After all, as much as I love our country, its economy has been built on the uncompensated and exploited labor of Black people. And now we’re trying to build an automated labor force that can be exploited the same way that we exploited people before. And I worry that as we develop these technologies, we’re not necessarily going to include the people that have been historically disadvantaged.
Robotic technology can be used in very beneficial ways, to do things like educate more people, to monitor and repair infrastructure, to make agriculture more productive. But we’re not necessarily doing those things as much as we could be.
How can we begin to address these problems?
I think we could solve this problem immediately, at least in academia. What we need to do is to enforce the civil rights statutes that already exist and apply them to funding in the sciences and computing.
Title VI, for example, basically says that if you are receiving federal funding you have to make sure that there is equal access to it. And not having explicit discrimination isn’t enough; you have to show that the outcome of what you’re doing with federal funding is equitable.
Really all it’s saying is, at the end of the day, if the people in your research lab don’t represent the American people, then you’re really not a good investment for the American people.
Is it really that simple?
Yes, because we know how to do these things and many of us are already doing these things. But we pay an academic price for it because it’s not something that today’s system rewards.
As faculty, our set of responsibilities is so large that we can’t get everything done. And we have to make hard decisions about whether we spend time pursuing funding and prestige, or whether we spend time with students that need help and might be falling through the cracks.
Right now, it’s clear that most faculty do what’s best for their careers, and that means pursuing funding and prestige rather than helping students. And the ones who do help those students take a hit to their career.
How would applying civil rights statutes change the system?
The careers of faculty members like me are built with funding from organizations like the National Science Foundation, the Department of Defense, the Department of Energy and others. Those grants provide prestige, as well as money that we can use to hire grad students who become the faculty of the future.
If for every grant that every faculty member wrote, they had to show how they were adhering to civil rights statutes, in terms of their inclusion in the classroom and in the research lab, that would provide an incentive for them to say, “All right, I need to spend time to make sure that the students that are coming from different backgrounds have a real opportunity in the classroom and the research lab.”
And, to be honest, if more people were doing things to make the field accessible, it would take some of the burden off the shoulders of minority faculty. Because we’re all tired. We’re tired because we have to carry a much bigger load than our colleagues. But we carry that load because we think that’s what’s needed to help our society move forward, and that’s what we’ve been doing all along. So in this regard the need to create better technology, to diversify computing, to create equal opportunity in computing is really just the next step in the civil rights struggle.