Reducing racial bias in facial recognition

Facial recognition software is more likely to misidentify people who are not white.

Our computer scientists are helping to reduce racial bias in facial recognition algorithms.

Facial recognition is becoming an essential part of our daily lives, from security and education to virtual assistants and personal gym trainers.

The technology brings a number of opportunities to make our lives easier and better, but it is more likely to misidentify people who are not white.

No facial analysis system is perfectly accurate, and a recent study by the US National Institute of Standards and Technology identified significant differences in how accurately face recognition software tools identify people of varied sex, age and racial background.

Durham’s researchers are helping to solve this problem by developing a method that uses a synthesised dataset showing different facial characteristics and racial features.

In particular, the dataset used in the research included major racial categories such as African, Asian, Indian and white.

This dataset is then used to train facial recognition software to focus more on identifying features than race.

Racial characteristics

The work focuses on closing the gap between facial recognition systems' near-perfect performance on white faces and their weaker performance on the faces of people from other racial and ethnic groups.

The researchers do this by making multiple face images, based on a real person's face, with different racial characteristics while keeping identifying features.

The images are then used to train face recognition algorithms to verify these identifying features, instead of racial features, with the aim of making them less dependent upon race.
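The augmentation step described above can be sketched roughly as follows. This is a simplified illustration under our own assumptions, not the researchers' actual pipeline: the names (`RACE_GROUPS`, `synthesise_variant`, `build_training_pairs`) are hypothetical, and a real system would use a generative face-synthesis model rather than a placeholder.

```python
# Illustrative sketch (assumption: all names and structures here are our own,
# not the authors' code). Each real face is paired with synthesised variants
# that change racial characteristics but keep the same identity, so a
# verification model trained on these pairs is pushed to rely on identifying
# features rather than race.

RACE_GROUPS = ["African", "Asian", "Indian", "White"]  # categories named in the article

def synthesise_variant(image_id, race):
    """Stand-in for a face-synthesis model that re-renders a face with
    different racial characteristics while preserving identity."""
    return {"image": f"{image_id}_{race.lower()}", "race": race}

def build_training_pairs(real_faces):
    """For each real face, pair it with one synthesised variant per racial
    group. Every pair shares a single identity, so each is a positive
    example (label 1): the verifier must match the two faces even though
    their racial characteristics differ."""
    pairs = []
    for face in real_faces:
        anchor = {"image": face["image_id"], "race": face["race"]}
        for race in RACE_GROUPS:
            pairs.append((anchor, synthesise_variant(face["image_id"], race), 1))
    return pairs

faces = [{"image_id": "subj001", "race": "White"}]
pairs = build_training_pairs(faces)
```

Training on such pairs rewards embeddings that stay constant across the race-varied variants of one person, which is one way to make the model "less dependent upon race" as the article puts it.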

Facial recognition

The research has reduced racial bias in facial recognition by one per cent and has increased accuracy across all ethnicities.

Our scientists say this is important given the difficulty of the problem and the considerable overlap between identifying and racial features.

Lead author Seyma Yucer-Tektas, in our Department of Computer Science, said: "Our proposed solution is promising and we continue to develop the solution and the extent of our study for greater improvements in the future as the problem is getting more important every day."

Eventually, the researchers hope their method can be employed to regulate facial recognition software used by governments and companies.


