Reducing racial bias in facial recognition
Our computer scientists are helping to reduce racial bias in facial recognition algorithms.

Facial recognition is becoming an essential part of our daily lives, from security and education to virtual assistants and personal gym trainers. The technology brings many opportunities to make our lives easier and better, but it is more likely to misidentify people who are not white. No facial analysis system is perfectly accurate, and a recent study by the US National Institute of Standards and Technology identified significant differences in how accurately face recognition software tools identify people of varied sex, age and racial background.

Durham's researchers are helping to solve this problem by developing a method that uses a synthesised dataset covering different facial characteristics and racial features. In particular, the dataset used in the research included major racial categories such as African, Asian, Indian and white. This dataset is then used to train facial recognition software to focus more on identifying features than on race.

Racial characteristics

The work focuses on closing the gap between facial recognition systems' near-perfect performance on white faces and their weaker performance on faces of people belonging to other racial and ethnic groups.
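The article does not publish the researchers' training pipeline, but one ingredient of any such approach is ensuring that each racial category is equally represented in the training data, so the model cannot simply learn the majority group best. The sketch below illustrates that idea only: the `balance_by_group` helper, the record format, and the group sizes are all hypothetical, not taken from the Durham work.

```python
import random
from collections import defaultdict

def balance_by_group(samples, group_key, seed=0):
    """Return a subset with an equal number of samples per group,
    capped at the size of the smallest group."""
    groups = defaultdict(list)
    for s in samples:
        groups[s[group_key]].append(s)
    n = min(len(members) for members in groups.values())
    rng = random.Random(seed)
    balanced = []
    for members in groups.values():
        balanced.extend(rng.sample(members, n))  # same count from every group
    rng.shuffle(balanced)
    return balanced

# Hypothetical synthesised dataset: each record pairs a face image ID
# with its racial category label (categories as named in the article).
dataset = (
    [{"img": f"african_{i}", "race": "African"} for i in range(40)]
    + [{"img": f"asian_{i}", "race": "Asian"} for i in range(55)]
    + [{"img": f"indian_{i}", "race": "Indian"} for i in range(50)]
    + [{"img": f"white_{i}", "race": "white"} for i in range(120)]
)

train_set = balance_by_group(dataset, "race")
# Every category now contributes 40 samples (the smallest group's size),
# so no single group dominates the training signal.
```

A balanced training set like this is only a starting point; the published research additionally synthesises varied facial characteristics so the model learns identity-specific features rather than group-level ones.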

