Most engineers are white — and so are the faces they use to train software – Recode

Not terribly surprising but alarming given how much facial recognition is used these days.

While the focus of this article is on Black faces (as it is with the Implicit Association Test), the same issue likely applies to other minority groups.

I welcome any comments from those with experience of how the face-recognition features in commercial software such as Flickr, Google Photos, etc. perform:

Facial recognition technology is known to struggle to recognize black faces. The underlying reason for this shortcoming runs deeper than you might expect, according to researchers at MIT.

Speaking during a panel discussion on artificial intelligence at the World Economic Forum Annual Meeting this week, MIT Media Lab director Joichi Ito said it likely stems from the fact that most engineers are white.

“The way you get into computers is because your friends are into computers, which is generally white men. So, when you look at the demographic across Silicon Valley you see a lot of white men,” Ito said.

Ito relayed an anecdote about how a graduate researcher in his lab had found that commonly used libraries for facial recognition have trouble reading dark faces.

“These libraries are used in many of the products that you have, and if you’re an African-American person you get in front of it, it won’t recognize your face,” he said.

Libraries are collections of pre-written code developers can share and reuse to save time instead of writing everything from scratch.
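To make that concrete, here is a minimal sketch of what "reusing a library" means in practice, using Python's built-in `statistics` module as a stand-in (this is an illustration of the general idea, not one of the facial-recognition libraries the article refers to):

```python
# Instead of writing an averaging routine from scratch,
# a developer imports shared, pre-written code and reuses it.
import statistics

values = [2, 4, 6, 8]
print(statistics.mean(values))  # 5
```

The convenience is real, but as the article goes on to explain, whatever assumptions or data went into the shared code travel with it into every product that reuses it.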

Joy Buolamwini, the graduate researcher on the project, told Recode in an email that software she used did not consistently detect her face, and that more analysis is needed to make broader claims about facial recognition technology.

“Given the wide range of skin-tone and facial features that can be considered African-American, more precise terminology and analysis is needed to determine the performance of existing facial detection systems,” she said.

“One of the risks that we have of the lack of diversity in engineers is that it’s not intuitive which questions you should be asking,” Ito said. “And even if you have a design guidelines, some of this stuff is kind of feel decision.”

“Calls for tech inclusion often miss the bias that is embedded in written code,” Buolamwini wrote in a May post on Medium.

Reused code, while convenient, is limited by the training data it uses to learn, she said. In the case of code for facial recognition, the code is limited by the faces included in the training data.

“A lack of diversity in the training set leads to an inability to easily characterize faces that do not fit the normal face derived from the training set,” wrote Buolamwini.
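Buolamwini's point can be sketched with a toy example (my own illustration, not the MIT code): a "detector" that learns the average of its training samples and flags anything too far from that average as "not a face." If the training samples all come from one narrow range of skin tones, faces outside that range fall outside the learned notion of "normal."

```python
# Toy illustration of training-set bias (NOT a real face detector).
# Each "face" is a short vector of hypothetical pixel intensities
# (0 = darkest, 255 = lightest).

def centroid(samples):
    """Average of the training samples, component-wise."""
    n = len(samples)
    return [sum(x) / n for x in zip(*samples)]

def detect(sample, face_centroid, threshold):
    """Declare 'face' if the sample is close enough to the training centroid."""
    dist = sum((a - b) ** 2 for a, b in zip(sample, face_centroid)) ** 0.5
    return dist <= threshold

# Training set drawn only from light-toned faces (hypothetical values).
training_faces = [[200, 190, 210], [180, 200, 195], [210, 185, 190]]
model = centroid(training_faces)

light_face = [195, 192, 200]
dark_face = [80, 70, 90]  # same structure, darker skin tone

print(detect(light_face, model, threshold=40))  # True
print(detect(dark_face, model, threshold=40))   # False: outside training range
```

The darker face is rejected not because it is any less a face, but because nothing like it appeared in the training data, which is exactly the failure mode Buolamwini describes.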

She wrote that to cope with limitations in one project involving facial recognition technology, she had to wear a white mask so that her face could "be detected in a variety of lighting conditions."

“While this is a temporary solution, we can do better than asking people to change themselves to fit our code. Our task is to create code that can work for people of all types.”


About Andrew
Andrew blogs and tweets about public policy issues, particularly the relationship between the political and bureaucratic levels, citizenship and multiculturalism. His latest book, Policy Arrogance or Innocent Bias, recounts his experience as a senior public servant in this area.
