
Artificial intelligence industry facing a ‘diversity crisis’ because it is overwhelmingly white and male, study finds

Hands of robot and human touching on global virtual network connection future interface. Artificial intelligence technology concept. (ipopba/iStock)

A tech industry teeming with white men is creating biased bots.

The underrepresentation of women and people of color in artificial intelligence is causing a “diversity crisis,” producing flawed systems and technology that reflect the racial and gender biases of the industry itself, according to a report from the AI Now Institute, a New York University research center.

Artificial intelligence technologies are developed primarily within large tech companies like Facebook, Google and Microsoft, as well as in university research facilities — all of which skew toward rich, white men.

According to the study, about 80% of AI professors are men, while women make up just 15% of AI research staff at Facebook and 10% at Google. Just 2.5% of Google’s workforce is black, and the figure is roughly 4% at both Facebook and Microsoft — and the underrepresentation of women of color is even worse, the report notes. What’s more, “There is no public data on trans workers or other gender minorities.”

The imbalances within the industry are mirrored in the products and technology its workers build, perpetuating and replicating years of historical bias and power imbalances, warned the report, titled “Discriminating Systems: Gender, Race and Power in AI.”

“Image recognition technologies mischaracterize black faces, sentencing algorithms discriminate against black defendants, chatbots easily adopt racist and misogynistic language when trained on online discourse, and Uber’s facial recognition doesn’t work for trans drivers,” it said. “In most cases, such bias mirrors and replicates existing structures of inequality in society.”