
How tech's white male workforce feeds bias into AI

  • About 80 percent of AI professors are men, while women make up just 15 percent of AI research staff at Facebook and 10 percent at Google.
  • Representation is even lower for black employees at major tech firms: just 2.5 percent at Google and 4 percent at Facebook and Microsoft.
  • Bias seeps through when AI programs are constructed by those mostly white male workers who reinforce "a narrow idea of the 'normal' person," a new study said.

The technology industry's mostly white male workforce of coders is creating a "diversity crisis," with bias seeping into products like facial recognition programs and chatbots, according to a new report from New York University's AI Now Institute. The report highlights how a workforce gender imbalance at major tech companies such as Google, Facebook and Microsoft is helping perpetuate bias within artificial intelligence.

AI powers products ranging from facial recognition to chatbots. Yet only 15 percent of AI research staffers at Facebook are women, and at Google the figure is even lower, at 10 percent, the report noted.

This underscores what the study's authors say is the importance of a diverse workforce that reflects a diverse society. They argue that the tech industry's mostly white male legions of AI coders are linked to bias within technology products. Remedying the issues, they said, will require a broader approach to diversity, including hiring from colleges other than elite campuses and creating greater transparency in AI products.

"To date, the diversity problems of the AI industry and the issues of bias in the systems it builds have tended to be considered separately," authors Sarah Myers West, Meredith Whittaker and Kate Crawford wrote. "But we suggest that these are two versions of the same problem: issues of discrimination in the workforce and in system building are deeply intertwined."

"Narrow idea of the 'normal' person"

It's not only that AI may discriminate against some types of people, but that it "works to the advantage of others, reinforcing a narrow idea of the 'normal' person," the researchers wrote.


The report highlights several ways AI programs have produced harmful outcomes for groups that already face bias. Among them:

  • An Amazon AI hiring tool that scanned applicants' resumes relied on previous hires' resumes to set the standard for ideal candidates. Because those past hires were mostly men, the AI started downgrading applicants who attended women's colleges or whose resumes included the word "women's".
  • Amazon's Rekognition facial analysis program had difficulty identifying dark-skinned women. According to one report, it misidentified them as men, even though it had no problem identifying men of any skin tone.

"Deep concern"

New York University isn't the first to ring alarm bells over bias within AI. Groups such as the MIT Technology Review and the ACLU have documented problematic outcomes affecting areas such as hiring and criminal sentencing.

The problem stems from the deep-learning stage, when coders "teach" a program using training data, the MIT Technology Review noted. Programmers can introduce bias by relying on data sets that don't accurately reflect the world, such as facial-image collections that include very few black people.
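As a rough illustration of that failure mode, here is a minimal Python sketch. It is hypothetical and entirely synthetic, not drawn from the report or any real product: a classifier trained on data in which one group supplies 95 percent of the examples ends up noticeably less accurate for the underrepresented group, even when both groups are tested in equal numbers.

```python
# Hypothetical, synthetic demo of training-set bias: group "A" supplies
# 95% of the training examples, the way light-skinned faces dominated
# some facial-image datasets. Nothing here comes from a real system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Two-feature samples whose label boundary depends on the group."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift)
    return X, y.astype(int)

# Skewed training mix: 950 samples from group A, only 50 from group B.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=2.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                 np.concatenate([ya, yb]))

# Balanced test sets reveal the gap: the model is far less accurate
# for the group it barely saw during training.
for name, shift in [("A", 0.0), ("B", 2.0)]:
    Xt, yt = make_group(1000, shift)
    print(f"group {name} accuracy: {accuracy_score(yt, model.predict(Xt)):.3f}")
```

The model fits the decision boundary of the dominant group almost exclusively, so its errors concentrate on the group that was scarce in the training data.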

Programmers can also add bias by deciding which attributes are important -- such as gender. If a company's previous hires were mostly men, the program may learn to exclude women, as in the case of Amazon's hiring program, reinforcing a biased pattern of hiring. 
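A similarly stripped-down, hypothetical sketch shows how that happens: when a proxy token like "womens" correlates with past rejections, a text classifier trained on historical hiring decisions assigns it a negative weight and scores down any new resume containing it. The resumes and labels below are invented for illustration.

```python
# Hypothetical sketch of proxy-attribute bias in a hiring model. The
# resumes and hire/no-hire labels are invented; the point is that the
# token "womens", correlated with past rejections, gets a negative weight.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Historical decisions: with mostly-male past hires, resumes mentioning
# "womens" (a women's chess club, a women's college) were rejected.
resumes = [
    "software engineer python java",        # hired
    "software engineer c++ systems",        # hired
    "womens chess club captain python",     # not hired
    "data analyst sql python",              # hired
    "womens college graduate java",         # not hired
    "backend developer go kubernetes",      # hired
]
hired = [1, 1, 0, 1, 0, 1]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the proxy token is negative, so any new resume
# containing "womens" is scored down, regardless of qualifications.
idx = vec.vocabulary_["womens"]
print("weight for 'womens':", round(model.coef_[0][idx], 3))
```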

"The use of AI systems for the classification, detection, and prediction of race and gender is in urgent need of re-evaluation," the New York University researchers noted. "The commercial deployment of these tools is cause for deep concern."
