Amazon’s AI Recruiting Tool Backfired in More Ways Than One
A company experiment to use artificial intelligence in hiring inadvertently favored male candidates.
Amazon discovered a problem with using artificial intelligence for hiring: its AI was biased against women.
The Seattle-based company developed computer programs designed to filter through hundreds of resumes and surface the best candidates, Reuters reports. Employees had trained the tool, beginning in 2014, on resumes submitted to Amazon over a 10-year period, the majority of which came from male candidates. From that data, the tool learned to prefer male candidates and downgraded resumes from women. In addition to the gender bias, the tool also failed to suggest strong candidates, an Amazon spokesperson told Inc. The company decided to scrap the project early last year.
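The failure mode is easy to reproduce in miniature. The sketch below is purely illustrative (it is not Amazon's system, and all the data is made up): a naive resume scorer "trained" on historical hiring outcomes that skew male ends up penalizing any term that correlates with female applicants, even when the rest of the resume is identical.

```python
# Illustrative sketch only -- not Amazon's actual tool. Shows how a scorer
# trained on skewed historical hiring outcomes inherits that skew.
from collections import defaultdict

# Synthetic "historical" data: (resume terms, was_hired).
# Past hires skew male, so female-coded terms co-occur with rejections.
history = [
    ({"python", "chess club"}, True),
    ({"java", "football"}, True),
    ({"python", "robotics"}, True),
    ({"python", "women's chess club"}, False),
    ({"java", "women's soccer"}, False),
    ({"c++", "debate"}, True),
]

# "Training": record each term's historical hire rate.
stats = defaultdict(lambda: [0, 0])  # term -> [hires, appearances]
for terms, hired in history:
    for term in terms:
        stats[term][1] += 1
        stats[term][0] += int(hired)

def score(resume_terms):
    """Average historical hire rate across the resume's known terms."""
    rates = [stats[t][0] / stats[t][1] for t in resume_terms if t in stats]
    return sum(rates) / len(rates) if rates else 0.0

# Two resumes identical except one mentions a women's activity:
print(score({"python", "chess club"}))          # higher score
print(score({"python", "women's chess club"}))  # lower score -- the model
# has "learned" to penalize the female-coded term from skewed data alone
```

The scorer never sees a gender field; it penalizes the word "women's" simply because, in the skewed history, that term co-occurred with rejections. This is the same dynamic the article describes: the bias comes from the training data, not from any explicit rule.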
"They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those," a person familiar with the matter told Reuters. Amazon used the AI only in a trial phase and never relied solely on the recommendations, the spokesperson said.
Amazon's experiment highlights some of the limitations of using machine learning to automate business processes. Though using automation to optimize hiring is not new (40 percent of HR functions in international organizations use AI, according to research by PwC), computer scientists say algorithms designed to assess talent aren't yet fair, Reuters reports.