Amazon reportedly scraps AI recruiting tool that was biased against women

To learn good from bad hires, Amazon’s team taught the AI to learn from the past 10 years of Amazon resumes, which were male-dominated.

Stakeholders who see artificial intelligence as a magical cure-all believe that, with the right software, a bunch of computer code can quickly and more accurately fix inefficiencies in anything. But machines are only as intelligent as the human intelligence that guides them.

Let Amazon’s reported recruitment software experiment serve as a cautionary tale about the limits of this magical thinking. From 2014 to 2017, Amazon built an internal artificially intelligent recruiting tool, but the plan backfired when the company realized that the system was biased against women, according to a new Reuters report.

Report: Amazon sought ‘holy grail’ of recruiting through AI

Reuters said Amazon set up an engineering team in Edinburgh, Scotland, to create 500 computer models to recognize some 50,000 terms that showed up in past resumes. The machine definitely learned, but it did not learn perfectly.

Amazon reportedly wanted a recruiting tool to quickly judge the qualifications of a large dataset of candidates. “Everyone wanted this holy grail,” one of the Amazon sources that Reuters interviewed said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

To distinguish good hires from bad ones, Amazon’s team trained the AI on the past 10 years of Amazon resumes, which were male-dominated. In 2014, Amazon’s workforce was about 63% male. This is how Amazon’s system taught itself that male candidates were preferable to female candidates. It downgraded resumes that included the word “women’s,” such as “women’s chess club captain,” Reuters said. It also favored candidates who used language found on male candidates’ resumes, like “executed” and “captured.”
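The mechanism is easy to reproduce in miniature. The toy sketch below (a hypothetical illustration, not Amazon’s actual system; the resumes and scoring rule are invented for this example) trains a naive term-frequency scorer on a historical pool where the “hired” resumes skew male. Terms that never appear in the hired pool, like “women’s,” automatically pick up a penalty:

```python
from collections import Counter

# Hypothetical training data: historical hiring outcomes where the
# "hired" pool is male-dominated. All text is invented for illustration.
hired = [
    "executed project captured market share led team",
    "executed strategy captured requirements shipped product",
    "led engineering team executed roadmap",
]
rejected = [
    "women's chess club captain led outreach program",
    "women's coding society organizer shipped product",
]

def term_weights(hired, rejected):
    """Weight each term by how much more often it appears per
    hired resume than per rejected resume."""
    pos = Counter(word for resume in hired for word in resume.split())
    neg = Counter(word for resume in rejected for word in resume.split())
    vocab = set(pos) | set(neg)
    return {w: pos[w] / len(hired) - neg[w] / len(rejected) for w in vocab}

def score(resume, weights):
    """Sum the learned weights of each term in a new resume."""
    return sum(weights.get(word, 0.0) for word in resume.split())

weights = term_weights(hired, rejected)

# "women's" never appears among the hired, so it carries a penalty,
# while "executed" and "captured" carry a bonus.
assert weights["women's"] < 0
assert score("executed captured", weights) > score("women's chess", weights)
```

The point of the sketch is that no one wrote a rule against women: the penalty on “women’s” falls out of the skewed training data alone, which is exactly the failure mode Reuters describes.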

Amazon reportedly scrapped the AI recruitment project by the beginning of 2017. Following the Reuters report, an Amazon spokesperson told Ladders that, “This was never used by Amazon recruiters to evaluate candidates.”

But the reported experiment points to a larger trend. Amazon is not the only company that has introduced artificial intelligence into employment practices or plans to. HireVue, a company with a “video interview intelligence platform,” uses AI to screen candidates for companies. More than half of human resources employees — 55% — said that they expect AI to be a regular feature in HR within the next five years, according to a 2017 CareerBuilder survey. Reuters reports that Amazon is not giving up on the idea of an AI recruitment tool entirely; a new Amazon team is reportedly working on an automated recruitment system “with a focus on diversity.”

Artificial intelligence can make connections more rapidly than humans can, but as Amazon’s reported experiment shows, those connections are not necessarily more accurate. AI can introduce new problems by perpetuating subconscious biases that were already there. Before employers start feeding an AI their judgments about what the perfect hire should be, they first need to examine their own biases about what an ideal candidate should look like.

This post has been updated with a comment from Amazon. 

Monica Torres is a reporter for Ladders and can be reached at mtorres@theladders.com.