
Reuters reports that Amazon developed a computer model designed to recruit and evaluate job candidates, on the premise that automating the process would be more efficient. There was only one problem: somehow, the artificial intelligence taught itself to prefer men over women.

According to the story, “the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way. That is because Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

“In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word ‘women’s,’ as in ‘women’s chess club captain.’ And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.” Amazon then edited the program to make it gender-neutral, but concluded that there was no way to be assured that other discriminatory tendencies wouldn’t creep into the system. And so, it disbanded the project.
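Reuters doesn’t describe Amazon’s model in any detail, but the dynamic it reports is easy to sketch: any scoring scheme fit to historically skewed hiring outcomes will end up penalizing words that correlate with the underrepresented group. The toy example below is entirely invented for illustration — fake data, simple log-odds token weights, nothing resembling Amazon’s actual system:

```python
# Toy illustration (NOT Amazon's code): a scorer trained on skewed
# historical hiring data learns a penalty for the token "women's".
from collections import Counter
from math import log

# Invented training data: (resume tokens, hired?) pairs reflecting a
# male-dominated history, where resumes mentioning "women's" were rejected.
training = [
    (["java", "chess", "club"], True),
    (["python", "robotics"], True),
    (["java", "women's", "chess", "club"], False),
    (["python", "women's", "coding", "group"], False),
    (["c++", "hackathon"], True),
]

def token_weights(data, smoothing=1.0):
    """Log-odds weight per token: positive favors hiring, negative penalizes."""
    hired, rejected = Counter(), Counter()
    for tokens, label in data:
        (hired if label else rejected).update(set(tokens))
    vocab = set(hired) | set(rejected)
    return {
        t: log((hired[t] + smoothing) / (rejected[t] + smoothing))
        for t in vocab
    }

weights = token_weights(training)
# The word "women's" gets a negative weight purely from correlation
# with past outcomes -- no one programmed the bias in explicitly.
assert weights["women's"] < 0
```

The point of the sketch is that the penalty emerges from the data alone, which is why, as the story goes on to note, editing the program doesn’t guarantee other learned biases won’t remain.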

The story points out that Amazon says it never depended on any such programs to reach conclusions about job candidates.

Reuters concludes that the case provides a lesson “in the limitations of machine learning,” especially since there is a “growing list of large companies including Hilton Worldwide Holdings and Goldman Sachs that are looking to automate portions of the hiring process.”
KC's View:
Jeez. Even the computer software is sexist.

No wonder women think they can’t get a break, and are taking to the streets. Can’t blame them, not even a little bit.