Amazon scraps secret AI recruiting tool that showed bias against women:

"Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.”

Instead, the technology favored candidates who described themselves using verbs more commonly found on male engineers’ resumes, such as “executed” and “captured,” one person said."

reuters.com/article/us-amazon-


@doina
This is the kind of nonsense that inspired my colleagues at Scrapinghub to write ELI5:
github.com/TeamHG-Memex/eli5
With common linear models it can directly explain the per-word scoring and even highlight passages of text in a Jupyter notebook. It can also do black-box analysis. You can view the scores for a whole lexicon directly and pick out irrational biases from the dataset. I have used it, and now I would never go without it. :)
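The per-word explanation ELI5 gives for linear models boils down to one idea: each word's contribution to the score is its learned coefficient times its feature value. A minimal stdlib sketch of that idea, with made-up coefficients (ELI5 itself does this automatically via `eli5.show_weights` / `eli5.show_prediction`):

```python
# Sketch of per-word score attribution for a linear text classifier.
# The weights here are invented for illustration, not real model output.

def explain_linear(weights, document):
    """Return each word's contribution (coefficient * count) to the score,
    sorted by absolute magnitude, largest first."""
    contributions = {}
    for word in document.lower().split():
        if word in weights:
            contributions[word] = contributions.get(word, 0.0) + weights[word]
    return sorted(contributions.items(), key=lambda kv: -abs(kv[1]))

# Hypothetical learned coefficients: positive pushes toward "hire".
weights = {"executed": 0.8, "captured": 0.6, "women's": -0.9}

doc = "women's chess club captain executed project plans"
for word, score in explain_linear(weights, doc):
    print(f"{word:>10s}  {score:+.2f}")
```

Inspecting such a table of coefficients is exactly how a dataset bias like the one Reuters describes would show up: a strongly negative weight attached to a word that should be irrelevant.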
