Just dropping this magnificent podcast episode off here.
Ifeoma Ajunwa, author of the upcoming book The Quantified Worker, goes pretty deep on automated hiring systems and how humans encode bias into AI-powered hiring tools. She shares examples of how hiring platforms powered by AI can be problematic and what that means for job seekers:
“A lot of hiring systems make use of machine learning algorithms. An algorithm is basically a step-by-step process for solving a known problem. What you have is a defined set of inputs and you’re hoping to get a defined set of outputs like hire, don’t hire, or in between. When you have machine learning algorithms, it kind of makes it murkier. You have a defined set of inputs, but the algorithm itself is learning. So the algorithm itself is actually creating new algorithms, which you are not defining. The algorithm is learning how you react to the choices it gives you. It creates new algorithms from that. It can become murky in terms of discerning what attributes the algorithm is defining as important because it’s constantly changing.”
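If it helps to picture that “algorithm creating new algorithms” part, here’s a rough toy sketch (my own, not from the episode) of a hiring model retrained on a recruiter’s reactions to its own shortlist, so the attributes it treats as important keep drifting. The feature names, data, and scikit-learn setup are all hypothetical, just for illustration:

```python
# Toy sketch (hypothetical, not from the episode): a screening model retrained on
# the recruiter's reactions to its own recommendations, so the learned feature
# weights keep shifting round to round.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
features = ["years_experience", "gap_in_resume", "elite_school", "referral"]

# Hypothetical historical screening decisions the model is first trained on.
X = rng.normal(size=(500, len(features)))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

for round_ in range(3):
    # The model scores a new batch of applicants and proposes a shortlist...
    batch = rng.normal(size=(200, len(features)))
    scores = model.predict_proba(batch)[:, 1]
    shortlisted = scores > 0.6

    # ...and the recruiter's reactions to that shortlist become the next round's
    # training labels, folding human preferences (and biases) back into the model.
    recruiter_says_yes = shortlisted & (batch[:, 2] > 0)  # e.g. quietly favors "elite_school"
    X = np.vstack([X, batch])
    y = np.concatenate([y, recruiter_says_yes.astype(int)])
    model = LogisticRegression().fit(X, y)

    weights = dict(zip(features, model.coef_[0].round(2)))
    print(f"round {round_}: learned feature weights {weights}")
    # The weights drift every round, which is why it's hard to pin down
    # which attributes the system currently treats as important.
```

Even in this tiny example, the printed weights change each round, which gets at why it’s so hard to audit what these systems are actually selecting for.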
Plus, she covers how employers spy on their workers. This is a must-listen for anyone curious about AI in the workplace.
Follow her @iajunwa.