Are computers racist? No, but people still are. (Replay)
Facial recognition software has made huge advancements in accuracy, but it has a long way to go — specifically when it comes to recognizing people of color. Commercially available software can tell the gender of a person using a photograph. According to researcher Joy Buolamwini, of the MIT Media Lab, that software is correct 99 percent of the time when it’s looking at a white male but is less than half as accurate when looking at a darker-skinned female. Marketplace Tech host Molly Wood spoke with Buolamwini about her research and the human biases that creep into machine learning. (This interview originally aired Feb. 13.)