Tech can sift through video evidence…but can it avoid bias?
Hardly anything happens these days that isn't caught on video. Cell phones, security cameras, drones, even doorbells have cameras built in. All of that video would seem to be evidence galore for law enforcement, except for a few problems. First: there's so much of it. Companies and law enforcement agencies are developing algorithms and machine learning tools to sift through all that video, looking for patterns, places or people. Second: that technology can have all the same biases and flaws as the people who designed it. Molly Wood talks about this with Kelly Gates, associate professor at the University of California, San Diego, who has studied the rise of forensic video evidence. (1/7/19)