DataFramed

#50 Weapons of Math Destruction

November 26, 2018 Season 1 Episode 50
DataCamp
Cathy and Hugo discuss the current lack of fairness in artificial intelligence, how societal biases are perpetuated by algorithms, and how both transparency and auditability of algorithms will be necessary for a fairer future.
Show Notes

In episode 50, our Season 1, 2018 finale of DataFramed, the DataCamp podcast, Hugo speaks with Cathy O’Neil: data scientist, investigative journalist, consultant, algorithmic auditor, and author of the critically acclaimed book Weapons of Math Destruction. Cathy and Hugo discuss the ingredients that make up weapons of math destruction — algorithms and models that are important in society, secret, and harmful — from models that decide whether you keep your job, a credit card, or insurance, to algorithms that decide how we’re policed, sentenced to prison, or given parole. They discuss the current lack of fairness in artificial intelligence, how societal biases are perpetuated by algorithms, and how both transparency and auditability of algorithms will be necessary for a fairer future. What does this mean in practice? Tune in to find out. As Cathy says, “Fairness is a statistical concept. It's a notion that we need to understand at an aggregate level.” And, moreover, “data science doesn't just predict the future. It causes the future.”


LINKS FROM THE SHOW

DATAFRAMED SURVEY

DATAFRAMED GUEST SUGGESTIONS

FROM THE INTERVIEW

FROM THE SEGMENTS

Data Science Best Practices (with Heather Nolis ~20:30)

Data Science Best Practices (with Ben Skrainka ~39:35)


Original music and sounds by The Sticks.
