Weapons of Math Destruction
How Big Data Increases Inequality and Threatens Democracy
“A manual for the twenty-first-century citizen . . . relevant and urgent.”—Financial Times
NATIONAL BOOK AWARD LONGLIST • NAMED ONE OF THE BEST BOOKS OF THE YEAR BY The New York Times Book Review • The Boston Globe • Wired • Fortune • Kirkus Reviews • The Guardian • Nature • On Point
We live in the age of the algorithm. Increasingly, the decisions that affect our lives—where we go to school, whether we can get a job or a loan, how much we pay for health insurance—are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules.
But as mathematician and data scientist Cathy O’Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they’re wrong. Most troubling, they reinforce discrimination—propping up the lucky, punishing the downtrodden, and undermining our democracy in the process. Welcome to the dark side of Big Data.
Praise for Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
“Weapons of Math Destruction is the Big Data story Silicon Valley proponents won't tell. . . . [It] pithily exposes flaws in how information is used to assess everything from creditworthiness to policing tactics . . . a thought-provoking read for anyone inclined to believe that data doesn't lie.”—Reuters
“This is a manual for the twenty-first-century citizen, and it succeeds where other big data accounts have failed—it is accessible, refreshingly critical and feels relevant and urgent.”—Financial Times
“Insightful and disturbing.”—New York Review of Books
“Weapons of Math Destruction is an urgent critique of . . . the rampant misuse of math in nearly every aspect of our lives.”—Boston Globe
“A fascinating and deeply disturbing book.”—Yuval Noah Harari, author of Sapiens
“Illuminating . . . [O’Neil] makes a convincing case that this reliance on algorithms has gone too far.”—The Atlantic
“A nuanced reminder that big data is only as good as the people wielding it.”—Wired
“If you’ve ever suspected there was something baleful about our deep trust in data, but lacked the mathematical skills to figure out exactly what it was, this is the book for you.”—Salon
“O’Neil is an ideal person to write this book. She is an academic mathematician turned Wall Street quant turned data scientist who has been involved in Occupy Wall Street and recently started an algorithmic auditing company. She is one of the strongest voices speaking out for limiting the ways we allow algorithms to influence our lives. . . . While Weapons of Math Destruction is full of hard truths and grim statistics, it is also accessible and even entertaining. O’Neil’s writing is direct and easy to read—I devoured it in an afternoon.”—Scientific American
“Indispensable . . . Despite the technical complexity of its subject, Weapons of Math Destruction lucidly guides readers through these complex modeling systems. . . . O’Neil’s book is an excellent primer on the ethical and moral risks of Big Data and an algorithmically dependent world. . . . For those curious about how Big Data can help them and their businesses, or how it has been reshaping the world around them, Weapons of Math Destruction is an essential starting place.”—National Post
“Cathy O’Neil has seen Big Data from the inside, and the picture isn’t pretty. Weapons of Math Destruction opens the curtain on algorithms that exploit people and distort the truth while posing as neutral mathematical tools. This book is wise, fierce, and desperately necessary.”—Jordan Ellenberg, University of Wisconsin-Madison, author of How Not to Be Wrong
“O’Neil has become [a whistle-blower] for the world of Big Data . . . [in] her important new book. . . . Her work makes particularly disturbing points about how being on the wrong side of an algorithmic decision can snowball in incredibly destructive ways.”—Time
Crown, 9780553418835, 288pp.
Publication Date: September 5, 2017