Don’t trust that algorithm

Harvard Ph.D. warns of big data’s dark side in ‘Weapons of Math Destruction’

 


Whether we know it or not, complex algorithms make decisions that affect nearly every aspect of our lives, determining whether we can borrow money or get hired, how much we pay for goods online, our TV and music choices, and how closely our neighborhood is policed.
Thanks to the technological advances of big data, businesses tout such algorithms as tools that optimize our experiences, providing better predictive accuracy about customer needs and greater efficiency in the delivery of goods and services. And they do so, the explanation goes, without the distortion of human prejudice because they’re calculations based solely on numbers, which makes them inherently trustworthy.
Sounds good, but it’s simply not true, says Harvard-trained mathematician Cathy O’Neil, Ph.D. ’99. In her new book, “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy,” the data scientist argues that the mathematical models underpinning these algorithms aren’t just flawed; they are encoded opinions and biases disguised as empirical fact, silently introducing and enforcing inequities that inflict harm right under our noses.
The Gazette spoke with O’Neil, who once worked as a quantitative analyst and now runs the popular Mathbabe blog, about what she calls the “lie” of mathematics and her push to get data scientists to provide more transparency for an often too-trusting public.
GAZETTE: How did your work as a hedge fund quant prompt you to start thinking about how math is being used today? Had you given it thought before then?
O’NEIL: It absolutely had not occurred to me before I was a quant. I was a very naive, apolitical person going into finance. I thought of mathematics as this powerful tool for clarity and then I was utterly disillusioned and really ashamed of the mortgage-backed securities [industry], which I saw as one of the driving forces for the [2008] crisis and a mathematical lie. They implied that we had some mathematical, statistical evidence that these mortgage-backed securities were safe investments, when, in fact, we had nothing like that. The statisticians who were building these models were working in a company that was literally selling the ratings that they didn’t even believe in themselves. It was the first time I had seen mathematics being weaponized and it opened my eyes to that possibility.
The people in charge of these companies, especially Moody’s, put pressure on these mathematicians to make them lie, but those mathematicians, at the end of the day, they did that. It was messed up and gross and I didn’t want to have anything to do with it. I spent some time in risk, after I left the hedge fund, trying to still kind of naively imagine that with better mathematics we could do a better job with risk. So I worked on the credit-default-swaps risk model. The credit default swaps were one of the big problems [of the 2008 financial crisis] and then once I got a better model, nobody cared. Nobody wanted the better model because nobody actually wants to know what their risk is. I ended up thinking, this is another example of how people are using mathematics, brandishing it as authoritative and trustworthy, but what’s actually going on behind the covers is corrupt. 
GAZETTE: Big data is often touted as a tool that delivers good things — more accuracy, efficiency, objectivity. But you say not so, and that big data has a “dark side.” Can you explain?
O’NEIL: Big data essentially is a way of separating winners and losers. Big data profiles people. It has all sorts of information about them — consumer behavior, everything available in public records, voting, demography. It profiles people and then it sorts people into winners and losers in various ways. Are you persuadable as a voter or are you not persuadable as a voter? Are you likely to be vulnerable to a payday loan advertisement or are you impervious to that payday loan advertisement? So you have scores in a multitude of ways. The framing of it by the people who own these models is that it’s going to benefit the world because more information is better. When, of course, what’s really going on and what I wanted people to know about is that it’s a rigged system, a system based on surveillance and on asymmetry of information where the people who have the power have much more information about you than you have about them. They use that to score you and then to deny you or offer you opportunities.
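To make the scoring-and-sorting idea above concrete, here is a minimal, purely illustrative sketch in Python. It is not O’Neil’s model or any real lender’s system; the profile fields, weights, and cutoff are invented assumptions. The point is only that a judgment about who is “targetable” ends up encoded as arithmetic behind a single score.

# Purely hypothetical scoring sketch; fields, weights, and cutoff are invented.
from dataclasses import dataclass
import math


@dataclass
class Profile:
    income: float              # annual income in dollars (hypothetical field)
    zip_poverty_rate: float    # share of households in poverty in the person's ZIP code
    recent_loan_searches: int  # count of loan-related web searches


def vulnerability_score(p: Profile) -> float:
    # An opinion about who is worth targeting, written as arithmetic:
    # lower income, a poorer neighborhood, and loan searches all push the score up.
    z = -0.00002 * p.income + 2.0 * p.zip_poverty_rate + 0.3 * p.recent_loan_searches
    return 1.0 / (1.0 + math.exp(-z))  # squash into a 0-1 score


def sort_people(profiles, cutoff=0.5):
    # The score silently splits people into "winners" and "losers".
    targeted = [p for p in profiles if vulnerability_score(p) >= cutoff]
    ignored = [p for p in profiles if vulnerability_score(p) < cutoff]
    return targeted, ignored


people = [
    Profile(income=24_000, zip_poverty_rate=0.35, recent_loan_searches=4),
    Profile(income=120_000, zip_poverty_rate=0.05, recent_loan_searches=0),
]
targeted, ignored = sort_people(people)
print(f"targeted for payday-loan ads: {len(targeted)}, left alone: {len(ignored)}")

Neither person ever sees the weights or the cutoff, yet one is shown the ad and the other is not, which is the information asymmetry O’Neil describes.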
GAZETTE: How integrated are algorithms in our lives?
O’NEIL: It depends. One of the things that I noticed in my research is that poor people, people of color, people who have less time on their hands to be more careful about how their data are collected are particularly vulnerable to the more pernicious algorithms. But all of us are subject to many, many algorithms, many of which we can’t even detect. Whenever we go online, whenever we buy insurance, whenever we apply for loans, especially if we look for peer-to-peer lending loans.

Read more: http://news.harvard.edu/gazette/story/2016/10/dont-trust-that-algorithm/
