Wednesday, November 2, 2016

Hidden algorithms that function as "weapons of math destruction"

Mathematics is, for many, a beautiful science.
But some vital algorithms that decide things that affect us directly (such as the granting of a loan or obtaining a job) are based on false or biased statistics that foster inequality and discrimination in the world.
At least, so says Cathy O'Neil.
The former professor at the prestigious Barnard College of Columbia University in the US, who worked as a data analyst on Wall Street, left academia and finance to become one of the most active members of the Occupy Wall Street (OWS) movement, which has denounced the excesses of the financial system since 2011.
Five years after the birth of that intellectual movement, O'Neil has just published her book, "Weapons of Math Destruction", which describes how algorithms govern our lives (and tend to disfavor the most disadvantaged).
"We live in the era of the algorithms," he writes mathematics.
"Increasingly extent, the decisions that affect our lives -a which school to go, if we may or may not get a loan or how much we pay for our sanitario- sure are not made by humans, but pormodelos mathematical".
In theory, he explains the specialist, this should lead to greater equity, so that everyone was tried under the same rules and bias is eliminated.
But, according to O'Neil, what happens is exactly the opposite.
The dark side of Big Data
The algorithms work like "recipes" created by computers to analyze large amounts of data.
An algorithm can recommend a movie or protect a computer from a virus, but that's not all.
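As a toy illustration of that "recipe" idea (my own sketch, not from the article), here is a few-line movie recommender: a fixed sequence of steps applied to data, picking the title whose genres overlap most with what a user already likes.

```python
# Hypothetical catalog mapping each movie title to its set of genres.
MOVIES = {
    "Alien": {"sci-fi", "horror"},
    "Heat": {"crime", "thriller"},
    "Arrival": {"sci-fi", "drama"},
}

def recommend(liked_genres, movies):
    """Return the title sharing the most genres with the user's tastes."""
    return max(movies, key=lambda title: len(movies[title] & liked_genres))

print(recommend({"sci-fi", "drama"}, MOVIES))  # prints "Arrival"
```

Real recommenders are far more elaborate, but the structure is the same: a scoring rule applied mechanically to data, with the designer's choices baked into the rule.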
(Some algorithms) are opaque: people do not understand how they work. And sometimes they are secret.
Cathy O'Neil, author of "Weapons of Math Destruction"
There are certain algorithms that O'Neil describes as "opaque, unregulated and irrefutable". But more worrisome, she says, is that they reinforce discrimination.
The first characteristic of these algorithms, O'Neil tells the BBC World Service, is that "they make very important decisions in people's lives".
For example, if a poor student in the US tries to get a loan, the system will reject them as too "risky" (by virtue of their race or neighborhood), cutting them off from an education that could lift them out of poverty and leaving them trapped in a vicious circle.
This is just one example of how these algorithms favor the luckiest and punish the most oppressed, creating a "toxic cocktail for democracy," O'Neil says.
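The loan example can be made concrete with a hypothetical sketch (mine, not O'Neil's): a toy scoring model in which the applicant's neighborhood is a feature, so two people with identical personal records receive different scores purely because of where they live.

```python
# Hypothetical "average default rate by neighborhood", learned from
# historical data that already reflects past inequality.
NEIGHBORHOOD_RISK = {
    "wealthy_suburb": 0.02,
    "poor_district": 0.30,
}

def loan_score(income, on_time_payments, neighborhood):
    """Toy score in [0, 1]; higher means more likely to be approved."""
    base = 0.5
    base += min(income / 200_000, 0.3)          # reward higher income
    base += 0.2 if on_time_payments else -0.2   # reward payment history
    base -= NEIGHBORHOOD_RISK[neighborhood]     # proxy penalty: where you live
    return max(0.0, min(1.0, base))

# Identical personal records, different addresses, different outcomes.
a = loan_score(40_000, True, "wealthy_suburb")
b = loan_score(40_000, True, "poor_district")
print(a, b)  # the second applicant scores lower
```

The neighborhood term acts as a proxy for race and poverty: the model never sees those attributes directly, yet it penalizes them anyway, and each rejection feeds back into the historical data that trains the next model.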
It is the dark side of Big Data.
