E**A
Weapons of Math Destruction, or the dark side of computing technology
This book is about the dark side of the Internet and computer technology, and it is not for the faint-hearted. Its author, Cathy O’Neil, is a veteran of the data business. With a PhD in number theory from Harvard, she began her career as a university math professor at Baruch College, CUNY. After a few years she moved on to the D.E. Shaw hedge fund, where she witnessed firsthand the destruction that computers and Big Data had unleashed upon the world’s economy. She then held a string of positions as a data scientist for several data processing companies before finally becoming a journalist. The story told in Weapons of Math Destruction is an account of her experience with Big Data, big money and computers.

Most of us know what Big Data is: a massive collection of information about everything we do, from telephone conversations, private emails, chats, credit card records, Twitter activity, purchasing habits, ‘likes’ and behavioral patterns to commuting and shopping habits and professional contacts. The list is endless. We may see this vast collection of data about us as somewhat threatening to our privacy and somehow impinging on our ‘rights’. But we are uncertain which rights it may impinge on, and how these data violate our privacy. Our common view is that a store of these data exists somewhere, and someone may be looking at it for some reason. But so what? If we do not know how Big Data is affecting us, maybe it does not affect us at all; maybe all these worries are simply a form of paranoia? And, after all, what is wrong with someone knowing what kind of cornflakes I like, or which photos I like?

And we are right, in a way. Big Data – even very big data – on its own is useless. What counts is how data is used, by whom, for what purpose, and how this use affects our lives, our choices and our opportunities. But for most of us this ‘how’ is totally unknown; it is hidden, invisible.
As O’Neil describes this process: “…the ocean of behavioral data (data about what we do) in the coming years will feed straight into artificial intelligence systems. And these systems will remain, to human eyes, black boxes. Throughout this process, we will rarely learn about the ‘tribes’ (statistical groups) we belong to or why we belong there. In the era of machine intelligence, most of the variables will remain a mystery” (p. 173). She continues: “These automatic programs will increasingly determine how we are treated by other machines, the ones that choose the ads we see, set prices for us, line us up for a dermatologist appointment, or map our routes. They will be highly efficient, seemingly arbitrary, and utterly unaccountable. No one will understand their logic or be able to explain it.” Weapons of Math Destruction is about precisely this ‘hidden’ element of Big Data.

Each chapter of the book describes one facet of our lives and its immersion in the world of Big Data and algorithms. One case looks at a few teachers who were fired from their jobs. Being fired is, of course, a sad part of life, but no one should be surprised to be let go for not doing the job properly. In this case, however, something was different. When one of the teachers asked the school board why she had been fired, nobody could give her an answer. Finally, after threatening the school with legal action, she was told that she had received a low score on the teacher evaluation score sheet. When she asked who did the scoring, she was told that nobody knew. After more digging, it turned out that the school board had employed a data processing firm with a proprietary teacher evaluation algorithm. However, claiming trade secrecy, the firm declined to disclose how the score was calculated and what variables went into the model. Thus, the teacher was fired for no known reason; she was effectively fired by an algorithm and data.
And nobody could say why.

A single incident with a teacher evaluation program should not be a cause for panic, or even serious concern, but there are others. The 2008 worldwide collapse of the banking system and of economies in most developed countries resulted in millions of lost jobs, family tragedies, broken lives and a loss of livelihood for countless individuals. It is difficult to comprehend the full scope of the destruction wrought by the 2008 economic collapse, just as it is impossible to summarize the many reasons for the crisis in a few sentences. One picture that emerges, however, is as follows: using faulty algorithmic models employed by banks and financial institutions, unsecured loans were repackaged into so-called collateralized debt obligations (CDOs) and sold as low-risk investments – with risk being calculated by complex algorithmic methods. In the subsequent mixing of bonds and already risky CDOs into new CDOs – so-called repackaging – nobody could ascertain the actual value or risk of these new CDOs, since everything was done by algorithms, and algorithms ‘do’ what they are told to do. As could be expected, these new products of dubious quality were rated by the algorithms as ‘low risk’, i.e. as secure investments. By 2007 the market for subprime mortgages and their derivatives had grown to around $3 trillion; the market for CDOs and other financial instruments built around them was twenty times as big. No country has an economy of this size. As this market of repackaged CDOs and other unsecured investment instruments was based solely on paper value derived by algorithms, at a certain point the economy collapsed. What is more, as O’Neil writes, “Paradoxically, the supposedly powerful algorithms that created the market, the ones that analyzed the risk in tranches of debt and sorted them into securities, turned out to be useless when it came time to clean up the mess and calculate what the paper was actually worth.
The math could multiply the horseshit, but it could not decipher it” (p. 43).

O’Neil describes similar algorithmic models used in the banking industry to assess someone’s loan rating, algorithms to speed up job applicant selection (about 70% of CVs submitted in job applications are rejected by a computer screening system), algorithms to process admission applications to colleges and graduate professional schools, algorithms to calculate individual insurance policy rates, algorithms deriving national university and college ratings, algorithms driving Amazon selections and suggested purchases, algorithms driving Google searches, algorithms constructing targeted political advertisements, and so on. All these algorithms feed on data about us, and in one way or another they all rate our value for particular objectives. The problem is that only a few of us know how we are rated and for what ends. And these few are certainly not us.

Teacher evaluations, the economic crash, college admissions, access to loans, job hiring, and so on. Does any of this matter? Taken in isolation, we might say that it does not matter greatly. But I would argue the following: when taken together, these events – and there are countless other examples out there (for instance, see the Guardian editorials on the shaping of voter preferences by the Leave campaign in the lead-up to the EU referendum) – paint a picture of a world in which someone is collecting extremely detailed information about the most private aspects of our lives and turning this information against us. In most critical decisions about our lives, we no longer matter; what matters is the data about us, the so-called proxies. Decisions affecting us are made by someone else, and we do not know why. We have no way of challenging or questioning such a decision, because in most cases we do not know how it was made.
And what is more, the decision makers do not know either.

With the advent of AGI, deep learning and the Internet of Things generating data about us, probably on the scale of the Large Hadron Collider, there will be no place to hide. It sounds like an absolute dystopia, but as things stand, the situation is not yet dystopic. This is how O’Neil ends her book: “Data is not going away. Nor are computers – much less mathematics. Predictive models are, increasingly, the tools we will be relying on to run our institutions, deploy our resources, and manage our lives. But as I tried to show throughout the book, these models are constructed not just from data but from the choices we make about which data to pay attention to and which to leave out. Those choices are not just about logistics, profits, and efficiency. They are fundamentally moral.”

A book is of value if it poses important questions, forces us to change our thinking, or discloses important aspects of the world previously hidden from view. Weapons of Math Destruction does all of these. The book reveals the dark underbelly of computer technology and shows how this technology invisibly but tangibly affects our lives. Most of us likely do not even suspect that the ‘world’ described by O’Neil exists, and some of us do not want to know. To be aware of this ‘world’, one must be submerged deep within computing technology, and few of us are. While the book is about computers, Artificial Intelligence, Big Data and algorithms, it also raises important moral and ethical questions about privacy, the role of computers in our lives, and traditional ethical values and why they matter. Many books explore the issues of computer security, privacy and Big Data. Few, if any, talk about the engines behind the Internet, i.e. algorithms and computer models. O’Neil’s book is unique in this way.
In the ongoing philosophical debate on the effect of the Internet on society and on people, Weapons of Math Destruction is a voice that should not be missed.
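The CDO mechanics summarised in this review can be made concrete with a toy simulation. This is my own sketch, not the banks' actual models: all figures (a 100-loan pool, a 10% attachment point for the 'safe' senior tranche, the default probabilities) are illustrative assumptions. The point it shows is the one the review makes: if an algorithm treats defaults as independent, the senior tranche looks nearly riskless; add a modest shared 'economy' factor with roughly the same average default rate, and the tranche is impaired an order of magnitude more often.

```python
import random

def impairment_rate(correlated: bool, trials: int = 10_000, seed: int = 0) -> float:
    """Fraction of simulated years in which losses on a 100-loan pool
    exceed the 10% attachment point protecting a 'low risk' senior tranche."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if correlated:
            # Shared 'economy' factor: 15% chance of a downturn in which
            # every loan's default probability jumps from 2% to 25%.
            p = 0.25 if rng.random() < 0.15 else 0.02
        else:
            # Roughly the same average default rate, but fully independent.
            p = 0.05
        defaults = sum(rng.random() < p for _ in range(100))
        if defaults > 10:  # losses eat through the junior tranches
            hits += 1
    return hits / trials

rate_indep = impairment_rate(correlated=False)
rate_corr = impairment_rate(correlated=True)
print(f"independent defaults: senior tranche impaired in {rate_indep:.1%} of years")
print(f"correlated defaults:  senior tranche impaired in {rate_corr:.1%} of years")
```

Both scenarios have a similar average default rate; only the correlation differs. A rating algorithm calibrated on the independent case would stamp the tranche 'low risk' and be badly wrong in the correlated world.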
B**G
An insider's view of how bad analysis of big data can ruin lives
As a poacher-turned-gamekeeper of the big data world, Cathy O'Neil is ideally placed to take us on a voyage of horrible discovery into the world of systems that make decisions based on big data with a negative influence on lives - what she refers to as 'Weapons of Math Destruction' or WMDs. After working as a 'quant' in a hedge fund and on big-data-crunching systems for startups, she has developed a horror of the misuse of the technology and sets out to show us how unfair it can be.

It's not that O'Neil is against big data per se. She points out examples where it can be useful and effective - but this requires the systems to be transparent and capable of learning from their mistakes. In the examples we discover, from systems that rate school teachers to those that decide whether or not to issue a payday loan, the system is opaque, secretive and based on a set of rules that aren't tested against reality and regularly updated to produce a fair outcome.

The teacher grading system is probably the most dramatically inaccurate example. Here the system tries to measure how well a teacher has performed based on data that has only a very vague link to actual outcomes - so, for instance, O'Neil tells of a teacher who scored 6% one year and 96% the next for doing the same job. The factors being measured are almost entirely outside the teacher's control, with no linkage to performance, and the interpretation of the data is simply garbage.

Other systems, such as those used to rank universities, are ruthlessly gamed by the participants, making them far more a measure of how good an organisation is at coming up with the right answers to metrics than of the quality of that organisation.
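A 6%-to-96% swing is roughly what a noisy score estimated from a couple of dozen students will produce. A toy simulation, under assumptions that are mine rather than the actual scoring model's (25 students per class, scores that are pure noise around an identical true effect), shows how wildly identical teachers move in the rankings from one year to the next:

```python
import random
import statistics

def value_added_score(rng: random.Random, true_effect: float = 0.0,
                      n_students: int = 25, noise_sd: float = 10.0) -> float:
    """One year's score: the mean of noisy per-student 'gains'
    attributed entirely to the teacher."""
    return statistics.mean(true_effect + rng.gauss(0, noise_sd)
                           for _ in range(n_students))

rng = random.Random(42)
N_TEACHERS = 100  # identical teachers: every true effect is zero
year1 = [value_added_score(rng) for _ in range(N_TEACHERS)]
year2 = [value_added_score(rng) for _ in range(N_TEACHERS)]

def percentile(x: float, scores: list) -> float:
    return 100 * sum(s < x for s in scores) / len(scores)

# How far the same teacher moves in the rankings between years:
swings = [abs(percentile(a, year1) - percentile(b, year2))
          for a, b in zip(year1, year2)]
print(f"largest year-to-year percentile swing: {max(swings):.0f} points")
```

Since every simulated teacher is identical, any ranking the scores produce is noise, yet some teacher will always swing from near the bottom to near the top - which is precisely what makes firing on such a score indefensible.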
And all of us will come across targeted advertising and social media messages/search results prioritised according to secret algorithms which we know nothing about and which attempt to control our behaviour.

For O'Neil, the worst aspects of big data misuse are where a system - perhaps with the best intentions - ends up penalising people for being poor or for being from certain ethnic backgrounds. This is often the result of an indirect piece of data - for instance, the place people live might carry implications about their financial state or ethnicity. She vividly portrays the way that systems dealing with everything from police presence in an area to the fixing of insurance premiums can produce a downward spiral of negative feedback.

Although the book is often very effective, it is heavily US-oriented, which is a shame when many of these issues are as significant in, say, Europe as they are in the US. There is probably also not enough nuance in the author's binary good/bad opinion of systems. For example, she tells us that someone shouldn't be penalised by having to pay more for insurance because they live in a high-risk neighbourhood - but she doesn't consider the contrary aspect: if insurance companies don't do this, those of us who live in low-risk neighbourhoods are penalised by paying much higher premiums than we need to in order to cover our risk.

O'Neil makes a simplistic linkage between high risk = poor and low risk = rich - yet those of us, for instance, who live in the country are often in quite poor areas that are nonetheless low risk. For O'Neil, fairness means everyone pays the same. But is that truly fair? Here in Europe, car insurance for young female drivers has doubled in cost to make it the same as for young males - even though the young males are far more likely to have accidents.
This is fair by O'Neil's standards, because it doesn't discriminate on gender, but it is not fair in the real world away from labels.

There's a lot here that we should be picking up on, and even if you don't agree with all of O'Neil's assessments, the book certainly makes you think about the rights and wrongs of decisions based on automated assessment of indirect data.
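The premium argument in this review is simple expected-value arithmetic. A sketch with made-up numbers (the claim size and probabilities are illustrative, not industry figures): under flat pricing, whatever the high-risk group underpays relative to its expected cost, the low-risk group overpays.

```python
CLAIM = 10_000               # average claim size (made-up figure)
P_HIGH, P_LOW = 0.10, 0.02   # annual claim probabilities (made-up)

cost_high = P_HIGH * CLAIM   # expected yearly cost of insuring a high-risk driver
cost_low = P_LOW * CLAIM     # ...and a low-risk driver

# Flat 'everyone pays the same' premium over a 50/50 pool:
flat = (cost_high + cost_low) / 2

print(f"risk-based premiums: high ${cost_high:.0f}, low ${cost_low:.0f}")
print(f"flat premium ${flat:.0f}: low-risk drivers overpay ${flat - cost_low:.0f}, "
      f"high-risk drivers underpay ${cost_high - flat:.0f}")
```

With these numbers the flat premium is $600: three times the low-risk group's $200 expected cost. Whether that cross-subsidy is 'fair' is exactly the question on which this reviewer and O'Neil part ways.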
B**N
Easy to read
The book was a good read. It explained in simple terms what big data is and how it affects everyday life. Cathy O’Neil has lit a candle in the darkness of big data. I am wiser for having read this book.
L**A
Brilliant book
I had wanted to read this book for some time, as it came up in discussions whilst I was studying economics. I had also seen the author in some documentaries and had been impressed by her appearances and specialist subject-matter knowledge. The book was just as brilliant and thought-provoking as I had hoped. It makes you want to go out there and help right the wrongs of so many of our dysfunctional systems. Highly recommend.
J**N
Light but interesting read
Like others have said, the book is light on details and only touches the tip of the iceberg that is the abuse, misuse and misinterpretation of statistical and machine learning models - what O'Neil calls weapons of math destruction (WMDs). All the examples are American, and as far as I know little of it applies to Europe. Yet they show how bad things can get with WMDs, so I take the book as a useful word of warning for countries other than the US.