Is System 1 educable?

Maybe I am “prone to overconfidence” (most of us are), but there are many reasons to think that the extensive, fact-based scientific knowledge about how our thinking actually works, and sometimes goes wrong, described by Daniel Kahneman in his book “Thinking, Fast and Slow”, can be used to improve our thinking, our choices, and our decision making.

Why do we acquire new knowledge? To use it. How will we use this new knowledge about how we humans actually think? First, we must use it in any theory that depends on how we think. Such theories must be modified to take account of this new knowledge: particularly economic and sociobiological theories, but really all social theories.

In practice, education will be changed to reflect these new facts, interpersonal communication will be changed, mass communication will be changed, art and science will be changed. This assumes that these new scientifically established facts will diffuse, will be spread far and wide, through most of our cultures, to almost all people. This will take time, but it will happen because this knowledge is useful. People who acquire this knowledge will think better. Their choices and decisions will better correspond to reality. They will get more of what they want. They will want more of what is good for them because they will better know what is good for them. They will be happier, healthier, and live longer than people who continue to think and communicate and choose and decide crudely and poorly.

Groups, organizations, societies, and cultures that acquire and use this new knowledge will be more effective, more efficient, more likely to attain their goals.

Daniel Kahneman is a proper scientist. He and other psychologists conjecture, test, validate, and methodically record and report the results of psychological experiments. This is their job; this is how they see their role as scientists. They do not project into the future. But we can project into the future on the basis of sound principles of cultural and societal evolution. One such principle is: knowledge, useful information, spreads through human communication. It will spread on its own through diffusion, person to person. And it will spread faster if those who have it deliberately spread it to more people.

So we can’t expect Kahneman, in concluding his book, in the quotes from yesterday (repeated below), to be as confident as I am here. Indeed, overconfidence is a mistake we often make; most of us, on many occasions, are “prone to be overconfident”. But many of us are also sometimes depressed, even excessively so. This is a serious mistake too, since it leads to inaction and sometimes death.

“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to my age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor …,” “The decision could change if the problem is reframed …” And I have made much more progress in recognizing the errors of others than my own.”  — Kahneman p. 417.

“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. … We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally more difficult to recognize than perceptual illusions.” — Kahneman p. 417.

Kahneman says above that “System 1 is not readily educable”. System 1 itself may not be educable, but at least some of what it works on may be. Part of what System 1 works on is associative memory: information we acquire through learning and experience. Part of it consists of heuristics: rules of thumb, little associations and connections we have learned and which we use automatically via System 1. Since all of this is information we have learned one way or another, it may well be improvable. Indeed, Kahneman gives examples of vast improvements in the associations and heuristics of System 1 in his chapter 22, “Expert Intuition: When Can We Trust It?” Experts can improve the associations and heuristics that System 1 operates on, and although not everyone can become an expert in everything, most people do become more or less expert in a few areas. Kahneman’s own chapter 22 thus demonstrates that the quality of at least some of the information System 1 works on can be improved, and that System 1 can give us better answers if we improve that information.
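
To make the point concrete, here is a toy sketch of my own (an illustration, not anything from the book; the medical cues are hypothetical): the fast lookup mechanism stays fixed, but swapping in better-learned associations yields better answers, which is all that educating what System 1 works on requires.

```python
# A toy model of the claim above: System 1 is the same fast lookup
# either way; the quality of its answers depends entirely on the
# stored associations, which learning can improve.

novice_heuristics = {"chest pain": "probably indigestion"}
expert_heuristics = {"chest pain": "rule out a cardiac event first"}

def system1(cue, heuristics):
    """Fast, automatic retrieval from associative memory."""
    return heuristics.get(cue, "no association comes to mind")

print(system1("chest pain", novice_heuristics))  # crude answer
print(system1("chest pain", expert_heuristics))  # trained answer
```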

When can we trust an expert’s intuition? Kahneman worked with another scholar, Gary Klein, on this question:

“At the end of our journey, Gary Klein and I agreed on a general answer to our initial question: When can you trust an experienced professional who claims to have an intuition? Our conclusion was that for the most part it is possible to distinguish intuitions that are likely to be valid from those that are likely to be bogus. As in the judgment of whether a work of art is genuine or a fake, you will usually do better by focusing on its provenance than by looking at the piece itself. If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions. You can trust someone’s intuitions if these conditions are met.” — Kahneman p. 242.

The environment must be sufficiently regular and the expert must have learned its regularities.

“… [Some] experts may not know the limits of their expertise. … [they] … do have intuitive skills in some of their tasks, but they have not learned to identify the situations and the tasks in which intuition will betray them. The unrecognized limits of professional skill help explain why experts are often overconfident.” — Kahneman p. 242.

So we need to learn to evaluate our own intuitions by asking ourselves: Is the environment, the subject matter (stock prices, psychological evaluations, politics, chess, medical diagnosis, etc.), sufficiently regular? And if it is, have I really learned its regularities?

Can people learn these things? Can we learn how and when and where to be skeptical about System 1’s answers? Of course we can.

On “Thinking, Fast and Slow”

I have just finished reading Daniel Kahneman’s book “Thinking, Fast and Slow”. It has many implications for how to make a revolution. The book is based on research Kahneman did with Amos Tversky and others, and on the work of other psychologists and a few economists. He mentions an area of study called behavioral economics, but the book itself rests on psychological experiments.

The most well-known impact of the research described in this book is, or should be, on the simplistic economic theories that are now widely accepted. The book shows that their assumptions, their axioms, are wrong: they are not based on how humans actually behave and in fact contradict it. Not only is much of the mathematics behind these theories wrong (see Steve Keen’s writings), but the assumptions about human behavior these theories start from are proved wrong by the psychological experiments described in this book. Economics as we have known it is, or should be, dead, dead, dead. To make a revolution we must first stop using false theories. We must spread the word and discredit economists and any others who continue to rely on them.

Beyond this there are many implications for how the revolution should be carried out. People think differently depending on how questions are asked or how problems are stated. These effects are called framing. Here is a simple example. If you are about to have a dangerous heart operation and you ask what your odds are, it matters a lot, to you, to your doctors, and to almost everyone else, how the answer is phrased. One way to say it is: you have a 5% chance of dying. Another way is: you have a 95% chance of making it. Framing matters. Heart surgeons who tell their patients “You have a 95% chance of surviving” instead of “You have a 5% chance of dying” will have more customers.
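
The arithmetic behind the two frames is trivial, which is the point: they carry identical information. A minimal sketch of my own (the surgery numbers are from the example above):

```python
# The two frames are complements of a single probability: the same
# information, presented two ways.

p_dying = 0.05
p_surviving = 1 - p_dying

print(f"You have a {p_dying:.0%} chance of dying.")          # loss frame
print(f"You have a {p_surviving:.0%} chance of surviving.")  # gain frame
```

Kahneman’s experiments show that people respond very differently to these two sentences, even though each is fully recoverable from the other.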

People generally are “loss averse”: they are much more unhappy about the idea of losing something they already have than they would be happy about gaining the same thing or something of equal value. We would like to convince the 1% that they would be better off somewhat less rich in a more just, stable, and productive system. We will have to overcome their loss aversion. We will have to convince them that the value, for them, of living in a system that distributes what society produces more fairly and more stably and productively is far greater than the value of the income or wealth they would lose. The same goes for those who will have different roles or jobs in a changed economic/political system.
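
Loss aversion has a standard quantitative form: the value function of prospect theory. The sketch below uses the parameter estimates Tversky and Kahneman published in 1992 (loss-aversion coefficient about 2.25, curvature about 0.88); the exact numbers vary across studies, but the asymmetry is the point.

```python
# Prospect theory's value function with Tversky & Kahneman's 1992
# parameter estimates. Exact values differ across studies; what
# matters is that losses loom larger than equal gains.

ALPHA = 0.88   # diminishing sensitivity to the size of gains/losses
LAMBDA = 2.25  # loss-aversion coefficient: losses weigh ~2.25x

def subjective_value(x):
    """Felt value of gaining (x > 0) or losing (x < 0) an amount x."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

print(subjective_value(100))   # gaining $100 feels like ~ +57.5
print(subjective_value(-100))  # losing $100 feels like ~ -129.5
```

In these terms, persuading the 1% means convincing them that the gains of a fairer, more stable system outweigh a loss that their own psychology multiplies by roughly two.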

Kahneman describes two systems for thinking, which he calls System 1 and System 2. System 1 is the fast system; System 2 is the slow one. System 1 is roughly our intuition. It works automatically and instantaneously, drawing on our associative memory. If someone asks whether you like chocolate, you immediately know the answer through System 1. If someone asks what 17 × 24 is, you must use System 2, because System 1 has no answer.

System 2 is generally lazy, so when System 1 does have an answer, System 2 will often accept it even if it is wrong. Most often System 1 is right, since its associations and rules of thumb work well enough in most of our daily activities. But System 1 doesn’t evaluate itself, and it can only work with the information it can get by association, immediately. Kahneman characterizes this with the phrase “What You See Is All There Is” (WYSIATI): what System 1 “sees” in any situation is all it has to work with. System 1 is automatic and instantaneous, so it has no time to ruminate, make comparisons, or calculate; besides, those are System 2’s jobs. The interactions between System 1 and System 2 are where many mistakes in thinking occur (a toy sketch of this division of labor follows below).

And we recently had a president (Bush II) who was proud to say he relied on his intuition, his gut, in making decisions. Kahneman’s book “Thinking, Fast and Slow” is relevant to the present sorry state of our world. If you want to begin to understand us and our cultures, our economic and political systems, and how we might change them for the better, read this book.
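
The division of labor described above can be caricatured in a few lines of code. This is my own toy sketch, not Kahneman’s model; the questions and the lookup table are invented for illustration.

```python
# A toy caricature of the System 1 / System 2 interaction: fast
# associative lookup first, with slow deliberate computation only
# as a fallback, and only if we bother to invoke it.

# System 1: an associative memory of learned, instant answers.
associative_memory = {
    "Do you like chocolate?": "Yes",
    "2 + 2": "4",
}

def think(question, compute=None):
    """Answer fast if System 1 has an association; otherwise fall
    back on slow, effortful System 2 computation."""
    if question in associative_memory:   # System 1: automatic, instant
        return associative_memory[question]
    if compute is not None:              # System 2: slow, deliberate
        return compute()
    return "no answer comes to mind"     # WYSIATI: nothing more to see

print(think("Do you like chocolate?"))             # fast: "Yes"
print(think("17 x 24", compute=lambda: 17 * 24))   # slow: 408
```

In this caricature, the mistakes Kahneman catalogs arise when the stored association is wrong and the slow computation is never invoked.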