Is System 1 educable?

Maybe I am “prone to overconfidence” — most of us are — but there is good reason to think that the extensive, fact-based scientific knowledge about how our thinking actually works, and sometimes goes wrong, as described by Daniel Kahneman in his book “Thinking, Fast and Slow”, can be used to improve our thinking, our choices, and our decision making.

Why do we acquire new knowledge? To use it. How will we use this new knowledge about how we humans actually think? First, we must use it in any theories that depend upon how we think. Any such theories must be modified to take account of this new knowledge — particularly economic theories and sociobiological theories, but really all social theories.

In practice, education will be changed to reflect these new facts, interpersonal communication will be changed, mass communication will be changed, art and science will be changed. This assumes that these new scientifically established facts will diffuse, will be spread far and wide, through most of our cultures, to almost all people. This will take time, but it will happen because this knowledge is useful. People who acquire this knowledge will think better. Their choices and decisions will better correspond to reality. They will get more of what they want. They will want more of what is good for them because they will better know what is good for them. They will be happier, healthier, and live longer than people who continue to think and communicate and choose and decide crudely and poorly.

Groups, organizations, societies, and cultures that acquire and use this new knowledge will be more effective, more efficient, more likely to attain their goals.

Daniel Kahneman is a proper scientist. He and other psychologists conjecture, test, validate, and methodically record and report the results of psychological experiments. This is their job; this is how they see their role as scientists. They do not project into the future. But we can project into the future on the basis of sound principles of cultural and societal evolution. One such principle is: knowledge — useful information — spreads through human communication. It will spread on its own through diffusion, person to person, and it will spread faster if those who have it deliberately spread it to more people.

So we can’t expect Kahneman, in concluding his book, in the quotes from yesterday (repeated below), to be as confident as I am here. Indeed, overconfidence is a mistake we often make; most of us, on many occasions, are “prone to be overconfident.” But many of us are also sometimes depressed, even excessively pessimistic. That is a serious mistake too, since it leads to inaction and sometimes death.

“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to my age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor …,” “The decision could change if the problem is reframed …” And I have made much more progress in recognizing the errors of others than my own.”  — Kahneman p. 417.

“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. … We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally more difficult to recognize than perceptual illusions.” — Kahneman p. 417.

Kahneman says above that System 1 is “not readily educable”. System 1 itself may not be educable, but at least some of what it works on may be. Part of what System 1 works on is associative memory: information we acquire through learning and experience. Part of it consists of heuristics — rules of thumb, little associations and connections we have learned and which we use automatically via System 1. Since this is all information we have learned one way or another, it may well be improvable. Indeed, Kahneman gives examples of vast improvements in the associations and heuristics of System 1 in his chapter 22, “Expert Intuition: When Can We Trust It?” So experts can improve the associations and heuristics that System 1 operates on, and although not everyone can become an expert in everything, most people do become more or less expert in a few areas. Kahneman’s own chapter 22 thus demonstrates that the quality of at least some of the information System 1 works on can be improved. If we improve the information it works on, System 1 can give us better answers.

When can we trust an expert’s intuition? Kahneman worked with another scholar, Gary Klein, on this question:

“At the end of our journey, Gary Klein and I agreed on a general answer to our initial question: When can you trust an experienced professional who claims to have an intuition? Our conclusion was that for the most part it is possible to distinguish intuitions that are likely to be valid from those that are likely to be bogus. As in the judgment of whether a work of art is genuine or a fake, you will usually do better by focusing on its provenance than by looking at the piece itself. If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions. You can trust someone’s intuitions if these conditions are met.” — Kahneman p. 242.

The environment must be sufficiently regular and the expert must have learned its regularities.

“… [Some] experts may not know the limits of their expertise. … [they] … do have intuitive skills in some of their tasks, but they have not learned to identify the situations and the tasks in which intuition will betray them. The unrecognized limits of professional skill help explain why experts are often overconfident.” — Kahneman p. 242.

So we need to learn to evaluate our own intuitions by asking ourselves: Is the environment, the subject matter (stock prices, psychological evaluations, politics, chess, medical diagnosis, etc.), sufficiently regular? And if it is, have I really learned its regularities?

Can people learn these things? Can we learn how and when and where to be skeptical about System 1’s answers? Of course we can.