Speeding up Cultural Evolution

In previous posts I have discussed deficiencies in human thinking as described in Daniel Kahneman’s book “Thinking, Fast and Slow”. The discovery and explanation of these deficiencies can reinforce pessimism. We might conclude that human thinking is so messed up, so prone to failures, confusions, and mistakes, that it’s a wonder we ever do anything right. Our economic theories are mostly crap; our democracy has been captured by the 1%; wars continue. Our understanding of human thought and behavior is wrong. Why bother to try to change or improve anything? Since our thinking is full of errors, confusions, illusions, delusions, conceits, and unwarranted optimism, we would be foolish to try to fix anything that isn’t working or is working poorly.

If unwarranted optimism is bad, unwarranted pessimism is much worse, since it leads to inaction, depression, and even death.

So how can we be optimistic after reading “Thinking, Fast and Slow”? We can be optimistic because we can see this new knowledge as part of the larger process of cultural evolution. Groups of humans more or less working together — families, tribes, clubs, associations, corporations, towns, cities, states, nations, all organizations — change and evolve their social and individual behaviors by discovering, generating, and assimilating new ideas through science, art, and other social and individual activities. We invent new things, we make inspiring movies and videos, we speak and write stories and novels and poetry, we create images, symbols, illustrations, and paintings. This is the process of change and evolution of human groups and of human cultures in general. Evolution is more than just change: evolution builds on what came before, whereas change could be anything. The way cultures change is by building on what they already have. Thus cultures change through evolution.

Kahneman provides scientific facts about mistakes in human thinking that he and other psychologists discovered by doing psychological experiments on groups of people, mostly in the last half of the twentieth century. Some of the mistakes in thinking he describes may result from the physical structure of the human body and brain. Examples might be certain optical illusions and the fact that our memories do not store all the information about an event that we have at the time of the event (see the cold hand experiment — Kahneman p. 381-383). A person might think that if a deficiency results from such structural factors, then the error is inherent to the nature of humans. But humans are adaptable. I am not saying that we can learn to store in our memories all the relevant information for the cold hand experiment (although I suspect we could with training), or that we can train our sensory systems to avoid sensory illusions. But if we are aware of dangerous situations, if we learn the categories of situations in which mistakes sometimes or often occur, then we can work around them; we can avoid them by thinking in a different way. As Kahneman said:

“The way to block errors that originate in System 1 is simple in principle: recognize signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. This is how you will proceed when you next encounter the Müller-Lyer illusion. When you see lines with fins pointing in different directions, you will recognize the situation as one in which you should not trust your impressions of length. …” — Kahneman p. 417.

So Kahneman discovers and spreads the word about deficiencies in human thinking. This is not bad news. It’s good news. Now we know more about mistakes we often make and so we can correct these mistakes, work around them, or otherwise avoid them. This is progress. This is cultural evolution at work.

Most importantly, this new knowledge, as it spreads through the human population, will speed up cultural evolution, since cultural evolution depends on human creativity, which depends on human thinking, human choices, human decisions. So if we can learn to think better, to make better choices, to make better decisions, this will help us create new useful and beautiful things, help us create more humane and just social arrangements, and improve our individual behavior both with respect to ourselves (better physical and mental health) and with respect to others.

Many other things besides improved thinking have sped up cultural evolution and will speed it up in the future. Some are: cooperation, competition, care for others, maximal individual freedom, democracy, art, science, engineering, and many human inventions such as writing, printing, mechanized farming and transport, mass education, computers, and the expansion of interpersonal communications (cell phones, the internet).

And for the future, to speed up cultural evolution: perhaps a directed non-violent revolution — a pushed non-violent evolution toward a world whose economic and political systems will more justly distribute the goods produced by humans as a whole, so that each individual person has the basic human necessities in order to live and thrive. These include food, clothing, shelter, education, health, and maximal individual freedom consistent with the freedom and well-being of others, all in accord with the earth’s limited resources and preserving other life on earth.

This is not impossible. We can adopt this goal and work toward it. We will modify our economic and political systems carefully, one step at a time, always with the goal in mind, evaluating each step (did it get us closer to the goal? did it cause harm? did it have any unintended consequences?). Then repeat, repeat, repeat. This is trial and error. But trial and error is mostly all we have, here or in any other human activity. The grand, glorious theories have failed. Forget them. Maybe take some parts of them, some smallish principles, and see if we can use them to modify our present systems and move us closer to our just-distribution goal. Since this is a non-violent evolution we must build upon what we know now. If some idea from our present systems seems to bring us closer to our goal, use it, try it out, test it to see if it actually does work.
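The step-at-a-time process above can be sketched as a loop. This is only a toy illustration of trial and error, with a made-up numeric “distance from the goal” (here, unevenness of shares); real policy evaluation is of course nothing so simple, and every name below is hypothetical:

```python
def distance_to_goal(system):
    """Hypothetical score: how far a distribution is from the just-distribution
    goal. The 'system' is just a list of shares; the score is total deviation
    from an even split (0 would mean perfectly even shares)."""
    mean = sum(system) / len(system)
    return sum(abs(share - mean) for share in system)

def propose_small_change(system):
    """One small, careful modification: shift a little from the largest share
    to the smallest one, leaving the total produced unchanged."""
    new = list(system)
    i = new.index(max(new))
    j = new.index(min(new))
    transfer = 0.1 * (new[i] - new[j])
    new[i] -= transfer
    new[j] += transfer
    return new

def evolve(system, steps=50):
    """Trial and error: keep a change only if it moves us closer to the goal;
    otherwise discard it and try again. Then repeat, repeat, repeat."""
    for _ in range(steps):
        candidate = propose_small_change(system)
        if distance_to_goal(candidate) < distance_to_goal(system):
            system = candidate  # the step helped: build on it
        # otherwise revert -- the candidate caused no improvement
    return system

shares = [10.0, 1.0, 5.0, 2.0]
improved = evolve(shares)
```

Each accepted step builds on the last, which is the essay’s point about evolution: change that builds on what came before.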

But look for new ideas too. Especially those which look likely to speed up our directed revolution.

Perhaps the very idea that there is some grand and glorious theory that can explain, model, and predict human economic behavior is itself a monstrous example of the Illusion of Validity. See Kahneman, chapter 20, “The Illusion of Validity”.

The very idea that such theories exist, or must exist, or could exist if only we could find them, leads to a lot of wasted time and mental energy. Worse, this idea is pernicious for at least two reasons. First, the current candidates for grand theory are so wrong that they cause serious harm in the real world. Second, when people glom onto one such theory as the correct theory, the one and only true way, they cut themselves off from the possibility of change, and they try to cut everybody else off from the possibility of change and improvement too.

These are reasons why the revolution must not glom onto any grand economic theories. Discard them all: Capitalism, Socialism, Communism, Anarchism, etc. (What other “isms” are there?) At most take pieces, smallish parts, certain principles, certain ideas from any of them that look like they might make sense, might work in a new pragmatic framework evolved carefully from our present system. Then adopt them provisionally; check, test, and evaluate to determine whether each old idea actually works in our new evolving system, actually brings us closer to our just-distribution goal.

Down with Grand Theories. The only test for any modification of our systems, any policy change, any new law, should not be “does it conform to some theory?” but rather: does it bring our systems closer to our just-distribution goals?

Is System 1 educable?

Maybe I am “prone to overconfidence” — most of us are — but there are many reasons to believe that the extensive, fact-based scientific knowledge about how our thinking actually works, and sometimes makes mistakes, as described by Daniel Kahneman in his book “Thinking, Fast and Slow”, can be used by us to improve our thinking, our choices, and our decision making.

Why do we acquire new knowledge? To use it. How will we use this new knowledge about how we humans actually think? First we must use it in any theories that depend upon how we think. Any such theories must be modified to take account of this new knowledge — particularly economic theories, sociobiological theories, but really all social theories.

In practice, education will be changed to reflect these new facts, interpersonal communication will be changed, mass communication will be changed, art and science will be changed. This assumes that these new scientifically established facts will diffuse, will be spread far and wide, through most of our cultures, to almost all people. This will take time, but it will happen because this knowledge is useful. People who acquire this knowledge will think better. Their choices and decisions will better correspond to reality. They will get more of what they want. They will want more of what is good for them because they will better know what is good for them. They will be happier, healthier, and live longer than people who continue to think and communicate and choose and decide crudely and poorly.

Groups, organizations, societies, and cultures that acquire and use this new knowledge will be more effective, more efficient, more likely to attain their goals.

Daniel Kahneman is a proper scientist. He and other psychologists conjecture, test, validate, and methodically record and report the results of psychological experiments. This is their job; this is how they see their jobs as scientists. They do not project to the future. But we can project to the future on the basis of sound principles of cultural and societal evolution. One such principle is: knowledge — useful information — spreads through human communication. It will spread on its own through diffusion, person to person. And it will spread faster if those who have it deliberately spread it to more people.

So we can’t expect Kahneman, in concluding his book, in the quotes from yesterday (repeated below), to be as confident as I am here. Indeed overconfidence is a mistake we often make, and most of us, on many occasions, are “prone to be overconfident”. But many of us are sometimes depressed, even overly depressed. This is a serious mistake too, since it leads to inaction and sometimes death.

“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to my age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor …,” “The decision could change if the problem is reframed …” And I have made much more progress in recognizing the errors of others than my own.”  — Kahneman p. 417.

“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. … We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally more difficult to recognize than perceptual illusions.” — Kahneman p. 417.

Kahneman says above that System 1 is “not readily educable”. System 1 itself may not be educable, but at least some of what it works on may be. Part of what System 1 works on is associative memory — information we acquire through learning and experience. Part of it consists of heuristics: rules of thumb, little rules, little associations, little connections we have learned and which we use automatically via System 1. Since this is all information we have learned one way or another, it may very well be improvable. Indeed Kahneman gives examples of vast improvements in the associations and heuristics of System 1 in his chapter 22, “Expert Intuition: When Can We Trust It?” Experts can improve the associations and heuristics that System 1 operates on, and although not everyone can become an expert in everything, most people do become more or less expert in a few areas. So Kahneman’s own chapter 22 demonstrates that the quality of at least some of the information that System 1 works on can be improved. System 1 can give us better answers if we improve the information it works on.

When can we trust an expert’s intuition? Kahneman worked with another scholar, Gary Klein, on this question:

“At the end of our journey, Gary Klein and I agreed on a general answer to our initial question: When can you trust an experienced professional who claims to have an intuition? Our conclusion was that for the most part it is possible to distinguish intuitions that are likely to be valid from those that are likely to be bogus. As in the judgment of whether a work of art is genuine or a fake, you will usually do better by focusing on its provenance than by looking at the piece itself. If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions. You can trust someone’s intuitions if these conditions are met.” — Kahneman p. 242.

The environment must be sufficiently regular and the expert must have learned its regularities.

“… [Some] experts may not know the limits of their expertise. … [they] … do have intuitive skills in some of their tasks, but they have not learned to identify the situations and the tasks in which intuition will betray them. The unrecognized limits of professional skill help explain why experts are often overconfident.” — Kahneman p. 242.

So we need to learn to evaluate our own intuitions by asking ourselves: is the environment, the subject matter (stock prices, psychological evaluations, politics, chess, medical diagnosis, etc.), sufficiently regular, and if it is, have I really learned its regularities?

Can people learn these things? Can we learn how and when and where to be skeptical about System 1’s answers? Of course we can.

Changing people’s minds non-violently

The revolution is and must be about changing people’s minds non-violently. Even in wars, where threats and force and violence are used, the aim is still to change people’s minds. Threats and force and violence are not very efficient at changing people’s minds. One of the most important changes we want to see people make in themselves is to see that force and violence are poor ways of changing people’s minds. So if we used threats and force and violence to try to change people’s minds, we would be doing the opposite of what we want, since we would be teaching, through our own actions, that force and violence are acceptable ways to try to change people’s minds. And we would be less effective than we could be.

If we must not use violence, what can we do to change people’s minds? First, we cannot change someone else’s mind. We must communicate with them to help them change their own minds. There are well-established methods for doing this. All the non-violent and non-threatening methods of education are ways we help one another change our minds. There is also a practical and very successful counseling/negotiating/educating practice called non-violent communication — read the book “Nonviolent Communication” by Marshall Rosenberg. Every communication between two people changes both of their minds, more or less: sometimes very little or for a short period of time, sometimes very much and permanently, and everything in between.

When people are communicating with one another they are passing ideas and information back and forth, each to and from the other. They are thinking. They are using System 1 and System 2 as described in Daniel Kahneman’s book “Thinking, Fast and Slow”. System 1 includes our intuitions, our sensory-perception processing, language comprehension, and feelings — automatic and instantaneous mental activities, almost always not conscious. System 2 includes our calculations, ruminations, conscious choices, conscious decisions, etc. System 1 is Kahneman’s fast thinking; System 2 is slow thinking.

System 1 and System 2 work together. System 1 is based on associations, connections, similarities, metaphors. System 2 does calculations, elaborate comparisons, what we call logical thinking (reasoning). System 2 is lazy: if System 1 offers an immediate answer to some question or problem, System 2 might do nothing more than accept it. System 1 works on information associated immediately with the question or problem. For System 1, what it sees in its limited automatic way is all it has to work with. This Kahneman calls “What You See Is All There Is” — WYSIATI.

System 2 is likely to try to give an answer if System 1 does not come up with one. But if System 1 has a quick answer, System 2, being lazy, may just accept it without question, without further thought. Many mistakes in thinking occur because System 1, limited by its WYSIATI, gives a poor answer, and System 2, being generally lazy, doesn’t bother to check it. It is true that most of the time System 1’s answer is OK — it works well for most everyday activities — and so it’s fine that System 2 does not check these answers. Also, there are many System 1 answers that System 2 is never aware of, has no access to, and so couldn’t possibly check. And since System 2 is slow compared to System 1, System 2 would be hopelessly bogged down if it tried to check very many of System 1’s answers.

So this is the dilemma. System 1 is automatic and fast but is limited by WYSIATI and can be very wrong; while System 2 is slow and lazy and doesn’t have enough time to check very many of System 1’s answers.

Kahneman has documented numerous specific ways in which System 1 and System 2 get things wrong, numerous ways in which humans make mistakes in thinking. “Thinking, Fast and Slow” is a big book, both in scientific content and in sheer size (499 pages, 38 chapters, 2 appendices). On the basis of this immense collection of scientific facts, can we do anything to help humans make fewer mistakes in thinking? And if we could, would we speed up the non-violent revolution most of us want?

Daniel Kahneman does not seem to be optimistic:

“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to my age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor …,” “The decision could change if the problem is reframed …” And I have made much more progress in recognizing the errors of others than my own.”  — Kahneman p. 417.

“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. … We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally more difficult to recognize than perceptual illusions.” — Kahneman p. 417.

Maybe I am “prone to overconfidence” but …(to be continued).

On “Thinking, Fast and Slow”

I have just finished reading Daniel Kahneman’s book “Thinking, Fast and Slow”. It has many implications for how to make a revolution. The book is based on research Kahneman has done with Amos Tversky and others and on the work of other psychologists and a few economists. He mentions an area of study called behavioral economics but the book is based on psychological experiments.

The most well-known impact of the research described in this book is, or should be, on the simplistic economic theories that are now widely accepted. The book shows that their assumptions, their axioms, are wrong: not at all based on how humans actually behave, indeed contradicting how humans actually behave. Not only is much of the mathematics behind these theories wrong (see Steve Keen’s writings), but the assumptions about human behavior these theories start from are proved wrong by the psychological experiments described in this book. Economics as we have known it is, or should be, dead, dead, dead. To make a revolution we must first stop using false theories. We must spread the word and discredit economists and any others who continue to rely on false theories.

Beyond this there are many implications for how the revolution should be carried out. People think differently depending on how questions are asked or how problems are stated. These effects are called framing. Here is a simple example. If you are about to have a dangerous heart operation and you ask what your odds are, it matters a lot, to you, to doctors, and to almost everyone else, how the answer is phrased. One way to say it is: you have a 5% chance of dying. Another way to say it is: you have a 95% chance of making it. Framing matters. Heart surgeons who tell their patients “You have a 95% chance of surviving” instead of “You have a 5% chance of dying” will have more customers.

People generally are “loss averse” — they are much more unhappy about the idea of losing something they already have than they would be happy about gaining the same thing or something of equal value. We would like to convince the 1% that they would be better off if they were somewhat less rich in a more just and stable and productive system. We will have to overcome their loss aversion. We will have to convince them that the value for them of living in a system which distributes the productions of society more fairly in a stable and more productive system is way, way more than the value for them of the income or wealth they will lose. The same goes for those who will have different roles or jobs in a changed economic/political system.
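Loss aversion has a standard quantitative form in Kahneman and Tversky’s prospect theory. The sketch below uses the parameter estimates commonly cited from Tversky and Kahneman’s 1992 paper (curvature about 0.88, loss-aversion coefficient about 2.25); the numbers are illustrative averages, not a claim about any particular person:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: a gain of x feels like x**alpha,
    a loss of x feels like -lam * (-x)**alpha, so losses loom larger."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = prospect_value(100)    # subjective value of gaining $100
loss = prospect_value(-100)   # subjective value of losing $100
# abs(loss) is about 2.25 times gain: losing $100 hurts far more than
# gaining $100 pleases.
```

This is the asymmetry we would have to overcome: the 1% will weigh what they give up at more than twice what an equivalent gain would be worth to them, so the case for a fairer, more stable system has to be framed as a gain that outweighs the inflated felt loss.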

Kahneman describes two systems for thinking, which he calls System 1 and System 2. System 1 is the fast system; System 2 is the slow one. System 1 is roughly our intuitions. It works automatically and instantaneously, with our associative memory. If someone asks you if you like chocolate, you immediately know the answer through System 1. If someone asks you what 17 × 24 is, you must use System 2, because System 1 has no answer.

System 2 is generally lazy. So, at times when System 1 does have an answer, System 2 will accept it even if it is wrong. Most often System 1 is right, since its associations and rules of thumb work well enough for us in most of our daily activities. But System 1 doesn’t evaluate itself, and it can only work with the information it has, the information it can get by association immediately. Kahneman characterizes this with the phrase “What You See Is All There Is” — WYSIATI. What System 1 “sees” in any situation is all that it has to work with. System 1 is automatic and instantaneous, so it doesn’t have time to ruminate, make comparisons, or calculate; besides, those are what System 2 does. The interactions between System 1 and System 2 are where many mistakes in thinking occur. And we recently had a president (Bush II) who was proud to say he relied on his intuition, his gut, in making decisions.

Kahneman’s book “Thinking, Fast and Slow” is relevant to the present sorry state of our world. If you want to begin to understand us and our cultures, our economic and political systems, and how we might change them for the better, read this book.