The revolution is, and must be, about changing people's minds non-violently. Even in wars, where threats, force, and violence are used, the aim is still to change people's minds; but threats, force, and violence are inefficient means of doing so. One of the most important changes we want people to make in themselves is to see that force and violence are poor ways of changing minds. So if we used threats, force, and violence to try to change people's minds, we would be doing the opposite of what we want: we would be teaching, through our own actions, that force and violence are acceptable ways to change minds. And we would be less effective than we could be.
If we must not use violence, what can we do to change people's minds? First, we cannot change someone else's mind; we can only communicate with them to help them change their own minds. There are well-established methods for doing this: all the non-violent, non-threatening methods of education are ways we help one another change our minds. There is also a practical and very successful counseling, negotiating, and educating practice called non-violent communication; read the book "Nonviolent Communication" by Marshall Rosenberg. Every communication between two people changes both of their minds to some degree, sometimes very little or only briefly, sometimes very much and permanently, and everything in between.
When people communicate with one another they are passing ideas and information back and forth, each to and from the other. They are thinking, using System 1 and System 2 as described in Daniel Kahneman's book "Thinking, Fast and Slow". System 1 includes our intuitions, sensory processing, language comprehension, and feelings: automatic, nearly instantaneous mental activities that are almost never conscious. System 2 includes our calculations, ruminations, conscious choices, and conscious decisions. System 1 is Kahneman's fast thinking; System 2 is slow thinking. The two work together. System 1 runs on associations, connections, similarities, and metaphors. System 2 does calculations, elaborate comparisons, and what we call logical thinking (reasoning).

System 2 is lazy. If System 1 offers an immediate answer to some question or problem, System 2 may do nothing more than accept it, without question or further thought; System 2 is likely to work out an answer only when System 1 does not come up with one. System 1 works only on the information immediately associated with the question or problem: what it sees, in its limited automatic way, is all it has to work with. Kahneman calls this "What You See Is All There Is" (WYSIATI). Many mistakes in thinking occur because System 1, limited by WYSIATI, gives a poor answer, and System 2, being lazy, doesn't bother to check it. It is true that System 1's answer is usually OK, and works well for most everyday activities, so it is fine that System 2 does not check these answers. Moreover, many System 1 answers never reach System 2's awareness at all, so System 2 could not possibly check them. And since System 2 is slow compared to System 1, it would be hopelessly bogged down if it tried to check very many of System 1's answers.
So this is the dilemma: System 1 is automatic and fast but is limited by WYSIATI and can be very wrong, while System 2 is slow and lazy and doesn't have enough time to check very many of System 1's answers.
Kahneman has documented numerous specific ways in which System 1 and System 2 get things wrong, numerous ways in which humans make mistakes in thinking. "Thinking, Fast and Slow" is a big book both in scientific scope and in sheer size (499 pages, 38 chapters, 2 appendices). On the basis of this immense collection of scientific findings, can we do anything to help humans make fewer mistakes in thinking? And if we could, would that speed up the non-violent revolution most of us want?
Daniel Kahneman does not seem to be optimistic:
“What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to my age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor …,” “The decision could change if the problem is reframed …” And I have made much more progress in recognizing the errors of others than my own.” — Kahneman p. 417.
“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2. … We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally more difficult to recognize than perceptual illusions.” — Kahneman p. 417.
Maybe I am “prone to overconfidence” but …(to be continued).