A couple of years ago I read the million-selling book “Freakonomics”. While I’m not particularly fond of most economists, I think that Steven and Stephen are clearly alright, and they are great storytellers. They have since written another book, “Superfreakonomics”, but somehow I missed that, most likely because I mixed it up with a couple of similar books, like “The Undercover Economist” and “Errornomics”.
Whatever. While at the airport (there we go again) I picked up their newest book (or so I thought, only to find out that there is an even newer book out, “When to Rob a Bank: ...And 131 More Warped Suggestions and Well-Intended Rants”, which celebrates the tenth anniversary of “Freakonomics”). I expected another book with funny stories and smart observations. These are clearly to be found in this book as well, but this book is also much more instructional than the others, since this time the authors explain the principles of how to ‘think like a freak’. That’s why my Penguin pocket is subtitled “How To Think Smarter About Almost Everything”, while another version’s subtitle says “The Authors of Freakonomics Offer to Retrain Your Brain”.
Like “Freakonomics”, this book is highly entertaining, written in an enthusiastic style with a humorous note to it most of the time. This makes it very easy to read and digest. As a pocket book of only slightly over 200 pages, it can be finished on a mid-distance flight and back (with some waiting time at the airport) if you put your mind to it.
Thinking like a Freak - Quick Guide
Chapter 1 outlines what it means to think like a freak. This is by no means a magic wand, because problem solving is hard work and, as the authors say, if a given problem still exists, chances are that other people have tried before you and failed. But thinking like a freak may help. It means seeing that there is not necessarily a right or wrong way to think about a problem. One should rather think a bit more productively, creatively and rationally, and look at things from a different angle. This may not necessarily increase one’s popularity…
There is a brief rehash of the basic set of ideas that governed the previous books: 1) Incentives are the cornerstones of modern life. 2) Knowing what to measure, and how to measure it, can make a complicated world less so. 3) Conventional wisdom is often wrong. 4) Correlation does not equal causation.
This ‘economic’ approach relies on data, rather than ideology, biases or a ‘moral compass’, and tries to understand how the world works: how incentives succeed and fail, how resources are allocated, and what obstacles stand in the way.
The next chapter deals with the “Three Hardest Words in the English Language”, which appear to be “I Don’t Know”. Admitting your lack of knowledge is the first step in learning something. Often it’s hard to know what actually happened and why; it’s even harder to ‘know’ something about the future. Alas, people routinely pretend to know more than they do, often based upon political, religious or other views. Dogmatism is probably the attribute most likely to cause bad predictions.
To make things worse: humans often don’t even know themselves very well. So we are often overconfident about our abilities and knowledge, especially if we are good at one thing. But just because you’re good at something doesn’t make you great at everything! Alas, people often get away with their half-assed guesses posing as facts, because by the time things have played out the bluffers are usually gone, or forgotten.
Even worse, often the individual cost of saying “I don’t know” is higher than the cost of being wrong. Looking confident (even when you shouldn’t) is a way to protect your reputation, even though it’s not for the collective better.
A key to learning is feedback; it’s almost impossible to learn without it. One good way to get feedback is to experiment. This is rarely done, however, because of tradition (“that’s not how we do things here”), a lack of expertise in how to do it and (there we are again) the failure to admit a lack of knowledge. You don’t need a lab to run experiments; often real-world experiments are more valuable.
The final tip from this chapter is to say “I don’t know” more often, immediately followed by “But I can try to find out”.
The Right Questions
The next chapter then discusses the next step. After one has admitted to not knowing, it’s important to find the right question to ask, because if you ask the wrong question you will most likely get the wrong answer. Our perception of problems is often influenced by others, like peers or the media. Often we don’t stop to think about problems, because we think that we know what the problem is, or because we focus on the part of the problem that bothers us (but is not necessarily its core). Neither should one attack only the ‘noisy’ part of a problem (the part that captures one’s attention).
It helps to ask questions differently, because it makes us look for the answers in different places. Redefining the problem one is trying to solve is powerful. Interestingly, this is illustrated by the example of a speed-eating champion. This example also illustrates the possible problem with accepting limits and artificial barriers, partly because it’s even harder to solve a problem when it’s decided beforehand that it can’t be done…
Chapter 4 is all about root causes and how important it is to address those instead of the very visible symptoms. Hopefully this message is not new to safety professionals. A word of caution though: even when you get to the root cause of a problem, you still may be stuck… The chapter builds for a large part on the story of the cause of ulcers and its cure (“the power of poop”).
“Think Like A Child” is the next chapter’s title, because children have this great ability to ask questions and generate ideas, to be curious and relatively unbiased. Neither are they afraid to ask simple questions or observe things that are in plain sight. All great traits if one wants to think like a freak. Seeing things from a new angle can sometimes give you an edge in solving a problem. Preconceptions make us rule out many possible solutions because they seem unlikely, or for several other reasons. Still, as long as you can tell the difference between a good and a bad idea, generating many ideas (including outlandish ones) can be a good thing, because more ideas will also mean more good ones. It’s wise to apply a cooling-off period to ideas. They nearly always seem more brilliant when they’re conceived, but may be less so once you have had time to reflect a bit.
It’s also wise to think small: 1) Small questions are by their nature asked and investigated less often, or maybe not at all, so they hold much potential for learning. 2) Since big problems are usually a big bunch of intertwined small problems, you can often make more progress by tackling a small bit of the big problem. 3) Change is difficult, so triggering change on a small problem is more likely than on a big one. 4) By tackling small issues, chances are that you are relatively sure what you are talking about.
Kids aren’t afraid to like the things they like. Freaks like to have fun. Doing something you like supports learning and persevering: you want to do more of it. Most of the business world, however, remains allergic to fun. Why? Perhaps out of fear of appearing not serious? There appears to be no correlation between seriousness and being good at what you do. Actually, an argument can be made for the opposite: after all, people become experts by practicing endlessly, and how can you do that if you don’t like what you do? The best predictor of success in a job is that people like what they do. If people approach their job as just a job, they are unlikely to thrive.
Sticks and Carrots
Chapter 6 is about incentives, including some that have spectacularly misfired, big and small. People do respond to incentives; understanding this is an important step in solving problems. But you also have to understand which incentives work, and what works in which situation. Incentives aren’t only financial. In fact, non-financial incentives are often much more effective and considerably cheaper. To find out which, one has to get inside people’s minds and find out what really matters to them. Sometimes the incentives are not obvious, and most of the time you won’t find out by asking, because people say what they think you’d like to hear and not what they really think (the difference between declared and revealed preferences). There’s often a major gap between these… So don’t listen to what people say; watch what they do.
When designing an incentive scheme to ‘herd’ people into doing the right thing (even when they do it for the wrong reasons - possibly your incentive) it’s important to figure out what actually works and not only what you believe should work. The key is to think less about the ideal behaviour of imaginary people and more about the actual behaviour of real people. Real people are much more unpredictable. Remember also that the people whose behaviour you are trying to change often don’t think like you and therefore may react differently than you might expect.
Despite their major presence, moral incentives are not very effective at all. In fact they often backfire, for example by legitimizing undesirable behaviour. People react differently in different frameworks and this also means that incentives will work differently in different frameworks. Mixing up the frames may get you in trouble, but by nudging people from one frame into another you can achieve great results.
A simple set of rules for a working incentive scheme: 1) Figure out what people really care about, not what they say. 2) Incentivize them on the dimensions that are valuable to them, but cheap for you to provide. 3) Pay attention to how people respond. Learn from feedback and try to improve your scheme. 4) Try to create incentives that switch the frame from adversarial to cooperative. 5) Don’t imagine that people will do something just because it’s the ‘right’ thing to do. 6) Rest assured that some people will do everything to beat the system, often in ways that you haven’t imagined.
The best advice, however, comes right before this: “the best way to get what you want is to treat other people with decency. … It is most powerful when least expected, like when things have gone wrong”. Amen to that.
Chapter 7 asks the outrageous question of what David Lee Roth and King Solomon have in common. Quite a lot, as it turns out. One thing is that both applied game theory to reach their goals. In this chapter the authors explain the concept of the ‘self-weeding garden’: let people sort themselves into different categories with little effort on your part. This trick explains why Nigerian e-mail scams are actually quite effective.
The next chapter discusses how to persuade people who don’t want to be persuaded. In this chapter I would have expected some references to “Getting To Yes”, but there are none, despite a good deal of overlap.
Persuasion can be hard. It’s important to understand this, and why. More knowledge or education is not the way, and neither is being convinced that you are right. Even worse, having heavily invested in an opinion makes it more difficult to change it. It’s difficult to draw and keep people’s attention. Rather than trying to convince them of the worthiness of a goal, it’s much easier to trick or nudge people with the right cues and by changing default settings.
Some important points to remember: Appreciate that your opponent’s opinion is likely based less on fact and logic than on ideology and herd thinking. Despite this, he is the one who decides whether or not to take your point of view. If the argument doesn’t resonate with the recipient, you get nowhere. Accept that your argument is not as perfect as you believe. Contrary to popular belief, your message will become stronger if you also mention potential downsides; nothing is perfect, and it’s wise to be realistic by acknowledging potential flaws and unintended side-effects. It’s also important to acknowledge the strengths of your opponent’s arguments. You can learn from his point of view, or strengthen your own argument, and it signals that you take your opponent seriously. Name-calling is never effective, because it turns you into an enemy, not a possible ally. Finally, it’s important to tell stories, because they fill out the picture, capture attention and are powerful tools for teaching. A rule makes a much stronger impression once a story illustrating that rule is lodged in your mind.
The final chapter fittingly discusses “The Upside Of Quitting”. This may be counter-intuitive because: 1) Our culture often sees quitting as a sign of failure. 2) Sunk costs. 3) A tendency to focus on concrete costs and too little on opportunity costs: resources you spend on a dead end cannot be spent on other ways forward. Resources are limited, and you may not be able to solve tomorrow’s problem if you keep beating today’s dead horse. Quitting can also provide valuable feedback and thus a gain in the long run. Trial and error and tinkering are powerful tools for progress. It’s important to have the right culture of dealing with failure, or quitting. Demonizing it sends the wrong message: people will avoid it at all costs, including by cheating and hiding, which causes much bigger problems in the long run.
The last page (before the extensive section with end notes) says that “…there are no magic bullets. All we’ve done is to encourage you to think a bit differently, a bit harder, a bit more freely”. At the end of the day I think that HSEQ professionals definitely should master the skill of thinking like freaks! Worthwhile stuff.
ISBN 978-0-141-98007-2 (Penguin pocket)