Book Review: Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself
David Dunning (2005)
Psychology Press, Taylor & Francis Books
Reviewed by Dr Crista McDaniel
Practice Manager, Psychologists and Programmes, Central Region, Department of Corrections
Reviewer biography:
Crista has been with the department since 2003, first as a senior psychologist and then as a principal psychologist. Before coming to New Zealand she worked with civilian trauma victims, combat veterans, and Native Americans, and spent some years as a forensic examiner.
This is one of the most eye-opening books I have read, and I recommend it to you. It is about our ability to accurately assess our own limitations and competencies and to make use of feedback. It is research-based, entertaining, easy to understand, and thoroughly relevant to everyone. Who should read this book? Everybody – if you depend on the biological computer sitting between your ears to make sense of the world and respond competently, this is the book for you.
Ironically, this eye-opening book is all about our cognitive ‘blind spots’. These blind spots interfere with our ability to accurately assess our own performance and competence, and they can get in the way of the very feedback needed to make change.
Dunning opens his book with a discussion about climbing Mt Everest. He suggests that individuals who consider a dangerous climb like Mt Everest have an important reason to accurately assess their abilities – survival. Hopefully, they would consider their ability to climb over a deep crevasse on an unstable ladder, whether they could tolerate the extremes of temperature and weather, whether they are emotionally prepared for such an arduous climb, and the like. He notes that 1,496 people have ‘summited’ and 172 have died – roughly one death for every nine successful climbs. The question is, were they truly aware?
Some decades ago researchers began posing the following questions: Are people aware of their shortcomings? How well do people know themselves? What do people do with feedback? What they found may surprise you. In general, they found that “it is surprisingly difficult to form an accurate impression of self, even with the motivation to understand, we often reach flawed and sometimes downright wrong conclusions” (Dunning, 2005). Dunning adds, “If self insight is a goal we desire, we face many roadblocks and detours along the way, and each of us fails to reach that destination in some important fashion.” According to Dunning, “The glass of self insight isn’t empty, it’s just not half full.”
Some quick examples of the research findings include:
- Doctors’ beliefs about their understanding of thyroid disorders did not correlate with their actual level of knowledge (Marteau, Johnson, Wynne, & Evans, 1989).
- 94% of college professors who responded to a survey said they do above-average work, which is statistically impossible (Cross, 1997).
- When high school students were asked to rate their performance, 60% rated themselves as above average and only 6% rated themselves as below average.
- Smokers remain overconfident about their own health prospects. They know their risk is greater than that of non-smokers, but they significantly underestimate their personal risk (Strecher, Kreuter, & Kobrin, 1995).
Research has found that people’s impressions of themselves, either inflated or pessimistic (arrogant or humble), are not very closely anchored to their actual skill level. In general, people tend to overestimate their skill, knowledge, moral character, and place on the social ladder.
Researchers also found that people can be “blissfully unaware of incompetence,” noting that “the curse is that the skills necessary to produce competent responses are the ones needed to recognize whether one has acted competently.” Poor performers are often unaware of their poor performance and “indeed, cannot know how badly they are performing” (Dunning, 2005).
High performers, on the other hand, may be unaware that others do not share the same level of knowledge (Christiansen, Szalanski, & Williams, 1991).
“People are also unaware of their errors of omission. They have no magical insight into the numbers of solutions they could have reached but missed” (Dunning, 2005). “In a sense, when people judge themselves, they know what they know but have little or no awareness of their personal ‘unknown unknowns’” (Dunning, 2005).
People may also be unaware of their knowledge gaps. In 2002 the National Science Foundation surveyed Americans on their knowledge of scientific principles; fewer than half knew that lasers work by focusing light, and more than half thought humans and dinosaurs lived at the same time.
Dunning points out that while a wealth of knowledge can increase your confidence, some of that knowledge may be wrong. He notes that when people were asked which country exports the most olive oil, 53% thought it was Italy, when the answer is actually Spain; only 20% chose Spain. I, too, would have guessed Italy, because I have come to associate Italy with olive oil. Sometimes our knowledge is simply wrong.
Dunning, along with other researchers, notes that, “If people do not have factual/correct information, they tend to rely on a world of knowledge that might lead them to the right answer, but might not. Yet they can act confidently.” People also have the ability to argue anything, and this ability can cause problems. For example, firefighters were asked to argue either that firefighters who take risks succeed or that firefighters who are cautious succeed; compelling arguments were made for both assertions.
As I was reading this book, I thought that feedback might be important; however, feedback isn’t without its problems either. Dunning notes that feedback is probabilistic – there isn’t a one-to-one correspondence between choosing the right reaction and getting a reward. Feedback is often incomplete, just as experience is. Feedback can be hidden (for example, good behavior may go unnoticed while bad behavior is addressed), it can be ambiguous (a boy gets turned down for a date – was it something he did wrong, or was it her?), or it can be absent altogether. Feedback can also be very misleading. Dunning notes that after a particularly bad speech, a professor’s colleagues were hunkered down, wincing and giggling. When the professor asked them how he had done, he was told the speech had been remarkable and that people would be talking about it for years to come (all true, but not very helpful).
Dunning and other researchers note that people often have trouble giving difficult feedback about performance, and will often delay or avoid giving it. People also don’t like to receive difficult feedback – they prefer feedback that fits with their own perspectives. Researchers have found that we don’t approach feedback in an open and even-handed way: we shy away from feedback that is inconsistent with our self-assessments and, at times, misremember what was said.
Towards the end of the book, Dunning takes us back to Everest and suggests a different solution to the problem he initially posed: perhaps the important thing to understand is not ourselves but the relevant situation – although that judgment, too, depends on how well we take the specific features of the situation into account without distorting them.
So how do we make good judgments when we aren’t able to see or admit to our various ‘blind spots’? Dunning suggests that we use data from the past, seek outside data and other perspectives, learn to predict what others will do (which leads to better predictions about ourselves), use others as a crucial source of information, seek feedback (even difficult feedback), submit our decisions to review by others, and pay attention to what others say about us and our decisions. Dunning also reminds us that feedback needs to be specific and applied judiciously.
This review gives a taste of the research and how it applies to each of us. The real meal is in the book itself – read it and let me know what you think. I welcome your feedback!