When you really can't "know thy self" - what next?

Dr Crista McDaniel
Practice Manager Psychologists and Programmes Central Region, Department of Corrections

Author biography:
Crista has been with the Department since 2003, first as a senior psychologist and then as principal psychologist. Before coming to New Zealand she worked with civilian trauma victims, combat veterans and Native Americans, and spent several years as a forensic examiner.


“Now what is the message there? The message is that there are known ‘knowns’. There are things we know that we know. There are known unknowns. That is to say there are things that we now know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know. So when we do the best we can and we pull all this information together, and we then say well that’s basically what we see as the situation, that is really only the known knowns and the known unknowns. And each year, we discover a few more of those unknown unknowns.” — Donald Rumsfeld (n.d.)
“Life, too, is like that. You live it forward, but understand it backward. It is only when you stop and look to the rear that you see the corpse under your wheel.” – Abraham Verghese, Cutting for Stone (2009)

Introduction

Each of us will make countless decisions in a lifetime. Most of us would argue that our decisions are largely sound and rational; however, a number of researchers have discovered that people aren’t always as rational as they would like to believe. Their insights can teach us about potential blind spots and methods to mitigate the risks involved.

If someone asked you if you are competent at your job, what would you say? How would you rate your performance compared with that of your peers? How would you rate your ethics, compared to others? How would you assess your decision-making skills? Would your peers and managers agree with your self-evaluation?

I hope the following information, which is only a sample of the research on self-insight and decision making, will encourage you to explore this area for yourself. The research could influence how you assess competency (yours and other people’s), change how you approach feedback, and encourage more critical decision-making skills.

When reflecting on this article consider the following:

  1. People are very skilled at understanding human nature, but are not as skilled at self-examination. We can address our limited ability to self-reflect accurately by seeking regular observation, videotaping of our work, and/or regular feedback from others.
  2. Corrections has developed many processes to support good decision making. Our use of actuarial and structured clinical tools like the STABLE and DRAOR helps us capitalise on the System 2 thinking described in this article.
  3. While feedback is crucial in our growth and development, taking feedback can be a challenge. The main thing to remember is to approach any feedback with a sense of curiosity; this will pave the way for growth.
  4. When making organisational decisions around issues that have risk and/or ethical implications, don’t make them hastily or in isolation, and be sure to seek diverse opinions. Listen carefully to all the feedback to ensure robust decision-making. Remember, how we define the problem is important to the solution.

Self-insight

David Dunning (2014), a researcher at Cornell University, contends that despite a lifetime of considering our strengths, weaknesses and skills, and despite a strong motivation to assess ourselves accurately, “we often reach flawed and sometimes downright wrong conclusions.” Further, he argues that we are consistently better at evaluating other people than at assessing ourselves, adding that “it is surprisingly difficult to form an accurate opinion of self even when there is motivation to understand.”

Dunning, along with other researchers, has discovered that many people tend to be overconfident when assessing their performance, with more than 50% believing their performance is above average compared to their peers. While self-ratings did decline somewhat with decreasing knowledge or skill, the gap between self-assessment and actual performance widened as an individual’s performance became poorer (Yarkoni, 2010). This finding has been replicated frequently enough that it has been dubbed the “Dunning-Kruger effect” after David Dunning and Justin Kruger, whose article was published in 1999.

Studies have revealed that peers, who know something about an individual, tend to be more accurate in their perceptions. For example, ratings by peers and supervisors predicted how well surgical residents would do on final exams better than the residents’ self-ratings did (Risucci, Tortolani & Ward, 1989). Roommates were better at predicting how robust their roommate’s romantic relationship was than the roommate’s own prediction (MacDonald & Ross, 1999).

Researchers have likewise consistently found that people tend to be overly optimistic when they consider the likelihood of an event. For example, although marriage statistics suggest a 40 to 50% likelihood of divorce for first marriages in the United States, 0% of the couples sampled believed that their own relationship would end in divorce (Sharot, 2012). Bungee jumpers in one study thought they would be less likely to be harmed in a jump than the typical bungee jumper (Middleton, Harris, & Surman, 1996). In one sample, 94% of college professors believed they did above average work (Cross, 1977), and elderly drivers thought they were better drivers than other individuals their age (Marottoli & Richardson, 1998). Loftus and Wagenaar (1988) found that lawyers overestimated the likelihood of winning cases that went to trial.

Dunning and his colleagues have come to believe that individuals don’t recognise their own incompetence, defined as “performing poorly in some specific domain”, “because the skills needed to perform a task competently are the same skills needed to judge competent performance.”

Dunning (2014) asserts that incompetent individuals tend to be overconfident and don’t seem to experience uncertainty about their level of skill or knowledge.

Conversely, Dunning and colleagues (2005) observed that top performers were often unaware of their expertise relative to their peers because they tended to assume that their peers had the same skills and knowledge. Both findings suggest that determining who among us is an expert is neither easy nor straightforward.

Put in a work context, a medical specialist’s advice to his or her client is based on that specialist’s understanding of medicine. If that knowledge is incomplete or incorrect, the advice could be of limited usefulness or even harmful. As Dunning (2014) explains, “the incompetent mind is not an empty place; instead it is filled with irrelevant and misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of real knowledge.” Consider the following example: an authoritarian and aggressive parent believes that harsh discipline is required and that this parenting style is superior to gentler methods. When asked about harsh methods, this parent may provide numerous “logical” explanations for harsh discipline while being completely unaware of any other options.

Dunning and colleagues point out that we typically know what we know and what we don’t know, but are unaware of any gaps or incorrectly learned information – this missing information is part of the “unknown unknowns.” As Dunning explains, a beginning chess player facing a master can never know how many other possible plays the master could have made each time a piece was moved.

You may believe that you are not one of the incompetent individuals described by researchers. You may think you are above average in your knowledge and skill, but it is critical to understand that each of us will have tasks where our level of competence is questionable. Due to the limitations of our knowledge and understanding, we will be victims of the Dunning-Kruger effect, and we won’t know it when it happens – in fact, we can’t know it! This potential lack of awareness is why all of us need feedback from peers, colleagues, mentors, and supervisors. We need access to other perspectives. We also need to ensure some of that perspective-taking is independent of our particular work environment, to allow a more impartial assessment of our skills and competence.

The gold standard for assessing competency and skill is direct observation and videotaping. Observation provides a direct view of our practice and gives the individual and supervisor a platform where strategies to enhance performance can be discussed. While direct observation tends to occur regularly in university and training settings, that same level of feedback may not continue once the individual enters the workforce and moves into more senior positions.

Feedback

While feedback opportunities provide an important method of improving skills and competency, giving and receiving feedback is not a straightforward task, and feedback doesn’t always work as expected. Feedback may be absent, biased, spurious or inconsistent. In addition, we tend to seek out and respond to feedback that supports our self-image (Dunning, 2014). Some people are more open to feedback than others, some environments are more conducive to feedback than others, and many people receive little or no training in how to give or receive feedback. Stone and Heen (2014) note that it is the receiver who controls what feedback is taken up, yet most training on feedback is given to the person providing it.

Carol Dweck (2007), a Stanford University psychologist, has been studying success and achievement for decades and has proposed two types of mindsets that can affect our ability to receive feedback. She explains that a mindset is a set of beliefs an individual holds about his or her qualities and abilities. One mindset contains a fixed view of intelligence or talent. When individuals display a fixed mindset, they tend to prove and document their intelligence and skills rather than develop them. They worry about their performance, feel they must prove themselves, and are in competition with others. Mistakes or setbacks are punishing, and feedback can be perceived as criticism. In the other mindset, labelled “growth”, there is a belief that change can occur with hard work and dedication, a love of learning, an ability to take risks and make mistakes, and a greater sense of resilience. It is possible to have a fixed mindset about one area of life while holding a growth mindset in another. However, the growth mindset provides the most fertile ground for feedback. The good news is that people can learn how to move into a growth mindset. To promote it, we need to provide a safe environment for feedback and reward the individual’s efforts and strategies rather than talent or innate ability.

Dunning’s research appears to support Dweck’s assertion that feedback should target strategies, skills, and effort. Dunning (2005, 2014) reports that incompetent individuals did develop the ability to assess their performance more accurately once they learned the skills needed to produce a competent performance. At that point, these individuals became more metacognitively capable of evaluating their performance, and this knowledge not only raised their performance but allowed them to reflect more accurately on their previous lack of skill.

System 1 and System 2 reasoning

So what are the thought processes that underpin decision making? The dual process theory of reasoning suggests that reasoning can be divided into two hypothetical systems (Evans & Over, 1996).

System 1 thinking can be conceptualised as a pattern detector. Reasoning in this system is based on prior experience and beliefs. It is fast, associative, and intuitive, and can achieve results without awareness. System 1 thinking tends to work well in many situations.

System 2 thinking can be conceptualised as analytical and rule-based reasoning. System 2 reasoning tends to be slow, serial, effortful, and deliberately controlled. When individuals tire of the effort required for System 2 reasoning or find themselves under tight time constraints, they can slip back into using shortcuts that can lead to faulty conclusions. Daniel Kahneman (2011) explained that failures can occur in both systems, but they usually occur when System 1 generates the error and System 2 fails to detect it.

Lehrer (2012) describes how Daniel Kahneman, Nobel Laureate and Professor of Psychology, began studying our decision-making processes by asking simple questions such as: a bat and a ball cost a dollar and 10 cents; the bat costs a dollar more than the ball; how much does the ball cost? He discovered that a majority of people who were tested answered quickly and confidently, but were wrong, including individuals working on advanced degrees in maths and sciences. While ten cents was often given as the answer, the correct answer is actually five cents (the ball costs $0.05, the bat $1.05, and together they cost $1.10).
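
Worked through step by step (a quick algebraic check of the problem above, not part of the original studies), the correct answer falls out directly. If the ball costs x, the bat costs x + $1.00, so:

  x + (x + 1.00) = 1.10
  2x = 0.10
  x = 0.05

The ball costs five cents and the bat $1.05. The intuitive answer of ten cents would make the bat $1.10 and the total $1.20.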

Dr Kahneman has also discovered that when people are faced with uncertain situations, they often do not evaluate the information or relevant statistics. Instead, they depend on a long list of mental shortcuts and default to an answer that requires the least mental effort, often leading to a wrong conclusion. He noted that many factors outside our awareness influence our judgments, attitudes, and behaviour. System 1 thinking reduces ambiguity by building a coherent story from the data we have. However, we may find patterns where none exist and believe in something we should doubt. He explains that System 1 intuition can “feel right”, and that can lead to overconfidence. System 2 thinking allows us to evaluate our stories and patterns sceptically. He encourages us to make judgments based on probability and base rates and to question our assumptions. Kahneman, along with other researchers, reports that intelligent people may be more vulnerable to thinking errors.
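
To make base rates concrete, consider a minimal sketch, assuming an invented screening scenario (the numbers below are illustrative and are not drawn from Kahneman’s work). System 1 latches onto the impressive-sounding 90% accuracy figure; applying Bayes’ rule with the base rate shows that most flagged cases would still be false alarms:

```python
# A minimal sketch of base-rate reasoning using Bayes' rule.
# All numbers are illustrative assumptions, not figures from the research cited.

base_rate = 0.05       # P(high risk): only 5% of cases are genuinely high risk
sensitivity = 0.90     # P(flagged | high risk): the tool catches 90% of them
false_positive = 0.10  # P(flagged | low risk): it also flags 10% of low-risk cases

# Total probability that any given case is flagged
p_flagged = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' rule: probability that a flagged case is genuinely high risk
posterior = (sensitivity * base_rate) / p_flagged

print(f"P(high risk | flagged) = {posterior:.2f}")  # about 0.32
```

In this sketch, roughly two out of three flagged cases are actually low risk – the kind of conclusion System 1 rarely reaches on its own.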

Ethical decision-making

Researchers have also established that overconfidence extends to how moral or altruistic we expect our behaviour to be compared with that of our peers. Respondents consistently claimed that they were more likely than their peers to perform altruistic or ethical acts, but when placed in circumstances where they needed to demonstrate this, respondents did not behave as they had predicted (Balcetis & Dunning, 2008, 2013; Epley & Dunning, 2000, 2006). This overly positive estimation of our personal ethics, coupled with hindsight bias (a tendency, after an event, to view the event as having been predictable), means that we may be very hard on others when errors occur.

Bazerman and Tenbrunsel (2011) have studied the gaps that occur between our desired behaviour and our actual behaviour. Like Kahneman, they note that many of the factors that underpin our decision making sit outside awareness, but they hypothesise that some of the gap between who we believe ourselves to be and who we are may be connected to a phenomenon called bounded awareness, defined as “the tendency to place arbitrary or dysfunctional bounds around the definition of a problem.” Bounded awareness occurs when we narrowly and erroneously define a problem. They provide the example of Albert Speer, a Nazi government official who described his role as “administrator” and convinced himself that the issues he dealt with were not human related. Individuals may make “business” decisions or “engineering” decisions which don’t feature all the information needed to make a sound or ethical decision. These authors believe that many instances of unethical behaviour by individuals and organisations are unintentional and a product of bounded awareness and ethical fading (the removal of ethics from decision making).

Bazerman and Tenbrunsel (2011) explain that “errors in human decision making are more likely to occur when people are expected to make quick decisions. These types of errors are particularly important in today’s world where fewer people are being asked to do more work, with more interruptions, and as quickly as they can.” They add that “not surprisingly, decision making tends to be most ethically compromised when our minds are overloaded. The busier you are at work, the less likely you will notice when a colleague cuts ethical corners or when you go over the line.” They also explain that it is quite common for people to have emotional System 1 reactions to ethical problems; however, these responses may be at odds with the decision that would be made if the ethical issue were given more consideration. System 1 thinking – our “gut instinct” – is likely sufficient for most decisions, but they warn that for more serious ethical decisions individuals need both System 1 and System 2, so that consideration and planning can be brought to bear on the issue.

Implications

In summary, research indicates that we are very astute at predicting other people’s behaviour, but are not as skilled at evaluating our own. When assessing others, we depend on observable, objective data and account for the environment, but when assessing our own behaviour we contend with our justifications, explanations, and experiences, which confound our observational skills.

Self-insight is notoriously unreliable. It is subject to all sorts of bias and distortion; thus, this author recommends that you do not depend on self-assessment for competency related issues. Regular observation and videotaping remain the gold standard for assessing and supporting competent performance. They also provide a platform for discussing skills and strategies that sidesteps many of the distortions described above.

Training people to avoid bias, or depending on intelligence to guard against bias and distortion, is unreliable. As Kahneman, who has studied decision-making processes for decades, explains, “Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.”

Feedback from others is crucial; however, feedback is not straightforward. Few people have been trained in how to give or receive feedback, and the person who most needs that training is the receiver, who ultimately controls what is taken on board. A “growth” mindset supports feedback and learning.

While System 1 thinking is good for many decisions we make, this author would warn against making complex or risk-laden decisions using System 1 thought processes. Many professions are using structured judgment tools to support a more reasoned, System 2 approach to problem solving. It is important to remember that time constraints can impair System 2 analysis.
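
As a purely hypothetical illustration of how structured tools scaffold System 2 thinking (the items, ratings and bands below are invented and do not reflect the STABLE, the DRAOR, or any real instrument), a structured judgment forces the decision maker to rate each factor separately and deliberately before reaching an overall conclusion:

```python
# A hypothetical structured-judgment sketch. Items, ratings, and bands are
# invented for illustration; they do not reflect STABLE, DRAOR, or any real tool.

ITEMS = ["impulsivity", "negative mood", "attitudes", "social supports"]

def structured_rating(ratings: dict) -> str:
    """Require an explicit 0-2 rating for every item, then band the total."""
    for item in ITEMS:
        if ratings.get(item) not in (0, 1, 2):
            raise ValueError(f"Missing or invalid rating for '{item}'")
    total = sum(ratings[item] for item in ITEMS)
    return "high" if total >= 6 else "moderate" if total >= 3 else "low"

print(structured_rating({"impulsivity": 2, "negative mood": 1,
                         "attitudes": 1, "social supports": 0}))  # -> moderate
```

The value lies not in these particular items but in the discipline: the structure slows the decision down and makes each judgement explicit and reviewable.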

How we frame and define the issues we face in our work may directly influence whether we reach safe, ethical decisions. It is also important to remember that we are likely to make better ethical decisions when we apply a reasoned method rather than simply reacting on intuition.


References
Bazerman, M.H., & Tenbrunsel, A.E. (2011). Blind Spots: Why We Fail to Do What’s Right and What to Do About It. Princeton University Press.

Balcetis, E., & Dunning, D. (2008). Cross-cultural studies of the “holier than thou” phenomenon. Journal of Personality and Social Psychology, 95, 1252-1267.

Balcetis, E., & Dunning, D. (2013). Considering the situation: Why people are better social psychologists than self-psychologists. Self and Identity, 12, 1-15.

Epley, N., & Dunning, D. (2000). Feeling “holier than thou”: Are self-serving assessments produced by errors in self- or social prediction? Journal of Personality and Social Psychology, 79(6), 861-875.

Epley, N., & Dunning, D. (2006). The mixed blessing of self-knowledge in behavioral prediction: Enhanced discrimination but exacerbated bias. Personality and Social Psychology Bulletin, 32(5), 641-655.

Cross, P. (1977). Not can, but will college teaching be improved? New Directions for Higher Education, 17, 1-15.

Dunning, D. (2014). We are all confident idiots: The trouble with ignorance is that it feels so much like expertise. Pacific Standard, Health & Behaviour. Retrieved from http://www.psmag.com/health-and-behaviour/confident-idiots-92793

Dunning, D. (2005). Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself (Essays in Social Psychology). New York: Psychology Press.

Dweck, C. (2007). Mindset: The New Psychology of Success. New York: Ballantine Books.

Evans, J.B.T., & Over, D.E. (1996). Rationality and Reasoning. Psychology Press.

Kahneman, D. (2011). Thinking, Fast and Slow. Penguin Press.

Lehrer, J. (2012). Why smart people are stupid. The New Yorker. Retrieved from http://www.newyorker.com/tech/frontal-cortex/why-smart-people-are-stupid

Loftus, E.F., & Wagenaar, W.A. (1988). Lawyers’ predictions of success. Jurimetrics Journal, 29, 437-453.

MacDonald, T.K., & Ross, M. (1999). Assessing the accuracy of predictions about dating relationships: How and why do lovers’ predictions differ from those made by observers? Personality and Social Psychology Bulletin, 25, 1417-1429.

Marottoli, R.A., & Richardson, E.D. (1998). Confidence in, and self rating of, driving ability among older drivers. Accident Analysis and Prevention, 30, 331-336.

Middleton, W., Harris, P., & Surman, M. (1996). Give ‘em enough rope: Perception of health and safety risks in bungee jumpers. Journal of Social and Clinical Psychology, 15, 69-79.

Risucci, D.A., Tortolani, A.J., & Ward, R.J. (1989). Ratings of surgical residents by self, supervisors, and peers. Surgery, Gynecology and Obstetrics, 169, 519-526.

Rumsfeld, D. (n.d.). Wikiquote. Retrieved from https://en.wikiquote.org/wiki/Donald_Rumsfeld

Sharot, T. (2012). The optimism bias [TED Talk]. Retrieved from https://www.ted.com/speaker/Tali_sharot

Stone, D., & Heen, S. (2014). Thanks for the Feedback: The Science and Art of Receiving Feedback Well. Penguin Press.

Verghese, A. (2009). Cutting for Stone. Retrieved from www.goodreads.com/author/quotes/93353

Yarkoni, T. (2010). What the Dunning-Kruger effect is and isn’t. Retrieved from https://www.talyarkoni.org/blog/2010/07/07/what-the-Dunning-Kruger-effect-is-and-isnt