A totally rational person (enter Mr. Spock) would collect a lot of information, analyze many alternatives, and only then arrive at the best decision. What we typically do instead is ration our time by using shortcuts: rules of thumb, past experiences, recent events, and so on; we satisfice. Because we are not totally rational beings, these shortcuts introduce cognitive biases. These biases affect our perspectives, decisions, and actions. Everyone has them, and most are rooted in our individual life experiences. Let’s take a look at the biases most pertinent to project management (PM):
- Egocentrism – This occurs when we attribute more credit (or blame) to ourselves for a particular team outcome than a disinterested outside party would.
- Anchoring Bias – We allow an initial reference point to distort our perspective, even if that reference point is completely arbitrary. The anchor can be something we carry around from the past or a reference point provided by others.
- Overconfidence Bias – We humans are systematically overconfident in our ability to get things done; just look at the schedule in any project charter or project plan.
- Sunk Cost Effect – Many of us are familiar with sunk cost as a financial concept; however, it also refers to the tendency to escalate commitment to a troubled project. An organization’s inability to kill projects can be due in part to this effect.
- Availability Bias – The tendency to place too much emphasis on readily available information and evidence. An associated phenomenon is the recency effect, where too much weight is given to recent events, especially a hot streak of success.
- Confirmation Bias – We tend to gather and rely on information that confirms our existing views and avoid or downplay information that disconfirms our hypotheses.
- Illusory Correlation – We jump to a correlation between variables when none exists, or overemphasize weak correlations. We fail to look for true cause-and-effect relationships.
As you can imagine, any of the above biases can easily slip into our project efforts and derail them. Since they are rooted in human nature and difficult to avoid, what can we do to combat them? Here is how I handle it:
- As I approach a situation, I perform a mental check for these biases, trying to understand whether any are negatively influencing me. Next, I consider whether they are influencing other team members, and I keep a candid dialog with the team about them.
- I constantly review our efforts for vulnerability as the situation moves forward, insisting on rapid feedback.
- Sometimes I bring in unbiased outside experts, if I suspect a vulnerability is significant.