Who, Me? Biased?

For some time now I’ve been intrigued by the role of bias in our everyday existence. Certainly we see this every four years when it’s time to elect a President in the United States. Each side of the political spectrum extols the virtues of their candidate while truly questioning the sanity of anyone who could possibly consider the alternative. Beyond politics, bias impacts each of us in business as decisions are made (or not) and projects are approved (or killed) based–knowingly or not–on bias.

I particularly enjoyed Harvard Business Review’s related article entitled “The Big Idea: Before You Make That Big Decision” by Daniel Kahneman, Dan Lovallo, and Olivier Sibony. NOTE: Access the article by signing up for a free online subscription to HBR.

In this post I summarize some key biases and filters that impact our decision-making as project sponsors, project managers, and leaders around the organization.

Affect Heuristic. Ever fall in love with an idea? An approach on a project? A tool or a vendor? The affect heuristic comes into play when we like an idea so much that we minimize the risks and costs and exaggerate the benefits. Or, when we don’t like an idea, we exaggerate the risks and costs and minimize the benefits. When a team is presenting options and you detect emotion tied to their recommendation, take care to check that recommendation more thoroughly for bias.

Confirmation Bias. It is so easy for us to ignore data that contradicts our preconceived notions. It’s not that the data isn’t available. Rather, it’s right there in front of us–invisible to our decision-making process because of confirmation bias. Cathy Davidson, author of Now You See It, recommends a technique she calls collaboration by difference. In short, surround yourself with people who see things differently than you do. Encourage dissenting opinions. Confirmation bias is almost impossible to detect on your own. (Click here and here for my interviews with Cathy).

Saliency Bias. Memorable past events can significantly impact today’s decisions. Michael Roberto talks about this in his excellent book Know What You Don’t Know. (Click here and here for my two interviews with Michael). If your team or organization had a very successful project in the past, the current recommendation may lean heavily on analogies to that past success. Yet are the situations really the same? Similarly, a past failure could overly influence us to shy away from opportunities. To combat saliency bias, consider the following:

  • Ask for more analogies. Don’t cherry-pick analogies that sway the decision one way or the other; a broader set of comparisons can help flush out saliency bias.
  • Rigorously analyze how comparable the analogies truly are.

Advocacy. A team can advocate for a particular decision by filtering out information that would undercut its preferred conclusion. To help detect whether a team is advocating rather than analyzing, consider:

  • Are there reasonably valid alternatives being presented? When advocacy is occurring, the team may present implausible or obviously less desirable alternatives, all designed to make their recommendation more appealing.
  • Is there genuine admission of uncertainty? As Tom DeMarco told me in an interview, “It’s not what you don’t know that kills you. It’s what you know that just isn’t so!” (Click here and here for my interviews with Tom DeMarco). Or, as Daniel Boorstin said, “The greatest obstacle to discovery is not ignorance — it is the illusion of knowledge.” Is there uncertainty being admitted or does the team seem to exude a false sense of assurance that they have everything under control? Reality check: No plan is perfect. Make sure uncertainty is recognized and presented.
  • Don’t assume WYSIATI (what you see is all there is). What information seems to be missing? Harvard Business School Professor Max Bazerman recommends asking, “If you had to make this decision again in a year, what information would you want? Can you get more of it now?” A recommendation can seem compelling until we realize certain information is missing, whether filtered out purposely through advocacy or omitted unconsciously.

Anchoring Bias. We can easily weight one piece of information too heavily when making decisions. As with confirmation bias, the data is available; it’s just incorrectly weighted, anchoring the decision to a single number or fact. Consider asking:

  • What are the facts? What are opinions, assumptions, or estimates?
  • Where did the numbers come from?
  • What assumptions underlie the numbers?

Halo Effect. When we suffer from the halo effect, success and failure are overly attributed to the personalities of the leaders. In my interview with the legendary Henry Mintzberg, he called this the Leadership Attribution Error: when things go right, leaders get too much credit, and when results are bad, leaders get more than their share of the blame. The halo effect can also lead us to make a decision because a well-respected company made the same one. If they are supposedly an excellent company, and they made the same decision, and it worked out well, then surely it’s a good decision for us. Or not. Consider asking:

  • How similar are our conditions to theirs?
  • Was their success due to their decisions or perhaps luck in timing?
  • What specifics do we know about the decision made by the other organization?

Sunk-Cost Fallacy. Good accounting practice tells us that we should not consider past investments when making future choices. For example, if a project has already consumed $2 million but needs another $500,000 to deliver $300,000 of value, the remaining spend is the only number that matters; the $2 million is gone either way. Yet sunk costs can easily lead us to spend good money on bad projects simply because we don’t want to admit faulty assumptions or changed conditions.

Planning Fallacy. This is a classic problem in project management: hit this date because we say it’s possible. We suffer from the planning fallacy when overly optimistic forecasts are made without sufficient basis in history or a reasonably fleshed-out plan. Consider asking:

  • Is the recommendation overly optimistic?
  • What projects or situations in the past give us reason to think these assumptions are valid?
  • What risks have been considered in the formation of this plan?

Disaster Neglect. Is the worst-case scenario truly worst-case? Denial is a powerful force! To confront disaster neglect, consider:

  • Asking, “Where did the worst case come from? What could happen that we haven’t thought of yet?”
  • Conducting a pre-mortem. Psychologist Gary Klein devised this technique: project the team into the future and imagine the worst has already happened. Then have the team tell stories about how it happened.

Loss Aversion. There’s a difference between denial of risk and risk aversion. The former ignores that risks exist, while the latter recognizes the risks and reacts to them. Denial of risk is generally deadly to projects and organizations, but risk aversion, or loss aversion, can be deadly as well: a team or organization becomes overly cautious because the desire to avoid losses outweighs the desire for gains.

Do you recognize any of these filters or biases in your organization? How about in yourself? This last question is critical. In the article I mentioned at the beginning of this post, the authors make a compelling case for why it is nearly impossible for us to see our own biases. Keep these in mind as you lead your projects and teams this week.
