What effect of organizational power on decision making is often reflected in the tendency of staff?

Most leaders view employee freedoms and operational controls as antagonists in a tug-of-war. They tend to focus on regulating workers’ behavior, often putting a damper on commitment, innovation, and performance without realizing it. But freedom and control aren’t zero-sum, argues the author. By giving people a clear sense of their organization’s purpose, priorities, and principles—that is, by providing freedom within a galvanizing framework—leaders can equip employees to make on-the-ground decisions that are in the company’s best interests.

Gulati uses businesses as diverse as Netflix, Alaska Airlines, and Warby Parker to show how freedom can function in different settings. A coherent framework helps employees develop a deeper understanding of the business, which can lead to improved engagement, creativity, efficiency, and customer service.

The Problem

Most leaders view employee freedom and operational control as antagonists in a tug-of-war that can have only one winner. So they tend to pour their resources into regulating workers’ behavior—often unknowingly putting a damper on commitment, innovation, and performance.

The Solution

By giving people a clear sense of the organization’s purpose, priorities, and principles—that is, a galvanizing framework—leaders can equip them to make autonomous decisions that are in the company’s best interests. Employees should be involved in identifying and articulating those guidelines.

The Benefits

A coherent framework helps employees develop a deeper understanding of the business, which can boost performance on many levels, including engagement, quality, creativity, and customer service.

Leaders know they need to give people room to be their best, to pursue unconventional ideas, and to make smart decisions in the moment. It’s been said so often that it’s a cliché. But here’s the problem: Executives have trouble resolving the tension between employee empowerment and operational discipline. This challenge is so difficult that it ties companies up in knots. Indeed, it has led to decades’ worth of management experiments, from matrix structures to self-managed teams. None of them has offered a clear answer.

A version of this article appeared in the May–June 2018 issue (pp.68–79) of Harvard Business Review.

There are few business activities more prone to a credibility gap than the way in which executives approach organizational life. A sense of disbelief occurs when managers purport to make decisions in rationalistic terms while most observers and participants know that personalities and politics play a significant if not an overriding role. Where does the error lie? In the theory which insists that decisions should be rationalistic and nonpersonal? Or in the practice which treats business organizations as political structures?

A version of this article appeared in the May 1970 issue of Harvard Business Review.

Learning Outcomes

  • Compare various biases and errors in decision making

There are two types of decisions: programmed and nonprogrammed. A programmed decision is routine and, within an organization, is usually governed by rules and policies that lead decision makers to the same choice whenever the situation presents itself. A nonprogrammed decision is more unusual and made less frequently. These are the decisions most likely to rely on mental shortcuts, or heuristics, and therefore to be distorted by the biases discussed below.

As we work through the rational decision-making model (or, as we discussed, the more realistic bounded-rationality model), our attempts to shortcut the collection of all the data and the review of all the alternatives can lead us astray. Common distortions in how we gather and weigh data and alternatives are called biases.

You only need to scroll through social media and look at people arguing politics, climate change, and other hot topics to see biases in action. They’re everywhere. Here are some of the more common ones you’re likely to see:

Overconfidence Bias

The overconfidence bias is a pretty simple one to understand—people are overly optimistic about how right they are. Studies have shown that when people state they’re 65–70% sure they’re right, those people are only right 50% of the time. Similarly, when they state they’re 100% sure, they’re usually right about 70–85% of the time.

Overconfidence about one’s “correctness” can lead to poor decision making. Interestingly, studies have also shown that the individuals whose intellectual and interpersonal abilities are weakest are the most likely to exhibit overconfidence, so managers should watch for this bias when they’re making decisions or solving problems outside their areas of expertise.
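One way to see this concretely is to compare people’s stated confidence with how often they’re actually right. Here is a minimal Python sketch using made-up judgments (the numbers are illustrative, not drawn from the studies above):

```python
# Calibration check with made-up data: stated confidence vs. actual accuracy.
# These judgments are illustrative only; they are not from the studies cited above.
judgments = [
    # (stated confidence, was the answer actually correct?)
    (0.70, True), (0.70, False), (0.65, True), (0.70, False),
    (1.00, True), (1.00, True), (1.00, False), (0.65, False),
]

def accuracy_between(judgments, lo, hi):
    """Average accuracy for judgments whose stated confidence falls in [lo, hi]."""
    bucket = [correct for confidence, correct in judgments if lo <= confidence <= hi]
    return sum(bucket) / len(bucket) if bucket else None

print("Stated 65-70% sure -> actually right:", accuracy_between(judgments, 0.65, 0.70))
print("Stated 100% sure   -> actually right:", accuracy_between(judgments, 1.00, 1.00))
# Overconfidence shows up when the accuracy printed is well below the stated confidence.
```

Whenever the accuracy falls well below the stated confidence, the judge is overconfident; a well-calibrated decision maker’s numbers line up.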

Anchoring Bias

The anchoring bias is the tendency to fixate on the initial information we receive as the starting point for a decision, and to fail to adjust adequately for information collected later. For example, a manager may be interviewing a candidate who asks for a $100,000 starting salary. As soon as that number is stated, the manager’s ability to ignore it is compromised, and subsequent information suggesting that the average salary for that type of job is $80,000 won’t carry as much weight.

Similarly, if a manager asks you for an expected starting salary, your answer will likely anchor the manager’s impending offer. Anchors are a common issue in negotiations and interviews.
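A toy way to picture anchoring is to treat the final offer as a blend of the first number heard and the market evidence. The sketch below is purely illustrative and uses the salary figures from the example above; the adjustment weight is an assumption, not an empirical estimate:

```python
# Toy anchoring-and-adjustment model (an illustrative assumption, not a validated model):
# the final offer is a weighted blend of the candidate's opening ask and the market data.

def anchored_offer(anchor, market_rate, adjustment=0.5):
    """adjustment = 1.0 means the anchor is fully discounted; 0.0 means it dominates."""
    return anchor + adjustment * (market_rate - anchor)

ask, market = 100_000, 80_000
print(anchored_offer(ask, market, adjustment=1.0))  # 80000.0 -> anchor ignored entirely
print(anchored_offer(ask, market, adjustment=0.5))  # 90000.0 -> partial, insufficient adjustment
print(anchored_offer(ask, market, adjustment=0.2))  # 96000.0 -> the anchor dominates
```

The closer the adjustment weight is to zero, the more the opening ask drags the final figure away from the market rate.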

Confirmation Bias

The rational decision making process assumes that we gather information and data objectively, but confirmation bias represents the gathering of information that supports one’s initial conclusions.

We seek out information that reaffirms our past choices and tend to put little weight on anything that challenges our views. For example, two people on social media may be arguing about the existence of climate change. Under confirmation bias, each of them looks for scientific papers and evidence that support their own position rather than making a full examination of the situation.

Hindsight Bias

Hindsight bias is the tendency we have to believe that we’d have accurately predicted a particular event after the outcome of that event is known. On the Saturday before a Super Bowl, far fewer people are sure of the outcome of the event, but on the Monday following, many more are willing to claim they were positive the winning team was indeed going to emerge the winner.

Because we construct a situation where we fool ourselves into thinking we knew more about an event before it happened, hindsight bias restricts our ability to learn from the past and makes us overconfident about future predictions.

Representative Bias

Representative bias occurs when a decision maker wrongly compares two situations because of a perceived similarity or, conversely, evaluates an event without comparing it to similar situations. Either way, the problem is not put in its proper context.

In the workplace, employees might assume a bias against white males when they see that several women and minorities have been hired recently. They may see the last five or six hires as representative of the company’s policy, without looking at the last five to ten years of hires.

On the other side of the coin, two high school seniors might have very similar school records, and it might be assumed that because one of those students got into the college of her choice, the other is likely to follow. That’s not necessarily the case, but representative bias leads a decision maker to think because situations are similar, outcomes are likely to be similar as well.
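Returning to the hiring example, a quick simulation (all numbers are made up for illustration) shows how a handful of recent hires can look unrepresentative even when the long-run hiring rate is perfectly balanced:

```python
# Small-sample illusion behind representative bias (all numbers are made up for illustration).
import random

random.seed(1)
long_run_rate = 0.5                  # assume a balanced hiring rate over roughly ten years
hires = [random.random() < long_run_rate for _ in range(500)]  # full hiring history

recent = hires[-6:]                  # the handful of hires people actually notice
print("Recent hires:", sum(recent), "of", len(recent))
print("Full history:", sum(hires), "of", len(hires))
# A short recent run can look lopsided even though the long-run rate is 50/50.
```

Judging the company’s policy from the last half-dozen hires means judging the whole distribution by a sample far too small to represent it.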

Availability Bias

Availability bias suggests that decision makers use the information that is most readily available to them when making a decision.

We hear about terrorism constantly on the news and in fictional media. The coverage blows the threat out of proportion, making it seem larger than it is, so people invest time and effort in combating it. Cancer, however, kills roughly 2,000 times more people, yet it doesn’t receive comparable coverage or investment because it isn’t as “available” in our minds. Hence the availability bias.

Commitment Errors

A commitment error, also called an escalation of commitment, is an increased commitment to a previous decision in spite of negative information. For example, a business owner may put money down on a storefront to rent DVDs and Blu-rays, start purchasing stock for the shelves, and hire a few people to run the register. The owner may then review data indicating that people rarely go out to rent videos anymore, but, being already committed to the location, the stock, and the staff, will continue down that path and open the movie-rental store anyway.

Managers sometimes let a bad decision run too long, hoping the situation will correct itself and prove their initial decision right. These are often costly mistakes.

Randomness Errors

If you are certain your lucky tie will help you earn a client’s business at a meeting later today, you’re committing a randomness error. A tie does not bring you luck, even if you once wore it on a day when you closed a big deal.

Decisions become impaired when we try to create meaning out of random events. Consider stock prices. Financial advisors often feel they can predict the movement of stock prices from past performance, but on any given day those movements are essentially random. In reality, such advisors have been shown to predict the direction of stock prices only about 49 percent of the time, or about as well as if they’d just guessed.
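A quick simulation makes the point. If we assume, purely for illustration, that daily moves are a coin-flip random walk, then a simple “yesterday predicts today” rule calls the direction about half the time:

```python
# Simulation: if daily moves are random, a "yesterday predicts today" rule is right ~50% of the time.
# The coin-flip random-walk assumption is a simplification used only for illustration.
import random

random.seed(42)
moves = [random.choice([+1, -1]) for _ in range(100_000)]  # +1 = up day, -1 = down day

hits = sum(1 for yesterday, today in zip(moves, moves[1:]) if yesterday == today)
print(f"Momentum rule hit rate: {hits / (len(moves) - 1):.3f}")  # lands very close to 0.5
```

However the rule is dressed up, a pattern-based forecast can’t beat guessing when the underlying process has no pattern.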

The lucky tie, by contrast, is simple superstition. Decision makers who are controlled by their superstitions can find it difficult or impossible to change routines or to process new information objectively.

Managers who can objectively collect data and arrive at alternatives without being affected by these biases are already head-and-shoulders above other decision makers who aren’t aware of these pitfalls. Finding unique solutions to unique problems requires a little something more, though. Creativity in decision making can take you to the next step. We’ll talk about that next.
