Have you ever played a game of Telephone with a third grader? It’s a trip.
Without fail, some phrase like “chocolate is the best” turns into a barely coherent string of words (read: “chalk ate sweat”).
Supply chain professors like to play a similar game with their students. It’s called the beer game.
It’s basically a game of Telephone played in the context of beer distribution. Students are assigned roles—customers, retailers, wholesalers, and suppliers—to create a supply chain. At each level, there is one decision to make: how much beer do we order?
Customers order in steady patterns. Retailers order based on customer orders; wholesalers order based on retailer orders; and suppliers order based on wholesaler orders.
If a customer needs one case of beer, then retailers need one case of beer…and so on.
But, just as words are mangled in Telephone, students distort order information in the beer game. The entire supply chain falls apart. Suppliers have too much beer, retailers don’t have enough beer, and our friend the customer is left without a brew.
What’s being taught here is the bullwhip effect. The bullwhip effect is the idea that mistakes made at one end of the supply chain radiate and amplify throughout the rest of it. Poor information that passes from retailers to wholesalers gets worse as it continues from wholesalers to suppliers.
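The amplification is easy to see in a toy simulation. The sketch below is a loose, hypothetical model, not the classroom game itself (the smoothing rate and the 50% safety margin are assumptions chosen for illustration): each tier sees only the orders placed by the tier below it, forecasts them, and pads its own order with a safety margin. The padding compounds, so orders balloon tier by tier even though customer demand barely moves.

```python
import random

def beer_game(weeks=52, tiers=4, smoothing=0.3, safety=0.5, seed=1):
    """Toy bullwhip sketch (illustrative assumptions, not the real game).
    Each tier forecasts the orders it sees via exponential smoothing and
    orders that forecast plus a safety margin. Margins compound upstream."""
    random.seed(seed)
    forecasts = [10.0] * tiers            # each tier's running demand estimate
    history = [[] for _ in range(tiers)]  # orders each tier places upstream
    for _ in range(weeks):
        order = 10 + random.uniform(-1, 1)  # near-steady customer demand
        for t in range(tiers):
            forecasts[t] += smoothing * (order - forecasts[t])
            order = forecasts[t] * (1 + safety)  # passed to the next tier up
            history[t].append(order)
    return history

def spread(xs):
    return max(xs) - min(xs)

hist = beer_game()
```

Run it and the average order grows at every tier, and the swings in the supplier’s orders dwarf the small wobble in actual customer demand, despite every tier following a perfectly reasonable rule.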
My guess is that you don’t work in supply chain management, and that you have little care for the finer details of beer distribution. But you probably do work within a system.
A supply chain is just a system, after all—a system that supplies information from one end to another. All of us operate in systems. The hierarchical ladder of an organization is a system. The line of communication in Telephone is a system. The schedule that guides our day is a system.
Most systems are made up of individuals—rational individuals. And it’s easy to assume that mistakes amplify up a supply chain because of irrational human behavior. But that’s not the case. The bullwhip effect occurs because of rational behavior within a supply chain’s infrastructure.
Each player in the beer game makes rational decisions. They observe demand for beer. They guess how much beer they need. Sometimes they order too much beer; sometimes they don’t order enough.
Everyone does their best, as far as the system allows, but the information they share distorts and twists, and the system performs poorly. The steady pattern of customer demand starts to look like a monkey throwing darts at random numbers.
It’s no one person’s fault. Everyone acts rationally; the system doesn’t hold.
Robert McNamara was the Secretary of Defense during the Cuban Missile Crisis and the Vietnam War. An important lesson he learned: “rationality will not save us.”
He saw—firsthand—how close rational individuals came to destroying the world with nuclear weapons. Khrushchev was rational. Kennedy was rational. But their collective rational decision making brought us within a hair’s breadth of the world’s end.
We rarely, if ever, make decisions as consequential as this, but the point is not lost: we cannot rely upon rationality to produce desired outcomes.
Rationality does not prevent the third grader from mangling “chocolate is the best” into “chalk ate sweat.” It does not save the wholesaler from ordering too much beer. It does not ensure that the information we give our boss will make it to other employees as we intended.
Systems are sort of like pinball machines, guiding our behavior and providing structures that impact how we move through space.
We could give that pinball complete rationality—the most rational little pinball in the world—and the pinball won’t do anything beyond the structure of that pinball machine. If we want that ball to escape the pinball machine, we have to change its infrastructure. We have to change the system.
When playing the beer game, players aren’t allowed to talk with each other. The only communication is the orders themselves. This is how many supply chain systems work—raw information doesn’t move through the supply chain. Suppliers don’t see how much beer retailers order or customers drink. They only see the orders placed to them.
So how do we get players to make better decisions? How do we prevent information from distorting?
Let ‘em talk. Let the players communicate with each other. Let the suppliers see all the demand information. Change the system.
It’s that simple. When we change the way information travels through a system, we change the outcomes it produces. When suppliers can see the raw information, they can make decisions based on signal rather than noise—they can make better decisions.
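A minimal sketch of that change, under the same toy assumptions as any back-of-the-envelope model (the numbers here are illustrative, not from the game): when every tier reads the same raw customer demand, no one has to guess from the distorted orders below them, and nothing compounds.

```python
import random

def shared_demand_game(weeks=52, tiers=4, seed=1):
    """Hypothetical 'changed system': every tier sees raw customer demand
    instead of the orders from the tier below, so nothing compounds."""
    random.seed(seed)
    history = [[] for _ in range(tiers)]
    for _ in range(weeks):
        demand = 10 + random.uniform(-1, 1)  # the one true signal
        for t in range(tiers):
            history[t].append(demand)  # every tier orders to real demand
    return history

hist = shared_demand_game()
```

Every tier’s orders now track the customer’s near-steady demand; the supplier’s orders wobble no more than the retailer’s.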
If we want to see certain outcomes, we have to think about systems.
We have to understand how systems guide human behavior.
We have to understand how the systems we operate in work.
If we want real change, we need to look at the systems. It’s futile to change individuals. They’re merely acting in a system.