Leadership, Information, and Poverty Traps: Rationalizing Social Equilibria

Wednesday, September 08, 2010

By J. Atsu Amegashie

In the context of economic development, one may see the role of leaders as providing the public information required to move away from “bad” equilibria to “good” equilibria.

One of the most striking and ubiquitous features of human societies is localized conformity. There are many socio-economic situations in which we are influenced in our decision making by what others are doing. We often decide which stores to patronize or schools to attend or what career to pursue on the basis of how popular they seem to be. Such observational “learning” also influences the decision to adopt new technologies. Voters are known to be influenced by opinion polls. Cultural practices, fashion, and fads are examples of localized conformity. And in economic development, such conformity can result in socially inferior outcomes (e.g., poverty traps).

Wikipedia gives the following two examples:

(a) Small protests began in Leipzig, Germany in 1989 with just a handful of activists challenging the German Democratic Republic. For almost a year, protesters met every Monday, growing by a few people each time. By the time the government attempted to address the movement in September 1989, it was too big to quash. In October, the number of protesters reached 100,000, and by the first Monday in November, over 400,000 people marched through the streets of Leipzig. Two days later, the Berlin Wall fell.

(b) The adoption rate of drought-resistant hybrid seed corn during the Great Depression and Dust Bowl was slow despite its significant improvement over the previously available seed corn. Researchers at Iowa State University were interested in understanding the public’s hesitation to adopt this significantly improved technology. After conducting 259 interviews with farmers, they found that the slow rate of adoption was due to the weight the farmers placed on the opinions of their friends and neighbors.

Some of the mechanisms for such uniform social behavior are: (i) sanctions on deviants, (ii) a preference for conformity, (iii) communication, and the less obvious one of (iv) positive complementarities (see below).

Another mechanism is driven by the individually rational but, in some cases, collectively inefficient use of private information by agents in decision-making situations. This has been variously referred to as history or path dependence, informational cascades, or herd behavior. The theoretical analysis of this phenomenon was undertaken by Abhijit Banerjee in a paper in the Quarterly Journal of Economics and by Sushil Bikhchandani, David Hirshleifer, and Ivo Welch in a paper in the Journal of Political Economy. Both papers were published in 1992.

To illustrate their theory, consider a situation in which a population of 100 farmers must adopt a technology. There are two options: A and B. Everyone gets a private and imperfect signal about which technology is better. If technology A is indeed better, each farmer gets a private signal 60% of the time that technology A is better. If technology B is indeed better, each farmer gets a private signal 60% of the time that technology B is better. So, like a medical test, the signal is helpful but not perfect: there is a 40% chance that it gives the wrong information and a 60% chance that it gives the correct information. Everyone’s signal is equally good.

The farmers make their decisions sequentially. A farmer observes the choices of those before him, and then decides on whether to choose technology A or B.

Suppose that it is common knowledge that the second farmer got a signal that B was the better technology. This assumption is not necessary. The analysis holds even if only the action (i.e., choice of technology) of the second farmer is commonly known. However, the assumption that the second farmer’s signal is commonly known simplifies the exposition significantly.

Assume that you are the third person to choose, and you saw the first farmer choose B. The second farmer also chose B because we have assumed that he got a B signal.
There are two cases:

Case (a): Suppose your private signal says that technology B is better. Then you should choose B because there have been three B signals out of three: your own B signal, the second farmer’s commonly known B signal, and the B signal you can infer the first farmer received because he chose technology B.

Case (b): Suppose your private signal says technology A is better. It is still optimal to choose B. Why? You know that the first farmer must have had a signal that B was better because he chose B. You know that the second farmer got a B signal. So you have two B signals and one A signal. One of the B signals effectively cancels out your A signal. You are then left with one other B signal, so you should choose technology B.
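The signal-counting logic in the two cases can be checked with a short Bayesian calculation. Below is a minimal sketch in Python, assuming a uniform prior over which technology is better (the function name and the prior are illustrative assumptions, not part of the original example):

```python
# Posterior probability that B is the better technology, given counts of
# B signals and A signals, a uniform prior, and 60%-accurate signals.
def posterior_b(b_signals, a_signals, accuracy=0.6):
    # Likelihood of the observed signal sequence under each hypothesis
    like_if_b_better = accuracy ** b_signals * (1 - accuracy) ** a_signals
    like_if_a_better = accuracy ** a_signals * (1 - accuracy) ** b_signals
    return like_if_b_better / (like_if_b_better + like_if_a_better)

# Case (a): three B signals out of three -> clearly choose B
print(round(posterior_b(3, 0), 3))  # 0.771
# Case (b): two B signals, one A signal -> the A signal cancels one
# B signal, leaving one net B signal, so B is still the better bet
print(round(posterior_b(2, 1), 3))  # 0.6
```

With a uniform prior and equally accurate signals, only the net count of B signals over A signals matters, which is exactly the cancellation argument in Case (b).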

Now, everyone after you will know that what you did had nothing to do with your private information, because you would choose B regardless of your private signal. Therefore, your choice provides no information to the next person. Hence the fourth farmer should ignore your choice and, like you, rely only on the fact that the first two farmers got B signals. Then, like you, he should choose B regardless of what his private signal says. The choices of the third and fourth farmers therefore provide no information to the fifth farmer. Following the same logic, the fifth farmer will also choose B, and the process continues.

Therefore, you may get a million farmers choosing technology B just because the first two farmers chose technology B even if each of these million farmers got the signal that technology A was better. This leads to an informational cascade or path/history-dependent behavior. The choice of the first two farmers is the initial condition or history of this society.
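The whole sequential process can be sketched as a small simulation. The function name, the ±2 threshold at which a cascade locks in, and the convention that farmers follow their own signal before a cascade forms are illustrative simplifications, not taken from the original papers:

```python
import random

# Each farmer sees all earlier choices plus a private signal that is
# correct with probability `accuracy`. Before a cascade, a choice reveals
# the chooser's signal; once inferred B signals outnumber inferred A
# signals by 2 (or vice versa), a cascade starts and private signals are
# ignored from then on.
def simulate(n_farmers=100, accuracy=0.6, better="A", seed=None):
    rng = random.Random(seed)
    net_b = 0  # inferred B signals minus inferred A signals so far
    other = "B" if better == "A" else "A"
    choices = []
    for _ in range(n_farmers):
        if net_b >= 2:
            choices.append("B")   # B cascade: everyone copies B
        elif net_b <= -2:
            choices.append("A")   # A cascade: everyone copies A
        else:
            # No cascade yet: follow the private signal (ties broken in
            # favor of one's own signal, a common simplification)
            signal = better if rng.random() < accuracy else other
            choices.append(signal)
            net_b += 1 if signal == "B" else -1
    return choices

# A is in fact better here, yet some runs lock into a B cascade because
# the first two revealed signals happened to be B.
print(simulate(10, better="A", seed=0))
```

Running this many times with the better technology fixed at A shows runs in which an early pair of wrong B signals locks the entire population into technology B, exactly as in the walkthrough above.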

What does this mean for society? To quote Ivo Welch, “Cascades predict that you can get massive social imitation, occasionally leading everyone (the "herd") to the incorrect choice. In an informational cascade, everyone is individually acting rationally. Even if all participants as a collective have overwhelming information in favor of the correct action, each and every participant may take the wrong action. The probability that everyone is taking the wrong action is less than 50%, but it is easy to construct examples in which everyone is wrong with 30-40% probability.”

In the above example, if technology A is indeed the better technology, there is a 16% chance that this bad social equilibrium will occur. This is the probability that the first two farmers both got B signals even though technology A was the better technology: 0.4 × 0.4 = 0.16.

As Ivo Welch notes, “Because everyone knows that there is very little information in a cascade, cascades are "fragile"; a little bit of new public information can make a big difference. A little bit of public information (or an unusual signal) can overturn a long-standing informational cascade. That is, even though a million people may have chosen one action, seemingly little information can induce the next million people to choose the opposite action. Fragility is an integral component of the informational cascades theory!”

The preceding point means that the theory of informational cascades can explain why mass behavior can be fragile in the sense that small shocks can lead to large shifts in behavior. As Sushil Bikhchandani, David Hirshleifer, and Ivo Welch observed in their seminal paper “… cohabitation of unmarried couples was viewed as scandalous in the 1950s, was flaunted in the 1960s, and was hardly noticed in the 1980s. Colleges in which students demonstrated and protested in the 1960s became quiet in the 1980s. The recent rejection of communism began in Poland and later spread rapidly among other Eastern European countries.”

It is important to note that the theory of informational cascades is different from the theory of payoff complementarity, which is also used to explain social conformity. The latter theory is based on the assumption that some activities are more worthwhile when others are doing the same thing. For example, a person may decide to join Facebook if many of his/her friends are on Facebook because it is more worthwhile to interact with them there. His/her presence on Facebook is complemented by the presence of his/her friends. The theory of informational cascades is not driven by such positive complementarities. To see this, note that in the example given above, when a farmer adopts a particular technology, there is no assumption that the benefits of the technology are higher when many other farmers adopt the same technology. The benefits of a technology to a farmer do not depend on the number of other farmers who also use it. Furthermore, unlike the theory of informational cascades, the theory of payoff complementarity does not involve the use of private information. Indeed, in this theory, information is not used as a guide to the “correct choice”. The role of information is a cornerstone of the theory of informational cascades.

In the context of economic development, one may see the role of leaders as providing the public information required to move away from “bad” equilibria to “good” equilibria. Suppose that a critical mass of the farmers realize that the society is stuck in a bad cascade and are willing to take some risk by ignoring the actions of thousands of those who were ahead of them and instead follow their own signal (i.e., technology A is the better technology). Then their action could influence subsequent farmers to choose technology A as well. This critical mass of people may be seen as leaders. Therefore, while informational cascades or history-dependent outcomes are challenging, they are not excuses for inaction or failure. Rationalizing these equilibria is not the same as justifying them.

* J. Atsu Amegashie teaches economics at the University of Guelph, Canada, and is an associate of AfricanLiberty.org.