A Simple Example


How Likely is a Cascade?

As the previous slide suggested, the likelihood of a cascade at any time depends on the parameter p - the precision of each agent's private signal. If p is equal to 0.6 for a particular agent, then in 60% of the cases that agent will receive the correct signal. The model assumes that p always lies between 0.5 and 1.0; that is, every agent has at least some ability to estimate the truth, but he cannot do it perfectly.
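The role of p can be made concrete with a small sketch. The helper name draw_signal is our own; the model only requires that each private signal match the true state with probability p:

```python
import random

def draw_signal(true_state, p):
    """Return a private signal equal to true_state with probability p.

    Hypothetical helper: the model assumes 0.5 < p < 1.0, so signals are
    informative but never perfectly reliable.
    """
    if random.random() < p:
        return true_state          # correct signal, probability p
    return 1 - true_state          # incorrect signal, probability 1 - p

# With p = 0.6, roughly 60% of draws should match the true state.
random.seed(0)
draws = [draw_signal(1, 0.6) for _ in range(100_000)]
frac_correct = sum(draws) / len(draws)
```

Over many draws, frac_correct settles near 0.6, matching the interpretation of p given above.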

Since the model is probabilistic, the only way to characterize its behavior is to run it many times. Here, we describe an experiment using this methodology. For several values of p between 0.5 and 1.0, we ran the society 10,000 times and recorded the number of runs in which Agent 7 was part of an UP (i.e. correct) cascade, the number in which he was part of a DOWN cascade, and the number in which he was part of no cascade. Dividing each count by 10,000 gives the experimental probability of each event. Note that for any given p, the three probabilities must sum to 1.
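The experiment can be sketched in code. This is a minimal version of the standard sequential-decision model, with two assumptions made explicit: the true state is UP, and an indifferent agent breaks the tie with a fair coin (the document does not state the tie-breaking rule, so this detail is ours). Agent 7's cascade status is determined by the actions of his six predecessors:

```python
import random

def run_society(p, n_predecessors=6, rng=random):
    """Simulate one society and return the cascade status ('UP', 'DOWN',
    or 'NONE') in effect when Agent 7 makes his decision.

    Sketch of the standard model: the true state is UP, each private
    signal is correct with probability p, and each agent takes the action
    favored by his signal plus the net public evidence, flipping a fair
    coin when the two exactly cancel (an assumed tie-breaking rule).
    """
    cascade = None   # 'UP' or 'DOWN' once started; agents then ignore signals
    diff = 0         # net informative actions so far: UP minus DOWN
    for _ in range(n_predecessors):
        if cascade:
            continue                         # herding: signal is ignored
        vote = 1 if rng.random() < p else -1  # private signal (+1 = UP)
        total = diff + vote
        if total > 0:
            action = 1
        elif total < 0:
            action = -1
        else:
            action = 1 if rng.random() < 0.5 else -1  # indifferent: coin flip
        diff += action
        if abs(diff) >= 2:   # public evidence now outweighs any one signal
            cascade = 'UP' if diff > 0 else 'DOWN'
    return cascade or 'NONE'

# Monte Carlo estimate for one value of p, mirroring the experiment above.
random.seed(1)
N = 10_000
counts = {'UP': 0, 'DOWN': 0, 'NONE': 0}
for _ in range(N):
    counts[run_society(0.6)] += 1
probs = {k: v / N for k, v in counts.items()}
```

For p = 0.6 the estimates come out near 0.62 (UP), 0.36 (DOWN), and 0.01 (no cascade), and the three always sum to 1 by construction.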

"A Theory of Fads as Informational Cascades" derives these probabilities analytically. As a validation of our model, we've placed both our experimental probabilities and the theoretical ones on the graph below. Observe that the differences between the two lie within statistical error.
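The theoretical probabilities admit a closed form. Under the standard assumptions (coin-flip tie-breaking), each pair of pre-cascade agents starts an UP cascade with probability p(1+p)/2, a DOWN cascade with probability (1-p)(2-p)/2, and leaves the public record balanced with probability p(1-p); Agent 7's status depends on three such pairs. The function below is our own derivation sketch under those assumptions, not a verbatim transcription from the paper:

```python
def cascade_probs(p, n=6):
    """Closed-form cascade probabilities after n agents (n even).

    Derivation sketch: r = p(1-p) is the per-pair probability that no
    cascade starts, so 'no cascade' survives n/2 pairs with probability
    r**(n/2); the UP/DOWN masses split the remainder in proportion to
    the per-pair starting probabilities.
    """
    r = p * (1 - p)                        # per-pair "no cascade" probability
    none = r ** (n // 2)
    scale = (1 - none) / (1 - r)           # geometric sum over the pairs
    up = (p * (1 + p) / 2) * scale
    down = ((1 - p) * (2 - p) / 2) * scale
    return up, down, none

up, down, none = cascade_probs(0.6)        # theory for p = 0.6, Agent 7
```

For p = 0.6 this gives roughly 0.623 (UP), 0.363 (DOWN), and 0.014 (no cascade), consistent with the simulated frequencies and with the claim that the two curves agree within statistical error.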

Note that as the precision of the agents falls toward 0.5, the probability of an UP cascade by the time Agent 7 makes his decision approaches the probability of a DOWN cascade. In this limit, the agents are equally likely to receive correct and incorrect signals, so a DOWN cascade becomes exactly as likely as an UP cascade.

On the other hand, as the precision approaches 1.0, the probability of an UP cascade goes to 1.0 and that of a DOWN cascade to 0.0, since the agents' signals are then almost always correct.
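Both limits can be checked numerically. The closed-form expression below is our own derivation under the standard coin-flip tie-breaking assumption, evaluated for Agent 7 (three pairs of predecessors); p = 0.5 itself is excluded by the model, so we evaluate it only as the limiting value:

```python
def cascade_probs(p, n=6):
    """Closed-form cascade probabilities after n agents (n even); see the
    derivation sketch: r = p(1-p) per-pair 'no cascade' probability."""
    r = p * (1 - p)
    none = r ** (n // 2)
    scale = (1 - none) / (1 - r)
    return p * (1 + p) / 2 * scale, (1 - p) * (2 - p) / 2 * scale, none

# Limit p -> 0.5: correct and incorrect signals equally likely,
# so UP and DOWN cascades become symmetric.
up_lo, down_lo, _ = cascade_probs(0.5)

# Limit p -> 1.0: signals almost always correct, so an UP cascade
# is almost certain and a DOWN cascade almost impossible.
up_hi, down_hi, none_hi = cascade_probs(0.999)
```

At the lower limit the UP and DOWN probabilities coincide (each about 0.49, with about 0.02 mass on no cascade), while near p = 1.0 the UP probability exceeds 0.99 and the other two vanish, matching the two observations above.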
