The book covers randomness, the mathematics associated with it, and the psychological biases that get in the way of thinking about life probabilistically.

Probability

  1. Probability is not about computing odds; it is about belief in the existence of an alternative outcome.

    • The right way to judge performance in any field is not by the results alone but by the costs of the alternative outcomes that could have occurred. A million dollars earned through dentistry is worth more than a million dollars won playing Russian roulette, because the alternative outcomes of Russian roulette are far worse.
    • Our habitat has changed faster than our ability to evolve to match it; concepts like probability do not come naturally to us.
  2. When a random process is observed over longer and longer horizons, its results converge toward the expected value; over short horizons, noise dominates. A 15% annual return with 10% volatility implies a 93% chance of a positive return in a year (about 7% bad years) but only a 67% chance of a positive month (about 33% bad months) and a 50.02% chance of a positive second (nearly every other second shows a negative return). A short calculation sketch follows this list.

    • Over a short horizon, one observes the variability of the portfolio, not the returns; over a long horizon, one observes the returns, not the variability.
    • The wise man listens to the meaning; the fool only gets the noise.
    • Law of large numbers - a large enough population of bad managers is virtually guaranteed to produce some with amazing track records. Moreover, how impressive those records look depends more on the size of the initial population than on the managers’ ability to produce results.
  3. In real life, many events have skewed payoffs: an event can be unlikely, yet its payoff (or damage) can be far larger. Maximizing the probability of being right therefore does not maximize the expected payoff (for illustration: a bet that wins $1 with 90% probability and loses $15 with 10% probability succeeds most of the time, yet its expected value is -$0.60).

    • An option seller collects a continuous stream of small premiums, while an option buyer keeps losing small amounts and gets paid in one shot only when a rare event occurs. Selling options gives psychological kicks, while buying them is economically optimal.
  4. Path-dependent outcomes - Computer keyboards use the QWERTY layout because typewriters did, and that layout was chosen to slow typists down enough to keep the machines from jamming. It is not the rational layout today, but path-dependent outcomes like it are a cornerstone of life.

    • The same effect can explain why Microsoft’s Windows was able to win by creating a network effect around its operating system.
    • The endowment effect is a manifestation of that.
    • The Polya process, in which every outcome raises the odds of the same outcome recurring, is a more accurate model of the real world than the “independent events” approach; economists fail to realize that. A small simulation sketch follows this list.
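
The 93% / 67% / 50.02% figures in item 2 can be reproduced with a short calculation. A minimal sketch, assuming i.i.d. normally distributed returns, so the expected return scales linearly with the horizon while volatility scales with its square root; the trading calendar (250 days of 8 hours) is an assumption chosen to roughly reproduce the per-second figure:

```python
# Sketch: probability of observing a positive return over different horizons,
# assuming i.i.d. normally distributed returns with the annual mean of 15%
# and annual volatility of 10% quoted in item 2.
from math import erf, sqrt

MU, SIGMA = 0.15, 0.10  # annual expected return and volatility


def p_positive(years: float) -> float:
    """P(return > 0) over a horizon of `years` under the normal model.

    The mean scales linearly with time, the volatility with its square root.
    """
    z = (MU * years) / (SIGMA * sqrt(years))
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z


# Assumed trading calendar: 250 days of 8 hours each.
TRADING_SECONDS_PER_YEAR = 250 * 8 * 3600

for label, horizon in [("year", 1.0),
                       ("month", 1 / 12),
                       ("day", 1 / 250),
                       ("second", 1 / TRADING_SECONDS_PER_YEAR)]:
    print(f"positive {label}: {p_positive(horizon):.2%}")
```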
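
The Polya process in item 4 is easy to simulate. A minimal sketch, assuming the classic two-color urn: start with one ball of each color and return every drawn ball together with an extra ball of the same color, so early random draws get reinforced:

```python
# Sketch: a Polya urn as a toy model of path dependence.
# Each drawn ball is put back along with one more ball of the same color,
# so a color drawn early on becomes more likely to be drawn again later.
import random


def polya_share(draws: int, seed: int) -> float:
    """Fraction of 'A' balls in the urn after `draws` reinforced draws."""
    rng = random.Random(seed)
    a, b = 1, 1  # initial urn: one 'A' ball, one 'B' ball
    for _ in range(draws):
        if rng.random() < a / (a + b):
            a += 1  # drew 'A': reinforce 'A'
        else:
            b += 1  # drew 'B': reinforce 'B'
    return a / (a + b)


# Identical rules, different early luck, very different long-run shares.
print([round(polya_share(10_000, seed), 2) for seed in range(5)])
```

Running the same process with different seeds locks into very different long-run “winners”, which is the path-dependence point behind QWERTY and Windows.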

Biases

  1. We have two systems of reasoning: one for fast decision-making and one for slow deliberation (more details in Thinking Fast and Slow).
    • One major implication is that ideas do not sink in when emotions come into play.
    • Consumers treat 75% fat-free hamburgers as different from 25% fat hamburgers, even though mathematically they are the same.
  2. Affect heuristic - the emotions associated with an outcome determine its perceived probability in our minds.
  3. Attribution bias - people ascribe their successes to skill and their failures to random, unfortunate events.
  4. Simulation heuristic - playing alternative scenarios in one’s head: “I was about to exit the market right before the 2008 crash, I just missed it.”
  5. Hindsight bias - once an incident happens, we believe we knew it was going to happen all along, even though, had we been asked beforehand, we would not have been nearly so certain. The worst aspect is that it fools us into believing we can predict the future.
    • History appears deterministic even though it is just the one path, out of many possible random paths, that happened to be realized.
    • Unlike in the hard sciences, one cannot run experiments in history.
  6. Survivorship bias - the highest-performing realizations are the most visible, since the losers do not speak up (or do not survive). A small simulation sketch follows this list.
  7. Availability heuristic - we assign probabilities based on how easily an instance of an incident can be recalled.
    • People do not like to buy insurance against something abstract; only vivid risks get their attention. It is therefore easier to sell travel insurance against terrorist attacks than travel insurance against loss of life in general.
  8. Representativeness heuristic / conjunction fallacy - we judge the probability that a person belongs to a group by how similar the person looks to that group.
    • A student described as a feminist is deemed more likely to be a “feminist bank teller” than simply a “bank teller”, even though the latter group is a superset of the former and so cannot be less probable.
  9. Wittgenstein’s ruler - if you use a ruler whose accuracy you are not confident of to measure a table, you are measuring the ruler as much as you are measuring the table. More formally, unless the source of a statement is qualified, the statement reveals more about its author than about the information it is intended to convey.
  10. Sandpile effect - “the straw that broke the camel’s back”: a linear change can have a non-linear impact on a complex system, causing a collapse.
  11. Firehouse effect - a group of people with a similar mindset, after spending too much time together, can converge on conclusions that look ludicrous to an outsider.
  12. Psychologically, the frequency of positive events matters more than their magnitude, and a negative event hurts more than a positive event of the same magnitude feels good.
  13. The inverse skills problem - the higher up the corporate ladder a person is, the less repeatable their work, and hence the weaker the evidence of their actual contribution. A person doing repeatable work is easy to judge; one doing non-repeatable work cannot be, since their results may be a pure manifestation of randomness.
  14. Humans (and even non-humans) see patterns in randomness and develop superstitions about how those patterns can benefit or harm them.
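
The survivorship-bias point (and the “population of bad managers” remark in the Probability section) can be made concrete with a small simulation. A minimal sketch, assuming every manager has zero skill and an independent 50% chance of beating the market in any given year:

```python
# Sketch: survivorship bias among managers with zero skill.
# Each hypothetical manager beats the market in a given year with probability
# 0.5, independently of the past; in a large enough starting population some
# will still post long unbroken winning streaks purely by chance.
import random


def lucky_survivors(managers: int, years: int, seed: int = 0) -> int:
    """Count zero-skill managers who beat the market `years` years in a row."""
    rng = random.Random(seed)
    return sum(
        all(rng.random() < 0.5 for _ in range(years))
        for _ in range(managers)
    )


# Out of 10,000 coin-flipping managers, roughly 10,000 / 2**10, i.e. about 10,
# are expected to show a "perfect" 10-year track record.
print(lucky_survivors(10_000, 10))
```

Interviewing only the survivors makes that pure luck look like skill.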

Rationality

  1. Rational thinking often has little to do with risk avoidance; much of it goes into rationalizing one’s actions by fitting some logic to them after the fact.
  2. Some parts of daily life, such as career decisions and investments, can harm us or even threaten our survival; it pays to be rational and scientific about those. It is fine to be quite irrational about other, more mundane matters, such as the choice of religion.
  3. It does not matter how frequently something succeeds if the failure is too costly to bear.
  4. Common sense is nothing but a collection of misconceptions acquired by the age of 18.
  5. Depending on the use case, extreme values are either noise or devastating signals. Average temperature (excluding extremes) is fine for choosing the next vacation destination, but for climate scientists it is the extremes that matter. Similarly, extremely rare events can bankrupt a company.
  6. According to Karl Popper, there are two kinds of theories: those that have been falsified and those that have not yet been falsified. Something that cannot be falsified is not a theory.
  7. More information does not always lead to more knowledge; sometimes it just produces a stronger belief in meaningless noise.
  8. The rule that “markets always go up over a 20-year horizon” holds only for the few markets that actually survived, and without hindsight it was not obvious which ones would survive. Germany, Imperial Russia, and Argentina, for example, blew up completely.
  9. Most humans satisfice (satisfy + suffice): they stop when a result is good enough rather than working toward the optimal outcome. Satisficers are happier; optimizers end up more successful on traditional metrics of success. The causality is not clear, though.
  10. At a given point in the market, the most successful traders are likely to be those that are the best fit for the latest cycle. This does not apply to dentists since that profession is more immune to randomness.

Wisdom

  1. People become leaders not because of the skills they possess but because of the superficial impression they make on others (“charisma”).
  2. When people merely work hard, they lose focus and intellectual energy; moreover, a strong work ethic draws people toward the noise rather than the signal.
  3. Extreme empiricism, an absence of a logical structure, and competitiveness can be quite an explosive combination.
  4. Someone’s raw performance and personal wealth can sometimes, but not always, be indicators of their skill.
  5. We learn from mistakes by making them, not by reading or hearing about them. Lessons from history cannot be acquired through reading alone either.
  6. Listening to or reading the news neither provides any predictive ability nor improves one’s knowledge of the world.
  7. Spontaneous remission can suddenly cure a cancer, and the patient may credit whatever pill they happened to be taking at the time with cancer-killing properties.
  8. Unpredictable behavior is a deterrent: sometimes a government has to overreact to small provocations so that others cannot figure out the precise limits of its tolerance.