Some of the Brain’s Biases
Here are some elements on the brain’s biases in assessing risk, mainly drawn from texts by Bruce Schneier, Ian McCammon and Vic Napier, linked at the bottom of this page. Bruce Schneier is an internationally renowned security technologist and author, Ian McCammon is a psychologist specialising in avalanche accidents, and Vic Napier is a psychologist also involved in skydiving.
Risk can be assessed with mathematics or on a computer, which gives a reliable idea when fed with the right information; sometimes in the field we have only our brain, perhaps handier but somewhat less accurate. To make quick decisions in complex situations, we cannot analyse all factors; that would be impossible, or very tiring in the long term. So we use rules of thumb, or shortcuts, called heuristics (McCammon, Schneier), selecting only the most relevant information and reaching a conclusion quickly.
We use them intuitively, without even noticing it: choosing the right speed in a curve so as to remain safe, but without losing too much time… This intuition comes from long ago and is very similar to the survival strategies of all living species. As we are still present today, our heuristics seem efficient, and because they work so well in everyday life, we are easily misled, even when there are obvious signs of danger.
Species evolve slowly, following natural changes; but since the agricultural revolution around 10,000 BC, human societies have evolved much faster and our heuristics have become less accurate. The most common heuristic traps, or biases in assessing risk (when the perception and the reality of risk do not match), are summed up here (Schneier):
| People exaggerate risks that are | People downplay risks that are |
| --- | --- |
| New and unfamiliar | |
| Beyond control or externally imposed | Under control or taken willingly |
| Affecting them personally | |
| Directed against their children | Directed towards themselves |
Our innate capabilities to deal with risk can fail when confronted with modern society: its big cities, fast means of transport, technology and media. Some of those biases are explained below: where they come from and what strange results they can produce.
Let us first understand how the brain processes risk. Two different systems are involved (Schneier):
The amygdala, a very primitive part of the brain, processes basic emotions coming from sensory inputs, such as anger and fear. When an animal senses a potential danger, the amygdala causes adrenaline to be released into the bloodstream, increasing heart rate, the force of each beat, and muscle tension. The faster the animal can flee or fight, the more likely it is to live and reproduce. Perfect for running, not for thinking.
The neocortex, a more recent and advanced part that only appeared in mammals, analyzes risk. It is intelligent, it can make more nuanced choices, but works very badly when overwhelmed by adrenaline. It is also much slower and it is hard for the neocortex to contradict the amygdala.
All that results in the choices we make.
Here are some domains where heuristics are used, with the corresponding biases (McCammon, Schneier).
Several biases are related to probability. In general, we are uncomfortable with big numbers; the common saying “one, two, many” does not help much in statistics.
Availability refers to memory: easy-to-remember data are given more weight than hard-to-remember data. We are more concerned by common events than uncommon ones, but spectacular events can be easy to remember too, even if they are uncommon. So the events easiest to remember are not always the most probable ones. Several qualities fall into this category.
Our memory is marked by the vividness of an event: those described with lots of circumstantial detail are remembered better than neutral ones. This does not always make a difference right after the event or the description, but later on, the detailed event is easier to recall.
Salience refers to the angle of view: someone involved in a conversation seems more convincing to those who are facing her or him, whatever the arguments used.
Events that have actually occurred are also logically easier to imagine and obviously more predictable than events that have not, which explains the hindsight bias.
Generally the availability heuristic is a good shortcut, but today it is well understood and exploited, for example in advertising, marketing and storytelling, where it often becomes less reliable.
Representativeness: an example represents the class it belongs to. The more details given in the description of an example, the better it is remembered, even as its actual probability decreases (a more detailed scenario is more specific, hence less likely).
With the law of small numbers, people assume that long-term probabilities also hold in the short run, like in gambling, which is of course wrong.
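To see why short runs mislead, here is a minimal Python sketch (an illustration, not from the source texts): it flips a simulated fair coin and compares short runs with a long one. The function name and the fixed seed are arbitrary choices for reproducibility.

```python
import random

def heads_fraction(n_flips, seed=1):
    """Fraction of heads observed in n_flips simulated fair coin flips."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(n_flips)) / n_flips

# Short runs can swing widely around the long-run probability of 0.5;
# only large samples settle near it. The "law of small numbers" is the
# mistaken belief that small samples behave like large ones.
for n in (10, 100, 100_000):
    print(n, heads_fraction(n))
```

A gambler who expects ten spins of a roulette wheel to "even out" is applying the long-run frequency to a sample far too small for it to hold.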
We have all sorts of biases involving costs.
By mental accounting, people categorise costs. They automatically adapt to where something is bought (luxury shop or supermarket) and accept a price accordingly. People spend money more easily on cheap things, which explains why advertisers often describe large costs as “only a few dollars a day”. With risks, we accept higher ones in our leisure account (sport) than in everyday life.
The context effect: preferences among a set of options depend on the other options in the set. The rule of thumb there is: avoid extremes – except maybe when presenting a budget project…
The scarcity heuristic, or endowment effect, is the tendency to value opportunities in proportion to the chance of losing them: about twice as much when they are being lost as when they are being gained. It often works exactly contrary to personal safety. In competition, some may be tempted to choose a difficult path on the chance that it will give them an advantage over their competitors.
Time discounting is the human tendency to discount future costs and benefits. Basically, owning something is worth more today than in the future, taking into account the time of use and the interest rates.
The magnitude effect: smaller amounts are discounted more than larger ones, and gains are more discounted than losses. Even if it may look surprising, all these effects are shown by numerous studies and we do that automatically.
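As an illustration (a minimal sketch, not from the source texts), classical exponential discounting computes what a future amount is worth today at a constant rate; the biases above describe how human behaviour deviates from this baseline.

```python
def present_value(future_amount, annual_rate, years):
    """Classical exponential discounting: the value today of an
    amount received `years` from now, at a constant annual rate."""
    return future_amount / (1 + annual_rate) ** years

# 100 received in 5 years, at a 5% annual rate, is worth less today:
print(round(present_value(100, 0.05, 5), 2))  # → 78.35
```

The magnitude effect means people behave as if the discount rate were higher for small amounts than for large ones, something this constant-rate formula deliberately does not capture.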
Heuristics and biases also affect decisions.
Familiarity covers usual actions that do not require much thought; we base our decision on what we did last time in a similar situation, but when something has changed, that rule of thumb can become a trap. Highly skilled people in a familiar environment tend to take more risks, even when they have been trained in the hazards. This suggests that familiarity can negate the benefits of training!
With choice bracketing, people seem to overestimate the value of variety: when faced with repeated risk decisions, evaluating them as a group makes them feel less risky than evaluating them one at a time.
Consistency, or the confirmation bias: in sport as in life, some decisions are not taken lightly; however, once a decision has been made, people prefer to stick to it. Most of the time this is reliable, but it becomes a trap when consistency overrides critical new information about an impending hazard. In hindsight, it is often difficult to understand why somebody stayed with a course of action despite a worsening environment. We do not like to contradict ourselves.
Acceptance pushes us to behave in a way that we hope will get us accepted or liked by others. We are very vulnerable to it, even from an early age. Typically, aggressive or risk-taking behaviour is more prevalent among younger men when women are present. A person new to a group may also be susceptible to this heuristic when trying to earn acceptance from the others.
The expert halo refers to the leader of a group, often informal, who makes critical decisions based on local knowledge or experience, or simply on age, assertiveness or past successes. Decisions made by the “leader” are often followed by the others despite available information that this might not be the best course of action.
Social facilitation: when the environment might warrant an individual decision not to practise, a group discussion may lead to accepting greater risk. By following the others, people can expose themselves to more risk than they would outside a group dynamic. Social facilitation lulls its victims into feeling safe, even when dangers are obvious.
Prospect theory states that humans prefer a small sure gain over a larger uncertain one, and an uncertain loss over a smaller sure one. It was probably a better strategy to catch a small prey than a bigger but stronger one (a meal today rather than a wait until tomorrow); also, as animals lived always on the edge, any sure loss was a bad sign. The animals making these kinds of choices survived and became our ancestors.
The framing effect refers to the way the same option is presented: framed as a gain, people tend to take less risk; framed as a loss, they seek more risk.
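Both effects can be sketched with the value function of Tversky and Kahneman’s prospect theory (an illustration, not part of the source texts; α = 0.88 and λ = 2.25 are their commonly cited parameter estimates): subjective value is concave over gains and steeper over losses, so equal-sized losses loom larger than gains.

```python
def prospect_value(outcome, alpha=0.88, loss_aversion=2.25):
    """Prospect-theory value function: concave over gains,
    convex and steeper over losses (losses loom larger)."""
    if outcome >= 0:
        return outcome ** alpha
    return -loss_aversion * (-outcome) ** alpha

# A loss weighs roughly twice as much as an equal gain, in line
# with the two-to-one ratio mentioned for the endowment effect.
gain = prospect_value(100)
loss = prospect_value(-100)
print(round(abs(loss) / gain, 2))  # → 2.25
```

The framing effect follows from the curve’s shape: described as a gain, options sit on the concave side (so a sure thing looks good); described as a loss, they sit on the convex side (so a gamble looks better than a sure loss).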
The affect heuristic includes an affective aspect in choices and explains why we are more sensitive when humans, especially our children, are involved. Some animals spread thousands of eggs to reproduce and only some will survive, but it is enough. Mammals, instead, have only a few children and take care of them.
The last two biases are particularly interesting from an airsports point of view.
Risk homeostasis states that if some device or law reduces risk, people will take more risk to keep their perceived level of risk constant. In reference groups studied over the medium term, ABS braking systems did not change accident figures, as drivers progressively behaved in a riskier way.
Lastly, the optimism bias is probably the number one bias in our perception of risk and in our relation to safety. “Accidents happen only to others” is well known: we believe we succeed better than our fellows in the same activity. Again, evolution explains it: the “loser” animals tended to disappear, and those surviving learned to underestimate loss and to accept risk.
Moreover, an interesting aspect to keep in mind is our motivation for practising such sports, usually (and maybe wisely) considered risky. Here are four of them (Napier):
Thrill and adventure seeking leads to typical “close to the edge” sports like skydiving, motorcycling, base jumping or scuba diving, which provide the needed adrenaline.
Experience seeking attracts more intellectual oriented people toward art, music or culture, but always with some different or new aspect.
Disinhibition refers to socialising and outgoing behaviour, perhaps with a touch of alcohol or more. Fun and friends.
Boredom susceptibility means fear of routine and a constant search for new sensations or events.
Now it could be interesting to wonder why we need to fly. Are our motivations different from those above (feeling like a hero, asserting our ego, fighting an old depreciation from a parent, achieving a childhood dream…), or can we identify with one or more of those categories? Are we more at risk depending on our psychological profile? Does it influence if, when and how we make our decisions?
The lists above show that individual preferences are not based on predefined models; instead they are poorly defined, highly malleable, and strongly dependent on context. Moreover, they show that our intuition is guided by processes that originated long ago, in an environment very different from the big cities we live in or the fast vehicles we use nowadays.
Our brain, which we believe to be a reliable risk-assessment system, does not actually work so accurately. The differences from the optimal solutions a computer would produce are small, which explains why it is difficult to understand and accept its weaknesses. In Ian McCammon’s studies, almost two-thirds of the participants behaved dangerously even while being aware of the hazard, and their training level did not essentially change their likelihood of falling prey to heuristic traps.
Knowledge of meteorology or aerodynamics seems normal for anyone involved in airsports, while psychology focused on these traps does not. Some basics in that area may help change our minds and improve our safety.