Monday, April 4, 2016

Coincidences and Bias

In the "Introducing Statistics" section of Intuitive Biostatistics, the author explains how probability and statistical thinking are not, in fact, intuitive.  Because we are human, our brains are hardwired to look for patterns, even in data that were actually generated at random.  As an example of this, the author points out that coincidences are much more common than we realize because "it is almost certain that some seemingly astonishing set of unspecified events will happen often, since we notice so many things each day" (p. 5). 

I have often noticed that soon after I learn a new word, I will repeatedly hear that word used over the next several days, even though I can't remember ever hearing it before in my life.  This happened to me a lot when we learned SAT vocab in high school English; I remember repeatedly hearing and reading the word "gregarious" outside of class after first learning what it meant.  Apparently this is a common enough phenomenon that it has a name: the "Frequency Illusion" (also known as the "Baader-Meinhof Phenomenon").  Basically, learning a new word primes the brain to pay more attention when that word comes up again.  I probably heard "gregarious" no more often during my sophomore year of high school than in the previous fifteen years of my life, but because my brain was more attuned to the word right after I learned it, it seemed like I was hearing it all the time. 

This example illustrates how pattern-seeking and unconscious biases can influence our perceptions, even in the most mundane of situations.  Because scientists are also human, our interpretation of our findings can be affected by these same cognitive biases.  Statistics and rigorous experimental design are imperative to prevent our biases from clouding our scientific judgment, leading us to see results where no pattern actually exists.


  1. I was quite literally talking about this with my lab yesterday. Especially from a neuroscience perspective, pattern recognition (or pattern completion) is an extremely useful information processing shortcut, but you are right in that we find it difficult to prevent erroneous pattern completion.
    I find it even more interesting when you think of pattern recognition in the scope of sensory learning. In a lot of highly specialized fields, experts can recognize distinct patterns by 'training their ear' while being unable to explicitly state what patterns they are hearing. This happens in everyone from avid bird watchers to electrophysiologists listening for the characteristic firing of hippocampal pyramidal neurons.
    So we need our pattern-seeking brains to guide us when our conscious experience fails, but unconscious biases can be the hardest to shake.
    Stupid brain.

  2. Great food for thought, Sarah. It is important to recognize that we are prone to mistakenly identifying patterns as important. One part of the remedy for this, as you mention, is discipline in our statistical analysis and experimental design. This helps when we are biased into seeing patterns that aren't really there. The statistics can often tell us whether there is a significant difference between groups or how far groups deviate from a norm, etc. However, I think another challenge arising from our need to organize information and see trends is determining when those trends actually matter.
    For example, if I see three red cars parked at the Chevron on Clairmont, it doesn't necessarily mean there is anything special about red cars, or that Chevron, or that day, or Clairmont Road. The same applies to science and is often where that element of uncertainty enters the equation. Sometimes I'll see a pattern (or what I think might be a pattern) that I wasn't expecting. Then, with statistical analysis to tell me whether the trend is real and experience to tell me whether anyone gives a rat's ass about it, I can assess how much meaning these data have.