By reviewing a random subset of
thoughts that occur to me throughout the day, it is clear to me that the
arguments that I make and conclusions that I reach on a regular basis are much
more likely to be biased in some way than to be accurate, truthful reflections
of the state of things. This is likely caused by both innate biases
of the human brain, struggling to build operational frameworks of the world
based on limited input while minimizing time and effort (i.e. an evolutionary adaptation),
and a certain alignment of incentives in our lives that makes it rewarding enough,
and/or not damaging enough, for us to warp our interpretations of reality,
intentionally or subconsciously, to fit a certain mold (see Dan Ariely’s TED
talk [here]).
Nowhere are there higher stakes
for recognizing and minimizing bias than in scientific research, which is built
on the foundation of seeing the universe as it is, rather than as we want it to
be. Still, internal and external biases abound, from large scale publication
bias and research “trends”, to reports that unexpectedly high percentages of
published research are poorly executed or flat-out wrong (see “Trouble at the
Lab” [here]). The incentive structure around science, which funds the grander
claims, rewards prolific publication of novel results, and undervalues quality
assurance and scientific “due process”, is not helping either. Every time
research funding is quickly cut at the first sign of economic uncertainty,
every time the media rushes to report unconfirmed results, creating extra,
unneeded incentives, I can’t help but feel that science is still trying to
prove itself to society, to justify the increasing funds coming its way.
The hope of those pondering the
fallibility of science, including myself, is that it is a self-correcting
process. That amid the chaos, or precisely because of it, the truth will
prevail while houses of cards built on wrong hypotheses will come crashing
down sooner or later, however high they may rise. Perhaps by slowing the process
down a little, and investing a little more in the “uncool” science of
checking and verifying, we can make science a more efficient and less expensive
endeavor.