Experiments fail. Money and resources are wasted. Time is lost.
In today’s scientific climate, negative results, or non-significant results,
are synonymous with failure. Spending months to years pushing through an
experiment to end up with a non-significant p-value for a hypothesis is
discouraging. There is a pressure and bias toward publishing positive
findings and withholding negative results.
Publication bias occurs when the “probability that a study is published depends on the statistical significance of its results”. Studies estimate that positive results are three
times more likely to be published than negative results. This bias exists on multiple levels. Groups
with negative results are less likely to submit to journals
for publication, assuming the negative outcomes are a mistake on their part, or that no one
is interested in reading about negative results. At the peer review level, reviewers are more
likely to question the quality of the science when negative
results are reported, especially if those results oppose ‘common knowledge’ in
the field. At the publishing level, journals, particularly ‘high-impact’
journals, want to publish groundbreaking research, which may come at the
expense of reproducibility or other measures of good science. As it stands, negative
findings, regardless of the quality of the science behind them, do not attract
publishers. Publication bias skews the literature, wastes
time and money, and limits the knowledge that can be gained
from research.
When classmates casually comment that a study “contains
no negative results, so it deserves to be in *insert high-impact journal*,” the magnitude of this bias is clear
(I also clearly have an issue with ‘high-impact’ journals). The quality of science
should not be judged solely by the results obtained. As scientists, we should be
driven to find answers to questions, without automatically discounting and discarding
negative results. Scientists, peer reviewers, and publishers need to take a
step back, acknowledge these biases, and reevaluate the value that negative
results can add to the field.