Thursday, April 7, 2016

"Awareness reduces bias"

In a recent NYT column, Nicholas Kristof discusses the pervasiveness of racial bias in America.

He asks,

Why do we discriminate? The big factor isn’t overt racism. Rather, it seems to be unconscious bias among whites who believe in equality but act in ways that perpetuate inequality.

We have to assume, I think, that our proclivities to hold biases infect our research habits in the same way they affect other aspects of our lives, including how we interact with people who differ from us.

How can we control for bias?

In the context of racial bias, Kristof's argument, in effect, is that simply realizing you have these biases goes a long way toward reducing them.

He concludes by describing a follow-up study of NBA referees, who had previously been shown to penalize African American players at a higher rate than Caucasian players. Once the refs were made aware of their bias, the follow-up study showed that the bias had been eliminated. "Awareness reduces bias."

So it seems that some simple level of awareness that we might be biased can be enough to actually reduce bias!

With biostatistics, try not to let the math, the jargon, the models, the symbolism or the protocol cloud the big picture.

The only reason to conduct proper statistical design and analysis of experiments is to control for bias. We follow these design and inference procedures not because they are better than alternative design and inference procedures, but because they make us aware of our innate tendency to be biased.
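The idea that design procedures make us aware of, and guard against, our own tendencies can be made concrete with a small sketch (hypothetical code, not from the post): randomized allocation takes the assignment decision out of the experimenter's hands, so no conscious or unconscious preference can decide which subjects land in which group.

```python
import random

def randomize_assignment(subject_ids, seed=42):
    """Assign subjects to 'treatment' or 'control' purely at random.

    A fixed seed makes the allocation reproducible and auditable,
    which is itself a guard against quietly re-rolling until the
    groups 'look right'.
    """
    rng = random.Random(seed)
    shuffled = list(subject_ids)
    rng.shuffle(shuffled)          # the experimenter never chooses who goes where
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# Example: 20 subjects split into two balanced groups
groups = randomize_assignment(range(20))
```

The function name and structure here are illustrative only; the point is that the procedure, not the person, does the assigning.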


6 comments:

  1. Do you think that it is actually feasible to completely eliminate our biases? I don't think it is in human nature to prove yourself wrong before you prove yourself right; we need to maintain some sort of confidence. Maybe the best way to eliminate your own biases is to set up a system of checks and balances, where you collaborate with other researchers (someone without a common interest in your research) and employ them as your voice of reason. While we'd like to think that we can completely eliminate our own biases, at the end of the day we are still driven by success (and perhaps paying the bills).

  3. I completely agree with the idea that being aware of our own biases allows us to consciously reduce our biases; however, I think that as humans, we are reluctant to acknowledge or admit that we have biases, especially if they are biases that affect us or paint us in a negative light (e.g. unconscious racism, holding an unpopular belief that you never thought was unpopular before someone challenged it/reacted negatively to it, etc.). I imagine that the reaction would be similar in science - being made aware of an unconscious bias to a certain condition or hypothesis and then realizing that your analyses and treatments have been biased all along. Of course, not everyone is uncomfortable with admitting that they have biases, but I also wonder if openly admitting and acknowledging that you have a bias can also lead to a different kind of bias (for example, seeing a borderline significant result but saying, "Oh, I must have been biased when analyzing this data or setting up this experiment, so this must not be significant" rather than simply trusting your skills and statistical analyses) or more bias. (But I guess that is also why randomization of samples and conditions is important in reducing experimental bias.)

  4. I really like this point: awareness can be amazing for reducing bias, but I also feel awareness can cause bias in other ways. If we head into an experiment aware of possible results, we may start to look for those results in our data. Even if we are trying not to be biased toward a positive result, how do we prevent hypothesis-driven questions from biasing us? We analyze data looking for affirmation or rejection of a proposed question, but what if there is a completely different reason/interaction at work? Too much awareness might keep us from seeing the picture we should see.

  5. I agree with the statement that awareness reduces bias. However, it is the “awareness” part that people have the most trouble with. People are usually ignorant of the fact that they are biased, and sometimes when they're called out on it, they're reluctant to admit it, mainly because our biases usually come from who surrounds us and how we were raised. I believe the same can happen in science. Sadly, when we are first introduced to statistical methods, we are usually not taught the proper way to conduct statistical analyses (this is made evident by all the papers with bad statistics). In a perfect scenario, the statistical design and analysis of an experiment would help us control for bias. However, a lack of knowledge about the correct way to set up a statistical analysis sets us up for bias from the start, even if we have the best intentions of reducing it. In other words, ignorance (in both social and scientific contexts) tends to direct us toward bias. This is why courses like this one, which teach the correct usage of different statistical tests, are useful and necessary. So, even though I agree with the statement that awareness reduces bias, I would also add that in order to be aware, one needs the proper education.

  6. Being "aware" of bias can mean a couple different things. The first interpretation warns us to be aware that bias exists and that we might be vulnerable to it. That much itself seems obvious. It would be hard to get to this point in our scientific careers without knowing how easy it would be to deliberately bias a result. The bias needn't be deliberate either, we have surely caught ourselves hoping for one outcome or treating certain data as more "important" based on how well it aligns with our preferred outcome. This is natural and human and I don't think people need to be taught that they might be biased.

    The second interpretation would be to be aware of specific biases. In the example with the referees, it isn't as though they noticed their biases themselves. They were shown their bias and then could correct it. I think this gets at the heart of the matter. We have no way of knowing our mistakes before we make them, and so just deciding to "become aware" of bias isn't really doing much. However, a middle-of-the-road approach seems to be making note of how you have erred in the past, and taking care not to make similar mistakes in the future.

    Additionally it seems like one needs the right tools to examine a situation (experiment, repeat, project, paper, etc.) and find the biases in it. On one hand it can be hard for an arbitrary party to assess when bias came into it, since a lot is in our heads and in subtle changes to how we might pipette something or how much we might mix a solution. On the other hand, if we were blind to the bias in the first place, taking an inventory of biases is pretty likely to miss this bias as well, unless we have experience with it, or unless, as Roxana says, we are educated against these biases.
