Continuing my theme of which test to choose in Prism when running the multiple comparisons tests we’ve learned about (typically offered as an option after an ANOVA), the next test I’d like to describe is Dunnett’s test.
Developed in 1955 by the Canadian statistician Charles Dunnett, this method addresses what is referred to as the multiple comparisons problem (or the look-elsewhere effect in physics). This problem arises when you consider a set of statistical inferences simultaneously: the more comparisons you make, the more likely it is that some result appears significant purely by chance. For our statistical analyses, the practical consequence is that running multiple tests on the same data inflates the overall type-one (false-positive) error rate well beyond the level we set for any single test.
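To make that inflation concrete, here is a minimal Python sketch (the comparison counts are purely illustrative) showing how the family-wise error rate grows with the number of independent comparisons when each one is tested at α = 0.05:

```python
# Family-wise error rate (FWER) for m independent comparisons,
# each run at a per-comparison alpha of 0.05:
#   FWER = 1 - (1 - alpha)**m
alpha = 0.05
for m in (1, 3, 5, 10, 20):
    fwer = 1 - (1 - alpha) ** m
    print(f"{m:2d} comparisons -> chance of >=1 false positive: {fwer:.2f}")

# With 10 comparisons, the chance of at least one spurious
# "significant" result is already about 40%.
```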
In short, what sets Dunnett’s test apart is that it is used only to compare each treatment group against a single control group. It addresses the multiple comparisons problem by holding the family-wise error rate at or below the level of type-one error you have chosen (typically α = 0.05). For this type of experimental design you wouldn’t use the Tukey or Scheffé methods of multiple comparisons, because those compare every group to every other group and therefore produce unnecessarily wide confidence intervals. Additionally, since Dunnett’s method is built on a t-statistic comparing each treatment group to the control, the statistical hypothesis of the experiment can be either one-tailed or two-tailed in nature.
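If you ever want to run the same comparison outside of Prism, recent versions of SciPy (1.11 and later) include a scipy.stats.dunnett function. The snippet below is a minimal sketch using made-up measurements; it compares each treatment group only against the control, and the alternative argument is where you choose a one- or two-tailed hypothesis:

```python
import numpy as np
from scipy import stats  # scipy >= 1.11 provides stats.dunnett

rng = np.random.default_rng(0)
control = rng.normal(10.0, 2.0, size=12)    # mock/vehicle control group
low_dose = rng.normal(11.0, 2.0, size=12)   # treatment group 1 (illustrative)
high_dose = rng.normal(14.0, 2.0, size=12)  # treatment group 2 (illustrative)

# Each treatment is compared only to the control, and the family-wise
# error rate across both comparisons is held at the chosen alpha.
res = stats.dunnett(low_dose, high_dose, control=control,
                    alternative="two-sided")  # or "greater"/"less" for one-tailed
print(res.statistic)  # one t-like statistic per treatment group
print(res.pvalue)     # multiplicity-adjusted p-values vs. the control
```

The adjusted p-values are read the same way as the ones Prism reports when you pick Dunnett’s test in its multiple comparisons options.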
A common application of Dunnett’s method for multiple comparisons that biological research scientists are likely to encounter is the drug treatment study, where a mock (vehicle) control serves as the baseline comparison for the various dosage or treatment groups. Something important to remember in the results of any multiple comparisons study is that non-significant results are also worth knowing. For instance, if you were comparing the effects of novel drugs against the currently accepted treatment regimen for a disease, a non-significant difference may help make the case for your new drug’s efficacy.