One interesting common thread that many of the articles touched upon was how science is
portrayed to the public and the media, and how this creates both misrepresentations
of what the data actually say and pressure on scientists to live up to
these misrepresentations. Based on
conversations I have had with family and friends who work outside of the world
of science, I think that many of these people’s perceptions of biomedical
research and the scientific process are far different from what actually
happens. Most picture that “eureka”
moment where one mysterious, colorful liquid is dropped into another one and a
miracle cure to a terrible disease is created.
As we all know, this is not what happens, and it is very often not even the
goal of a given project. Media
portrayals of exciting and important science that does emerge often use
buzzwords like “miracle” and “breakthrough”, even when, as one of the assigned
articles stated, the effects of these drugs do not even have human data to back
them up. Marketing science this way puts
pressure on researchers to produce results that can make headlines and
inevitably introduces bias into what is supposed to be an objective
process. Would it be better, then, to
give a more realistic portrayal to the lay public so that they have a better
view of what good science really looks like?
Perhaps, but some of the public’s trust in science and the reason they
see it as important comes from the idea that science provides grand, definitive
solutions to serious medical problems.
Do you lose public trust and media interest from more accurately billing
scientific discoveries as small, incremental steps that lead to a bigger
picture? Can we trust the public to see that bigger picture (one that sometimes
it is difficult even for the scientists to see)? I think it’s a tough but
important problem for the scientific community to address in the fight
against biased research.
Monday, January 22, 2018
Monday, January 18, 2016
Misrepresentation: A Two-Fold Problem
I specifically chose to read the Vox pieces and the Economist article to become familiar with how misrepresentation in science is perpetuated by popular journalism. The Belluz article introduces the hyperbolic nature of popular journalism as it muddles the field of drug discovery, while suggesting a couple of likely culprits for these misleading exaggerations. Belluz suggests that medical overhype is endemic in propagating the "breakthrough" culture of journalistic writing, but also highlights that 55% of these exaggerations are generated by the journalists themselves. The difficulty in replicating many publications raised in the Economist piece should not go unheard, but I posit that the scientific community will always experience a degree of misrepresentation and scrutiny when a non-scientist is asked to relay the meaning of advanced research. I find this suggestion especially relevant to Belluz's focus in the latter half of the article: despite acknowledging that a majority of the linguistic hyperbole is journalistic in origin, she retains her focus on medical overhype while dwelling on the medical community's smaller, but admittedly relevant, role in causing these exaggerations. To me, closing this gap in translating scientific writing into journalistic writing is just as important for reducing the pressure on scientists to discover "breakthroughs" in an increasingly competitive and unforgiving environment, let alone for preventing the true nature of a scientific discovery from being muddled by the polarizing rhetoric already rampant in popular journalism.
But to address the important points raised in the Economist as well as the Berg opinion piece, I feel that a greater sense of self-scrutiny must be present in the scientific community. If manuscripts submitted to large journals such as PLoS ONE fail the simple requirement of having a sound methodology, then it becomes obvious not only that requirements are set so low that significant discoveries might become increasingly insignificant, but also that the barometer of what the scientific community (or a part of it) accepts as a threshold for contributing to work in one's field is alarmingly low. The utility of the PubPeer forum analyzed in the second Belluz piece presents a small solution for increasing the stringency of published work in a peer-to-peer manner. Complaints that it could become a gateway to slandering sound work describe an understandable risk, but the alarming numbers concerning current publication irreproducibility warrant that risk until journals themselves exert stricter requirements in both their review processes and their submission standards.