Science constantly gives rise to new information, new technologies, and new ethical dilemmas. To keep abreast of such changes, we need good science reporting in newspapers, on television, and online. However, there is a fundamental disconnect between the way science works and the way the media works, which leads to big problems in mainstream science reporting. This is excellently illustrated by two of today’s news stories.
From the BBC, we hear that daily caffeine protects you from dementia by blocking the effects of cholesterol. This is just another story in the ongoing cycle of coffee/wine/beer/chocolate/fat/carbohydrates/sugar being good/bad for you. Not a month goes by without some new scientific study bearing on the health effects of one of these foods. Such articles typically mention a single study of a single benefit of the food, paying no attention to other health benefits or detriments. They are often deliberately deceptive in their narrow focus and are useless for actually assessing the food. The second article (from the Guardian) makes this point well by reporting scientists’ complaints that there is actually no hard evidence that drinking several glasses of water a day does you any good.
The great problem with such stories is that science is a vast pile of knowledge, which grows at the edges in fits and starts. Many suggestive studies that come out are quickly overturned. The investigators may have used a poor technique or may have obtained a result due to chance alone. With so many papers published, scientists know that many false or misleading claims will appear at the edges, but that further investigation will ultimately sort things out. Science is thus particularly ill-suited to journalists who wish to report only the latest outlandish claims. Such claims may sell papers, but it is irresponsible to report them. If journalists really wanted to inform us, they would stop reporting the wild news from the frontier and start reporting the considered findings that follow years later when (and if) the results are confirmed.
Since threatening news is also more salient, there will be a tendency both to over-report risks and for the public to notice articles that mention risk. This produces a multiplicative bias: extra levels of transmission, such as having an expert give an opinion or editors selecting what is “hot”, compound it.
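To make the compounding concrete, here is a rough toy simulation with made-up numbers: suppose each stage of transmission passes on alarming findings a few times more readily than mundane ones.

```python
# Toy sketch of multiplicative bias: all numbers are illustrative assumptions,
# not estimates from any study. Each transmission stage preferentially passes
# on alarming findings, so their over-representation compounds stage by stage.
import random

random.seed(0)

N_FINDINGS = 100_000   # hypothetical pool of new research findings
P_ALARMING = 0.2       # assumed fraction that sound threatening or outlandish
STAGE_BIAS = 3.0       # alarming items assumed 3x as likely to be passed on
BASE_PASS = 0.1        # baseline chance a mundane finding survives a stage
STAGES = ["expert comment", "editor selection", "reader attention"]

# True = alarming finding, False = mundane finding.
pool = [random.random() < P_ALARMING for _ in range(N_FINDINGS)]

for stage in STAGES:
    survivors = []
    for alarming in pool:
        p = min(1.0, BASE_PASS * (STAGE_BIAS if alarming else 1.0))
        if random.random() < p:
            survivors.append(alarming)
    pool = survivors
    share = sum(pool) / len(pool)
    print(f"after {stage:>16}: {len(pool):6d} items, {share:.0%} alarming")
```

Each stage multiplies the odds of alarming versus mundane items by the same factor, so with three stages and a threefold bias per stage, alarming findings end up roughly 3³ ≈ 27 times over-represented relative to their true base rate.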
However, where does the ethical onus lie here? Ideally, researchers would explain clearly the limitations and context of their research, journalists would report accurately, and readers would critically compare different sources and interpretations. In practice, of course, all have their own agendas and biases (getting more funding, selling newspapers, being entertained or informed, etc.). But if one group were to try to straighten up, which one should it be?
From the standpoint of multiplicative bias above, the most effective way of reducing bias would likely be to reduce the number of levels of filtering between the researcher and the public, i.e. for the public to become scientifically literate and look up the original research. Does this mean that the public has a greater onus to educate itself than journalists have to report well, since that would improve understanding the most? Or would the cost of educating the whole population be so large that it would be better to focus on the smaller group of journalists and editors, even if they would be less likely to produce the same level of quality?
Toby, while I fully agree with the basic point of your post, I have some misgivings about one of the examples you use to illustrate it. The beneficial effects of moderate coffee drinking for a variety of disorders are by now well established. The case of coffee provides, if anything, a good example of the disconnect between the scientifically literate public and the general population that gets its news from the popular media. As this article notes, “recent advances in epidemiologic and experimental knowledge have transformed many of the negative health myths about coffee drinking into validated health benefits.”
Here’s the link to the article (apparently the HTML tags didn’t work):
http://www.sciencedaily.com/releases/2007/04/070430125523.htm