Reports of new research studies can be found almost everywhere on a daily basis: on the nightly news, in mainstream magazines, in forwarded e-mails, and on Facebook and Twitter feeds. But can these reports be taken at face value? Let's look at a few examples. First, a small blurb published in Reader's Digest in February 2007:
Fresh flowers can cheer up any drab room -- not to mention score points with your sweetie. But make us kinder as well? So says Harvard psychologist Nancy Etcoff, PhD.
She sent 54 people either a mixed bouquet or a candle in a hurricane glass. Flower recipients said they felt more compassion toward others than those who got the candles, and reported more enthusiasm at work too. Place blooms in the kitchen or bedroom so you'll see them first thing in the morning. That's when moods tend to be lower and blossoms can provide their biggest boost.
This sounds like a very official, authoritative study. It comes from a "Harvard psychologist" and even details a little about the methodology. However, I searched pretty extensively and was not able to find that it had been published in any academic journal. The researcher, who is an instructor teaching one class at Harvard, does not even list it as part of her published research on her Harvard web page. The study is, however, extensively touted on the web page of the Flower Promotion Organization and the Society of American Florists, which includes as part of the study "documentation" an article by an interior designer on where to best place flowers in your home. To me, this brings the study into some question. Who paid for the study? Was it commissioned by one of the organizations that exist to convince people to buy flowers? If so, there is a possibility of a bias in the methodology or conclusions. It's difficult to evaluate a study that has not been published. Why was it not published? I am left with more questions than answers, and I'd hesitate to place too much faith in this study.
As I suspect happened in this case, all too often media reports of studies are based on a press release alone, and the reporters never see the real study. Press releases are written to make headlines, not necessarily for complete accuracy, and reporters under a deadline don't always do much more than repeat the facts in the press release.
Another example, taken from my local newspaper:
Wow, pretty interesting findings, aren't they? Well, tracking down the actual study turned out to be difficult (you can read about it here), but when I did, it turned out to be a study on rats. Yes, rats. The newspaper article, with its references to "you" and "mom," implied that the research was done with humans, but it was in fact animal research. Results from animal research may or may not be applicable to humans.
In another article from Reader's Digest, the author lists a bunch of research findings about the potential harms of vitamins, among them this claim:
Researchers at the National Cancer Institute (NCI) found that men who took more than one multivitamin daily had a higher risk of prostate cancer.
However, the author oversimplified the NCI report in a way that can be easily misunderstood. To quote from the NCI report:
We found that multivitamin use was unrelated to overall risk of total and organ-confined prostate cancer. (p. 761)
The possibility that men taking high levels of multivitamins along with other supplements have increased risk of advanced and fatal prostate cancers is of concern and merits further evaluation. (p. 754)
A subtle difference, true, but also an important one. Multivitamins did not affect the overall rate of prostate cancer, or of the subgroup of cancers that have not spread. High levels of vitamin use (more than 7 multivitamins in a week, combined with other supplements) may have had an effect on the more rare, advanced forms of cancer.
That word “possibility” is very important. It signals a gray area where something has not been proven or disproven and could go either way. It may be that the vitamins and supplements contribute to the aggressiveness of cancers, but it may very well be that they don’t. That’s why the authors recommended further research. When you read the word “possibility,” think of it as a hypothesis – a guess – something that should be researched further.
In this study, researchers had a fairly small number of people with advanced cancer – only about 1/10 the numbers they had for localized cancer. This makes it hard to draw strong conclusions about the advanced cancers. Yet those conclusions are the very thing that the popular media article focused on!
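To see why a smaller subgroup supports weaker conclusions, here is a minimal sketch of the idea using the standard margin-of-error formula for a proportion. The counts below are hypothetical, chosen only to reflect the rough 1/10 ratio mentioned above; they are not figures from the actual NCI study.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p observed in n cases."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.5  # worst-case proportion, for illustration only

# Hypothetical group sizes: ~1,000 localized-cancer cases vs. ~100 advanced
large = margin_of_error(p, 1000)
small = margin_of_error(p, 100)

print(f"n=1000: +/-{large:.3f}")  # about +/-0.031
print(f"n=100:  +/-{small:.3f}")  # about +/-0.098
```

With a tenth of the cases, the uncertainty around any estimate is roughly three times wider, which is exactly why findings about the rarer, advanced cancers deserve more caution than the popular coverage gave them.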
These kinds of issues are fairly common in the popular media. Remember, the writers and reporters at the various magazines, newspapers, and web sites aimed at the general public are likely NOT trained in reading and understanding research. Without that training, they are prone to easy mistakes, like generalizing too broadly and missing subtleties such as the ones in the prostate cancer article.
With a little education and a careful reading of research, you can skip the mainstream media, learn to read the actual studies yourself, and avoid pitfalls like these. Next time, I’ll discuss how you can find the studies to read them for yourself.