What Do Scientists Think of Science Coverage in the Media?


What can journalists do to improve science reporting? Monkey Business/Jeffrey Coolidge/Getty Images

Did you know grilled cheese sandwich lovers have more sex? Time magazine reported on a study that says so. Don't take it too seriously though — it came from a dating site that surveyed 4,600 of its users, just in time for National Grilled Cheese Day.

In other words, it was totally unscientific. And Time covered it tongue-in-cheek. But that's not always the case. Sometimes a legitimate study might get covered in an illegitimate (but entertaining) way.

Dr. Martin Gibala is the chair of the department of kinesiology at McMaster University in Ontario, Canada. His study about sprint interval training was repeatedly characterized in headlines as a "one-minute workout," even though those 60 seconds of all-out sprinting were only one portion of a 10-minute regimen. "Some media did a fantastic job and they provided a balanced piece that was pretty much right on the mark," he says. "Other media jumped on a headline and sometimes you deal with media that haven't even read the study."

But he's sympathetic to the plight of journalists. "On the one hand, the media often wants to work in sound bites, or they want a message boiled down quite simply, and that's often a challenge because scientists are often by nature cautious and that frustrates the media because scientists are unwilling to go outside the bounds of their specific study," he says. "Ideally, they're able to meet in the middle and both be happy."

The slight misrepresentation didn't especially bother him, however. "In some respects, it's the price to pay a little bit because obviously as a scientist you want to make an impact, have people talk about your research," he explains, noting that a short piece could lead to an in-depth 45-minute podcast.

Sometimes, the misleading information is not the fault of the journalist. "Sometimes the study's not really good," says Dr. John H. Johnson, economist and author of the book EVERYDATA: The Misinformation Hidden in the Little Data You Consume Every Day. "Unless you're an expert in the field it can be hard for a journalist to know."

One instance of this would be the now completely discredited vaccines-cause-autism study, which has done untold damage to public confidence in immunization.

One of Johnson's favorite examples of wildly exaggerated "science" is the Mozart Effect study, popularly credited with showing that playing classical music to babies can make them smarter.

 "The study was actually done on three to four dozen college kids," he says. Somehow, this was parlayed into music improving the cognitive skills of infants and very young children. This is one case where Johnson believes it could have helped reporters, had they delved deeper. "Looking a little closer at what the study means and really thinking hard before extending it to the broader population." In fairness, the author of the paper never intended for her study to apply to infants.

Media are in constant competition for clicks, which can lead to sensational headlines. John Oliver, host of "Last Week Tonight," recently opined on the topic, referring to sensationalistic coverage of science as nothing more than "morning show gossip."

"Journalists want to tell stories that people are interested in, but the reality is that you have to deal with details," Johnson explains. "Most research, you're usually making small, incremental progress. It's a rare case when 'wow, today we found out this revolutionary thing is going to change all our lives.' It's a lot more interesting to tell a story about grilled cheese improving your sex life." 

Improving Science Coverage

So, how can media coverage of science be improved? Research scientist Luz Claudio, Ph.D., author of How To Write And Publish A Scientific Paper: The Step-By-Step Guide, advises researchers to write their own press releases with the help of their institution's press office to get the message out. "Also, they should answer reporters' inquiries in writing rather than giving telephone interviews in order to reduce the potential for misinterpretation of results," she says in an e-mail interview.

And what can readers, and journalists themselves, do? They can check who funded the study (would it be surprising that a coffee company funded a study that said coffee was good for you?). They can look at how many people were studied; if it's just 20 women, can the finding really apply to all women? And, most of all, they can understand the difference between causation and correlation.
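About that sample-size question: as a rough back-of-the-envelope sketch (the numbers below are generic, not tied to any particular study), the margin of error on a simple survey percentage shrinks only with the square root of the number of people asked, so small studies come with wide error bars.

```python
# Rough illustration: 95% margin of error for a surveyed percentage,
# using the standard approximation 1.96 * sqrt(p * (1 - p) / n) with
# p = 0.5 as the worst case. Generic numbers, not a real study.
def margin_of_error(n, p=0.5):
    return 1.96 * (p * (1 - p) / n) ** 0.5

for n in (20, 500, 4600):
    print(f"{n:>5} people: about \u00b1{margin_of_error(n) * 100:.1f} percentage points")
```

A 20-person study can swing by roughly 22 points in either direction, while even the 4,600-user grilled cheese survey narrows to about 1.4 points, which is why stretching a tiny sample to "all women" is risky.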

As for causation versus correlation, consider the study that said using an iPhone makes you smarter. If that were really true, that would be causation (just by using an iPhone you would suddenly be able to win on "Jeopardy.") "But when you look at the study," says Johnson, "what it actually says is that people who live in states with the highest percentage of users also have the highest percentage of bachelor degrees. One has nothing to do with the other!" That, my friend, is correlation, not causation.
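To see how a pattern like that can emerge with no cause-and-effect at all, here's a small simulated sketch (invented data, standing in for the real state statistics): a hidden third factor, education level, drives both numbers, and the two end up tightly correlated even though neither causes the other.

```python
import random

random.seed(42)

# Invented data: 50 hypothetical "states," each with its own education level.
education = [random.uniform(0.2, 0.5) for _ in range(50)]

# Both measurements track education (plus noise); neither affects the other.
iphone_owners = [e * 0.8 + random.gauss(0, 0.02) for e in education]
degree_holders = [e + random.gauss(0, 0.02) for e in education]

def pearson(xs, ys):
    # Plain Pearson correlation coefficient, from its textbook definition.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Prints a strong correlation (around 0.9), created entirely by the shared
# education factor, not by iPhones making anyone smarter.
print(f"correlation: {pearson(iphone_owners, degree_holders):.2f}")
```

The pattern in the data is perfectly real; it's the causal story layered on top of it that goes wrong.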