If a local journalist were to write an article about a local pol, using phrases like “this suggests that,” or “may indicate,” or even “potentially,” they would rightly be lambasted for speculation. In conventional journalism, you aren’t supposed to fill in the gaps with too much of your own analysis.

In covering science, you also cannot fill in the gaps too readily; for accuracy's sake, you probably shouldn't try to fill them in at all. But counterintuitively, if you leave out hedging phrases like those above, you're guilty of a sin just as serious as the political reporter who used them would be.

A recent Poynter article collates some of statistical guru Nate Silver's admonitions to the rest of journalism about statistics. For the most part, those same rules apply to science journalism:

4. Take the average, stupid. Recent stories that reported Oreos are as addictive as cocaine failed to reflect the subtleties of the research that prompted the articles, Silver said. Doing the work isn’t necessarily difficult: Silver’s blog, FiveThirtyEight, uses a simple count of polls and averages in some of its analyses.
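To make the "take the average" point concrete, here is a minimal sketch in Python; the poll names and numbers below are invented for illustration, and a real aggregator like FiveThirtyEight weights polls by recency, sample size, and pollster quality rather than averaging them naively.

```python
# Hypothetical poll results (percent support) -- these figures are made up for illustration.
polls = {
    "Poll A": 47.0,
    "Poll B": 44.5,
    "Poll C": 46.2,
    "Poll D": 43.8,
}

# The "simple count of polls and averages" idea: report the mean across polls,
# not the single most dramatic result.
average = sum(polls.values()) / len(polls)
print(f"Average across {len(polls)} polls: {average:.1f}%")
```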

One real problem is that, even more than with political coverage, audiences and journalists alike tend to take in science news as "news × prejudice." If the new data supports an already-held belief, it is an amazing new study. But if it challenges that belief, then it must be dubious science, no matter the methodology.

The trouble is: studies rarely if ever "prove" anything. All studies are on some level flawed, and if a study is in any way psychological, or requires the honest participation of humans who know they're being watched, it is necessarily riddled with pitfalls. The most any one study can do is add to or subtract from the body of evidence supporting a theory.

It is also worth noting that university press releases serve to advertise the school's work and help it secure more grant money. Even if the science is sound, the resulting press release may itself be a bit overblown.

In particular, the recent University of Rochester news about a sleep study, suggesting that sleep is when the brain does the lion's share of its cleanup, has received a huge amount of press, both locally and nationally. Most of it has been effusive, because, of course, we all love sleep.

But while the findings of the report are significant, they are probably more significant in the context of the work the University has been doing on the glial gridwork of maintenance systems in the brain. That work is ongoing and promising, but ultimately inconclusive. On its own, this one study mostly confirms a long-standing assumption about the nature of our circadian rhythm: that our down time is spent patching up our bodies for the next day.

Since that thirty-second segment on the morning news may be the only science news a lot of people see in a day, it is important not to oversell it. And if a significant part of journalism is trying to convince an audience that the news you're reporting is worth their time (it is), then the temptation to amp it up is a natural outgrowth.

But science “facts” have a curious way of boomeranging in our culture, disappearing into the echo chamber and coming out as something bizarre. It is worth slowing down when discussing science publicly to at least acknowledge what we do not know.