Junk Science

“Baby, you believe what you want to believe.” – Tom Petty

This past weekend there was a report that the mysterious case of Jack the Ripper had been solved. Not by Scotland Yard, or INTERPOL, or even the Smithsonian Institution.

No, the case was solved by an amateur detective and a university professor. While I am not qualified to comment on their qualifications as experts in their fields, I do note that it didn’t take long to see a follow-up article that reminded us to curb our enthusiasm for the mystery having been solved.

The story and subsequent article reminded me about how common junk science is these days. Now, I’m not here to tell you that the work of Russell Edwards and Dr. Jari Louhelainen is junk science. Like I said, I’m not qualified to make such a judgment. But choosing to publish their story in a newspaper (I think the Daily Mail still qualifies for that) rather than in a peer-reviewed publication seems like an odd choice.

So, what is junk science? We hear the term a lot these days with regard to global warming (now renamed “climate change”). The idea that we humans are having an effect on our climate is a political issue, which causes the phrase “junk science” to be lobbed back and forth. But to me, junk science (or junk research) is something that shows a lack of quality, a lack of asking the right questions, and a lack of openness to the idea that your conclusions may be wrong. You know, all the things that make up the scientific method.

There is another issue at play here. This type of junk research could be submitted to 100 peer-reviewed publications and be rejected by 98, but the two that do publish it can make those ideas seem mainstream. For an example of this you need only look at the MMR vaccine research. This is a problem, and one I find happens all too often these days. It only takes one false report broadcast on a “news” station for it to suddenly become fact.

Perhaps Mr. Edwards felt that, given how unreliable information can be for those of us living in the Information Age, releasing the results to the Daily Mail first was just as good a place as any peer-reviewed publication. After all, it’s not like anyone is going to go to jail, or die of measles if they are wrong.

But this is an example of how bad things have gotten for the scientific community. People have no idea what to believe. What we are left with are people forming alliances with scientific results that align with their core values. In short, if the science backs their current beliefs, then it must be true, and everything else is false, no matter what the source.

As data professionals we often fall into the same trap. We prefer to work with tools and results that are familiar to us and support our beliefs. How many times have you heard people vehemently oppose anything made by Microsoft, no matter if it is the right choice for them or not?

If you find yourself needing to persuade others that your research and data have led to a proper conclusion, here are the steps I suggest you follow:

  1. Ask yourself a question. (What is this data telling me?)
  2. Research and gather metrics regarding your question.
  3. Form a hypothesis. (I think this data is saying that…)
  4. Test your hypothesis with an experiment.
  5. Test again, trying to prove your hypothesis wrong.
  6. Analyze your data.
  7. Draw a conclusion.
  8. Communicate your results, even if they don’t support your belief.
  9. Accept the fact that you may still be wrong.
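For data professionals, the testing steps above (4–6) can be sketched in code. Here is a minimal permutation test, a simple way of asking “could this difference be explained by chance alone?” The scenario, data, and function name are hypothetical illustrations, not taken from any real study:

```python
import random
import statistics

def permutation_test(sample_a, sample_b, trials=10_000, seed=42):
    """Estimate how often a difference in means at least as large as the
    observed one appears when group labels are shuffled at random."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(sample_a) - statistics.mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n = len(sample_a)
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n]) - statistics.mean(pooled[n:]))
        if diff >= observed:
            extreme += 1
    # A small value means the observed difference is hard to explain by chance.
    return extreme / trials

# Hypothetical metrics: query times (ms) before and after an index change.
before = [110, 120, 115, 130, 125, 118]
after = [95, 102, 99, 101, 97, 100]
p = permutation_test(before, after)
print(f"Estimated p-value: {p:.4f}")
```

The point of step 5 is baked into the method: instead of declaring victory because the averages differ, you actively try to show the difference could be random noise, and only move on to a conclusion if that attempt fails.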

That last step is a doozy for most. But it’s how you build and earn trust. It’s also how you learn to be cautious when drawing conclusions without having sufficient data or evidence to support those same conclusions.

It’s hard to believe anyone when everyone believes only what they want to believe.
