From chemistry site Compound Interest:
The vast majority of people will get their science news from online news site articles, and rarely delve into the research that the article is based on. Personally, I think it’s therefore important that people are capable of spotting bad scientific methods, or realising when articles are being economical with the conclusions drawn from research, and that’s what this graphic aims to do. Note that this is not a comprehensive overview, nor is it implied that the presence of one of the points noted automatically means that the research should be disregarded. This is merely intended to provide a rough guide to things to be alert to when either reading science articles or evaluating research.
The chart, well worth printing out, offers comments on:
sensationalized headlines
misinterpreted results
conflicts of interest
correlation & causation
unsupported conclusions
problems with sample size
unrepresentative samples used
no control group used
no blind testing used
selective reporting of data
non-replicable results
non-peer-reviewed material
One thing the News desk would add is that peer review is not a cure-all. See, for example: If peer review is working, why all the retractions? and independent site Retraction Watch.
But the chart is a good start, and armed with it, a science teacher can dissect with students the far out (as in Pluto’s orbit) claims one sometimes encounters in pop science media.
Hat tip: Stephanie West Allen at Brains on Purpose