One of John Oliver’s comedic reports is making the rounds in my Twitter feed this week. In the Last Week Tonight video, Scientific Studies*, Oliver calls out the absurdity of headlines driven by narrow research results – and worse – by complete misinterpretations (or misreporting) of research results. It’s funny and scary at the same time, even more so because the diatribe hits so very close to home. One particularly important point from the video:

“Scientists themselves know not to attach too much significance to individual studies until they are placed in the much larger context of all the work taking place in that field. But too often, a small study with nuanced tentative findings gets blown out of all proportion when it’s presented to us, the lay public.”

Since I’ve written a number of blog posts and articles on research in L&D (frequently for ATD’s Science of Learning blog), my first reaction was to worry whether my own reporting has ever selectively highlighted results to improve the headline (so tempting, so easy to do!). Reading my posts, I think that I have been fair about reporting what the research says, noting the nature of the study, and sometimes pointing out that the findings were from only one study. But those reading headlines and bullet points may not have caught those nuances. Headlines, especially, are meant to grab attention and are often a bit hyperbolic.

When you work in learning and development, research abounds. We have a huge cache of research and evidence-based frameworks related to overarching learning processes (e.g., social learning, cognitive processes) and related to specific techniques that have an impact on learning, retention, and performance. Practitioners in our field are often very interested in hearing the evidence and understanding how best to apply it in their own contexts.

But here’s a worry: In our quest to be more evidence-based, it can be easy to misstep – to think we are basing our practices on well-vetted ideas when we’ve actually stumbled into misrepresented science. Here are some ways you can avoid being taken in:

Read beyond the bullet points. When you hear about research that seems to be relevant to your work, take the time to read the full study before you act on the headline. It is very tempting to read bullet points and believe you have the gist of what the theory and research actually say, but it is also too easy to misinterpret the meaning of those bullet points. The internet gives us plenty of slide decks and short articles that summarize research, but these often miss important nuance and context that you need to know before you apply the ideas yourself.

Vet the sources of information. Check into the background and expertise of the people who are conducting studies and publishing reports. Consider whether research is being played up in order to sell a particular product or service, and investigate those claims thoroughly. Of course, sponsored research isn’t necessarily bad research; but it is important to check the study’s design and data sources.

Review cited materials. It is also important to go to the source of findings and recommendations reported by others. In a recent discussion in one of my online courses, the students latched onto the message in a short article that contained a statistic from a recent study by a reputable research organization. The presence of that bit of research evidence gave weight to the author’s message. But the author’s point was not actually supported by the study (which I only knew because I had read the study being quoted). The lesson is to go to the source of interesting findings and advice, and draw your own conclusions.

Look for support and critique from others. Use internet search techniques to find articles that reference the work you find interesting. This is often useful when you are intrigued by popular books and videos (or when your client brings one of these to your attention). It might be important to find critiques of the work; it’s possible you won’t find the critiques credible, but sometimes they rightly call conclusions into question. (Some tips: In an internet search, couple a book title with the word “review” or “critique.” Use the search operator link:<URL> to find sites that link to that URL. Use Google Scholar to find other works that cite a particular research study.)

Ask your network for advice. Tap into your colleagues and professional social network to explore whether others are following the guidance you hear and whether they are having success with it. This is one of the real benefits of having a wide personal learning network.

Develop mutually supportive relationships with scholars. When the research you read is full of obscure language, it can be helpful to speak with an academic colleague about your interpretation of the material. You might even consider contacting the researcher directly. When I wrote some research columns a few years ago, the individual researchers helped me see that my interpretation of their results wasn’t quite justified by the evidence, and they helped me refine my recommendations. A scholar in a relevant field of expertise may also be able to point you in the direction of additional theory and research related to the topic at hand. It can be invaluable to a scholar to have an inside view of practice as well, so initiating conversations in the academic community may be more welcome than you think.

Strengthen your own knowledge base. Make it a point to follow the research in your area of practice. Identify a few leading research journals and regularly scan abstracts to find articles of interest. Read scholarly books that synthesize research into practical recommendations. As you are building your personal learning network, follow scholars as well as practitioners so that you are linked into deeper academic conversations relevant to your work.

Following this advice may help you to avoid being taken in by research reports that don’t tell the whole story.

*Caution: some profanity