There’s an often-used saying that applies to learning and development interventions: garbage in, garbage out. If your inputs are bad, the outputs will be, too. And if you don’t want garbage, it’s less helpful to catch the stinky stuff coming out than to keep the trashy stuff from going in.
L&D professionals get a lot of advice on how to thoroughly evaluate a program – at four or five levels. At significant cost in time and effort, they create instruments, they lobby clients to allow them to gather data, and they analyze and report results – mostly to show the program did what it was intended to do.
But consider design and evaluation practices in other domains: for example, we don’t ask architects to subject every completed building to stress tests (evaluation after the fact). Instead, we rely on them to use their expertise to evaluate the site and design a solid structure; we have other experts sign off on the plans; and while building, we submit critical elements to inspection and testing to ensure standards are met.
I think a better use of L&D resources would be to ensure programs are high quality before they go out. That requires two investments: better analysis up front, and better attention to quality in process.
Better needs analysis. A frequent complaint I hear from designers is that they don’t get to “do” needs analysis. Projects are handed to them by others who haven’t necessarily done a thorough review of the situation at hand. L&D professionals are too often designing without a solid understanding of the problem the training is meant to solve, the skills being developed, the people who are the intended learners, or the environment in which they work. It’s critical to ensure we’ve identified the right knowledge base or skill set that needs strengthening. And it does little good to boost people’s skills if their everyday workplace doesn’t support using those skills, or indeed actively discourages them in favor of other actions that appear more efficient or better rewarded.
L&D professionals should continue to strengthen their needs analysis skills, including the judgment needed to tailor the approach to each project efficiently. It’s about asking better questions of the right people, and about influencing sponsors and stakeholders to allow the kind of assessment that will clearly define the knowledge and skills needed, as well as the barriers and supports for enacting them in the workplace.
I know getting a good needs analysis underway is not always easy. I’ve done my share of projects that – due to a variety of constraints – got off to a shaky start without the depth of needs analysis that might have led to stronger recommendations. A program can be designed with a light needs analysis – especially if you bring a lot of experience to the project – but that approach carries real risks. Better to invest time in analysis up front than to find out afterward, through evaluation, that what was done missed the mark.
Better in-process quality review. As they design interventions, designers need to focus on the degree to which those programs align with what is known about learning. They should run mental checklists (if not actual checklists) of the quality indicators for active training design, learner experience, visual design and accessibility, or whatever best applies to the training and development program at hand. In that spirit, for example, webinars would follow recommended practices, mentoring programs would account for what research has shown makes mentoring work, and learning games would meet the criteria for effectiveness as a support for learning.
L&D needs more up-front quality review and formative evaluation in the design process. Building with quality assumes designers are versed in the proven techniques that support learning – and cognizant of each technique’s nuances, not just a set of bullet points that sketch out the approach. A quality review effort might include devising quality checklists to ensure programs are well designed and fit for purpose, and it might also include conducting real pilot tests – of specific activities or of the whole program – with enough time to make changes before roll-out if results are not strong.
I’ve been using my own design quality checklist* for quite a long time, and I add to it regularly when I come across additional research that I want to remember when I’m designing programs. I run that checklist before I finalize a design – to make sure I’ve accounted for everything that’s important for the project, and to remind myself of guidance to keep in mind as I develop the program further. I certainly don’t hit every item every time: some items don’t apply, some can’t be achieved within constraints, and some I don’t check because <reasons>. But running the checklist helps me refine my approach, and it makes me more confident that the resulting program will hit the mark.
Using evaluation for decision-making
Despite this post’s provocative title, I’m not suggesting that evaluation is useless. L&D professionals should pursue post-training evaluations when those evaluations are going to be used to make important decisions. Evaluations can be especially useful if designed to assess the results of a pilot and to identify needed improvements before wide roll-out. They can help to guide learners to extended activities if the training program didn’t quite work for them. They can validate that learners have the knowledge or proficiency to perform key tasks. If designers can identify specific decisions that will be made based on the data collected, then by all means, invest in making data-informed decisions.
But let’s not slavishly implement evaluation procedures just because. If we don’t allow garbage in, we’ll seldom get garbage out. And maybe our time is better spent focusing on the quality of the inputs rather than performing a sniff test on the outputs.
* My checklist shorthands some detailed guidance and may not be useful to those who don’t have knowledge of or access to the research from which it was drawn. (Some items I even have to refresh myself on if I think they apply.)