Why is calculating the ROI of L&D like finding a needle in a haystack?

This article was published in People Management

Learning professionals have long wrestled with showing the business impact of their work – could coronavirus be the catalyst for change?


For decades, L&D teams have come under fire for not being able to show the impact of what they do. ‘Happy sheets’ is a loose – and not altogether complimentary – term for evaluation by learner feedback, the reaction level of the Kirkpatrick model of tracking impact, which dates back to the 1950s and is still very prevalent today. A number of measurement models have emerged in the intervening years, with varying levels of complexity. But the fact remains that measuring whether a particular learning intervention has delivered real return on investment – or ROI – is genuinely tough.

“We’ve been beating ourselves up about this for a long time, but there are lots of reasons that it’s difficult,” says CIPD head of learning Andy Lancaster. “L&D professionals are often identifying what they’re trying to do too late in the equation, for example.” Other factors include pressures on resources and managers not being involved in the process. Add to this the fact that employees often learn from watching colleagues or searching online for ‘just in time’ support, and it’s hardly surprising L&D teams struggle to evidence hard impact metrics. 

Which is not to say they’re off the hook. With the pandemic shifting budget priorities and how we work – not to mention how employees learn – the pressure to show that value is greater than ever. “We’re in a climate where every penny counts; business leaders want to know what’s being done is delivering,” says Paine.

According to the CIPD’s Learning and skills at work report, published earlier this year, 70 per cent of organisations do evaluate the impact of their L&D initiatives in some way. The rub is that measures tend to focus on satisfaction and feedback (those happy sheets again). Just 16 per cent assess any behaviour change in participants and how learning is transferred to their day-to-day work, and two out of the top three measures cited by organisations were ‘learner reflection and feedback’ and ‘manager reflection and feedback’. Just 14 per cent use measures linked to business strategy. 

Another issue is capability: a recent survey by Emerald Works found that 98 per cent of respondents believe programme evaluation is a critical skill, but only 38 per cent think they have it. More worryingly, perhaps, the CIPD research found 16 per cent rarely use the findings from the data they gather, while 17 per cent ‘do not know how the evidence they gather is used’.

Dr Ina Weinbauer-Heidel, founder of the Institute of Transfer Effectiveness, has dedicated her research to how training makes an impact on employees’ daily lives. While delivering leadership programmes at a business school, she began to question what was happening with the learning taking place: “I came to a point, as so many L&D people do, where I asked myself: ‘do I really make a difference?’ It’s not about putting knowledge into people’s brains, it’s about changing behaviour.” She researched countless learning impact models, including Kirkpatrick’s, and found around 100 determinants of whether learning transfers, “so no wonder it’s too complicated”.

So she boiled these down into a series of trackable levers that have a tangible influence on outcomes. They focus on contextual measures, such as the learner’s motivation, the role of their supervisor or how they interact with peers. “We sometimes follow a myth in L&D that training alone will change people and their behaviour,” she explains. “But in the end it’s about the organisation around them – was their supervisor on board? Were they committed to making a change? It’s not always the fault of the training if it fails.”

Similarly – and slightly counterintuitively – it’s not always a straightforwardly positive sign if training appears to succeed, which is why completion rates are often a false friend when it comes to measuring true impact. In fact, sometimes high engagement, particularly with digital learning, should prompt more questions rather than a round of applause, according to learning analyst Laura Overton: “Employees are probably looking for something that makes their lives easier, but if you could fix the task or the process before sending someone to the learning, surely that would have an immediate impact? Once you’ve cleared the obstacles, you can really see the skills that are needed and invest in that. Focus on what needs to change, rather than how to get a return on investment.”

Too often, though, L&D teams measure for the wrong reasons or ask the wrong questions, adds Overton: “If we are looking to measure the impact of learning interventions because we want to prove our worth then we’re on the back foot. If we’re trying to prove ourselves, it’s a fixed mindset: to validate what we’ve just done.”

Steve Dineen, founder of learning technology platform Fuse, believes organisations too often neglect where most learning is happening: on the job. “Lots of learning technology focuses on testing how much knowledge you have retained rather than how much you have applied,” he says. “Instead, they should be looking at what the right metric is to connect to business performance, and design the programme to be in tune with that.” 

That doesn’t mean organisations should abandon “operational efficiency” measures altogether, says Kevin M Yates, a learning impact consultant who calls himself the ‘L&D detective’. “You just need to tell the whole story,” he explains. “So if you have a sales force of 500 people with a growth goal of 20 per cent, a large percentage of that population needs to do the training to drive that growth goal. And if only 100 people consume it, you might not hit that goal. Where we get stuck is ending the story there. The next question is: what is the result where people completed the training?” 

One challenge learning teams often face is pinning down those in the business looking for a change in performance so they can work with them to identify what a positive impact might look like. “The business often doesn’t have time to have that conversation to work out what they need,” Yates explains. “In this case, L&D needs to be honest with the business that not investigating their needs more deeply might mean they can’t demonstrate how they’ve moved the dial.”

David Wilson, CEO of analyst group Fosway, agrees: “There’s been a ‘conspiracy of convenience’ in the past, where it’s suited L&D and the business to put on courses, define a learning outcome and then show that they’d met it. In commissioning that course the business didn’t really want to be challenged about the outcomes. The money’s already been signed off and no one questions it too much. But it should be the business objective, not the learning objective, that the learning supports.” In this quest, measuring outcomes rather than financial metrics may be more realistic, he adds: “Look at the time it takes someone to do a job; how many calls were answered correctly the first time, for example. Focus on the things the business already measures and values, and how you impact them – then it’s up to the business to put a financial value on that impact if it wants.” 

Encouragingly, working closely with the business to define these impacts is exactly what many savvy learning professionals have been doing during the Covid crisis – with the pandemic in many cases accelerating this positive trend. When Ann Summers, for example, was faced with the shutdown of its physical stores and sales reps being unable to host parties, its team beefed up online learning so reps could become skilled at hosting Facebook Live parties – and party planning sales went up by 300 per cent. Another retailer, beauty specialist Deciem, found many employees felt isolated after its shops and labs closed, so it focused learning content on mental health and community building – and productivity rocketed. “This isn’t traditional L&D fare but these aren’t traditional times – the team was meeting immediate needs,” says Paine. “Once you get that co-creation model in your head, you stop asking stupid questions.”

This collaborative approach also means L&D teams can work to continually improve content and delivery in a clear-sighted, business-focused way. New tech-driven delivery methods make this process far quicker and more effective than the ‘happy sheets’ of in-person training ever were. Leeds Building Society, for example, has found that moving to a more blended learning environment means it can track metrics around user engagement and behaviour more easily and tweak things accordingly. “We are still learning how to use the data we have to best effect,” admits Becky Hewitt, director of people. “But we use this to refine our offering. If feedback tells us material is difficult to navigate, we listen and change it in the next update.”

Pandemic-fuelled virtual working this year has arguably been a blessing in disguise for L&D, then, in terms of accelerating this kind of business-focused delivery and impact monitoring. But the crisis also presents a vital challenge for L&D teams, Wilson believes: “The genie is out of the bottle now and many firms are looking at where money is being invested. Lots of stuff that had borderline value will never be restarted, and L&D will need to focus on topics more aligned with where the business needs to go. This will change the conversation around what has value and what has impact.” 

The important thing is to remain curious, concludes Yates: “The traditional view is that the thing we create, the thing we design, is where we end our job. But if we’re curious, we’ll focus on what the result is of that thing we delivered – whether it activated performance. And this is more critical now than ever.”  

L&D measurement in action

Domestic & General

Domestic & General (D&G) provides service plans for around 23 million appliances across the world. In the UK alone, around one in three households will have a service plan with the company. Aligning D&G’s learning strategy with its core business goals of international expansion, digital transformation and embracing new ways of working is a priority for group head of learning operations Sonal Sutcliffe.

Evaluating the impact of learning can be more challenging in some areas than others, she says: “It’s much easier in the contact centre as we can say that a certain intervention helped to generate a certain amount of sales. But when you get to leadership training, it’s much harder to articulate how we put a group of people through a programme and how it’s influenced something they’ve done.”

In contact centres (where employees have been home based since March), the learning team uses a balanced scorecard looking at multiple metrics for call quality, customer satisfaction, how many calls have been taken and whether employees are covering the right things in their calls. For new staff, they look at speed to competency – the time it takes to reach the level of a more experienced and successfully performing employee. The company’s usual three-week face-to-face induction has been adapted into a four-week virtual programme, and new starters are up to speed in half the time compared to before the pandemic, says Sutcliffe.

In January, the team ran a development centre for future leaders comprising interviews, team exercises, management presentations and reports. “These were observed by eight other leaders at different points so the feedback was really rich,” Sutcliffe explains. “We turned this into a feedback report with a coaching session to discuss where they stacked up in terms of the competencies we were looking for.” The metrics produced will enable D&G to personalise further development for participants as well as tweak the content for the next cohort.

Avon

Measuring the impact of learning on hundreds of thousands of Avon sales representatives, or ‘beauty entrepreneurs’, distributed across more than 45 countries, is no small challenge. But digital experience manager Andy Stamps has been able to connect an increase in average order value, and in retention of beauty reps, to engagement with the company’s learning platform, Avon Connect. 

Reps can access learning content “in the flow of life”, according to Stamps, as many of them will be juggling sales with other aspects of their lives, such as parenting. As well as digital learning content about products, they can follow a learning path through to advanced sales techniques and leadership training. 

Around 75 per cent of the content is user generated. Reps post videos or suggestions that gain traction with comments, likes and shares, and this social dimension increases their engagement with the company. Compared with those who don’t access the system, there’s a 20 per cent uplift in retention for those who do and a 6 per cent increase in order value. 

Steve Dineen, founder of Fuse, which provides the platform, says “creating the habit” that keeps reps coming back to discover new content has the greatest impact on performance – with a 10-fold difference in performance figures over six months. A further key metric has been to ask the reps themselves how the learning has affected their relationship with the company and, crucially, their sales. A recent survey of reps in South Africa found 73 per cent feel closer to the brand and 75 per cent had seen an upturn in business as a result of using the platform. “A big motivation for reps is earning, but also that sense of belonging,” says Dineen. “If we can make them feel closer to Avon through this, we get a sense of how powerful it is.”

