By Kirstie Greany, Senior Learning Consultant | February 2019
Your ultimate guide
Evaluating elearning isn’t complicated or scary, and it doesn’t require long processes like the Kirkpatrick model. There are many easier and more modern ways to measure success and show the return on investment for your online learning strategy to business leaders.
This guide shows you what learning measures you should really focus on to demonstrate the biggest elearning ROI, and how to use learning analytics to increase impact and tell a simple story of success.
Learning is now the #1 reason people want to join an organization. Employees are prioritizing their professional development over other job perks, such as salary, and actively seizing opportunities to learn and improve their performance. From hunting down answers on Google, YouTube and social networks to taking modules on Lynda.com, Coursera and similar platforms, modern employees are freely accessing online learning content in seconds – all with the swipe of a finger.
Everyone wants workplace learning, and you produce workplace learning. Win-win, right? Wrong. Despite all that’s created, 85% of employees struggle to access learning directly related to their job. Learning value misfire.
Despite a huge appetite to learn and develop, employees are struggling to find what they need and businesses are struggling to keep hold of top talent.
94% of employees would stay at a company if it invested in their career development.
Yet according to Towards Maturity, just 15% of employees can access learning directly related to their job.
And, according to LinkedIn Learning, businesses are fighting to stay ahead of the curve, trying to hold onto their best talent and struggling to fill key positions.
Training organizations and L&D departments need to prove their value
There’s something about workplace learning offerings that does not demonstrate value to end users. Perhaps it’s because it can’t be found. Perhaps it’s because it’s not meeting their needs. Perhaps it’s because many aren’t using learning analytics well enough to find out.
According to Towards Maturity, over 50% of those working in L&D admit they lack the skills to measure and use data to tell a story about impact.
With top executives recognizing the importance of workplace learning to business success now and in the future, the pressure is on for learning teams to get up to speed with learning analytics, measure the impact of their learning projects – digital and otherwise – and prove their worth.
This guide will walk you through practical steps to evaluate elearning more easily and more readily, and to turn your data into a stakeholder story you can tell throughout the learning project lifecycle.
What ‘learning value’ actually means
First, let’s consider what we might mean by delivering value from learning projects – digital or otherwise. Long gone are the days of showing a workshop happy sheet, list of module completions or assessment scores and considering it a “job done.”
Learning value is multi-faceted.
Towards Maturity’s research with over 6,000 organizations finds there are three key KPI areas that matter most to business leaders:
Business impact – the ability to implement change faster, increased productivity and, ultimately, customer satisfaction
Staff impact – staff engagement, qualifications and time to competency
Efficiency – linked to cashable savings and time savings
This tallies with our own thinking around people-centered learning. Our research, based on the behaviors of millions of learners, shows that the organizations that get the best ROI and can prove high value from their online learning focus on the value for end users just as much as, or even more than, the value for the business. And they do this from the off.
When planning how to measure the success of your workplace learning, try to see value through users’ eyes first.
Employee engagement, using time wisely, being useful, and supporting tangible performance improvements are win-wins for people and businesses, after all.
Why it’s best to see value through the learner’s eyes
You can’t expect individuals to change habits or improve performance unless you offer something genuinely useful and helpful that meets their needs and expectations.
And since learning is so important to employees, it’s a key vehicle for employee engagement. Engaged employees are more productive, more innovative, and more loyal. Successful learning is the key to retention.
Successful workplace learning = employee engagement = productivity and retention
Think holistically about value, and be people-centered
This diagram shows how personal value and business value share a common ground and feed into one another.
4 steps for evaluating elearning
1. Get your goals straight
Before you begin your learning project, home in on the specific goals that will deliver the highest value to end users and business leaders. You’ll need to work with both parties to get there. It’s best to go for a small number (1-3) of high-impact measures of success, and really focus on meeting them.
2. Set up your data tracking
Once your success criteria are set and agreed upon, you can focus on collating and measuring the relevant data along the way. With a refined set of goals, you’ll likely only need to track a handful of data measures.
Look to tools with built-in data dashboards, or consider setting up your own.
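If you do set up your own, even a short script can surface the handful of measures you agreed on in step 1. Here’s a minimal sketch in Python with pandas, assuming your LMS or authoring platform can export activity records as a CSV; the column names (user_id, content_id, completed, score) are illustrative, so adjust them to match your platform’s actual export.

```python
import pandas as pd

# Assumed activity export from your LMS/authoring tool. Columns are
# illustrative: user_id, content_id, completed (0/1), score (0-100).
records = pd.read_csv("activity_export.csv")

# Headline numbers for your dashboard
summary = pd.DataFrame({
    "unique_users": [records["user_id"].nunique()],
    "completion_rate": [records["completed"].mean()],
    "avg_score": [records["score"].mean()],
})
print(summary.round(2))

# Per-content breakdown: which pieces are actually being used?
by_content = (
    records.groupby("content_id")
    .agg(users=("user_id", "nunique"), completion=("completed", "mean"))
    .sort_values("users", ascending=False)
)
print(by_content.head(10))
```

A few lines like these, run weekly, are often enough to answer “is anyone using this, and are they finishing it?” without waiting on a full dashboard build.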
3. Ensure your elearning evaluation plan covers all your digital learning
When putting together your evaluation plan, don’t forget to evaluate the impact of performance support resources, job aids, downloads, activities and any other digital learning components, as well as your “core” learning elements. You might find it’s these that are delivering more impact!
4. Get tracking!
Not sure what to track? Read on for ideas.
20 ways to measure impact when evaluating elearning
Don’t let learning analytics overwhelm you!
Data is much more readily available nowadays, but don’t feel you need to monitor every data point possible. While we share 20 learning evaluation measures here that you can track, pick just a few key ones.
Six top tips to help you choose what learning data to track:
Pick just a handful of key trends and inputs that, when combined, show the impact your elearning is having in relation to your project goals
Go for a combination of quantitative and qualitative data to get a balanced view
User feedback and user engagement data do matter. How else will you know which pieces of digital content and which learning design approaches gain your audience’s attention most?
Involve managers
Include peers
Don’t be afraid to A/B test solutions, and use data to guide your design direction
Be people-centered with your learning evaluation. Involve end users, involve managers, involve peers, use learning analytics, and don’t be afraid to A/B test solutions.
If your project aims to increase sales, drive up customer ratings, reduce errors, increase retention, improve Glassdoor reviews – or achieve whatever other goal justifies its existence – that’s what you need to track. But be holistic about it. Try:
Hard stats – benchmark the current performance of your audience groups, and work with managers to monitor improvements. Make sure there’s a way to measure this in the first place. For example, if the target is to reduce customer response time by 20% by Q4, response times need to be tracked before and after roll out (see the sketch after this list).
And people-based data like:
Self-assessment – Because we want to engage and empower people to learn and improve, right? Ask users to evaluate themselves against certain success factors, and plot where they feel they need to improve. Then, ask them to track their performance against these goals during and after they’ve taken part in your project. There are plenty of survey tools out there to help, or consider building the self-assessment into your digital content.
Manager assessment – The key to successful employee engagement and the success of your learning project is managers (CIPD). Involve them in the evidence-based evaluation of their team members (if they aren’t involved, how can they motivate and support employees to reach goals?).
Peer assessment – Get peers to evaluate one another and help provide evidence-based feedback on how a teammate is performing against their measures.
Team assessment – People-centered learning doesn’t just mean looking at individuals. Often, setting a team a goal and tracking their improvement helps boost performance. Monitor the improvement of certain teams on their hard stats (see above), and have teams self-assess their performance and peer-assess other teams’ performance in key areas too.
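To make the hard stats idea concrete: here’s a small Python sketch of a before/after check against a target like “reduce customer response time by 20% by Q4.” The file, column names and roll-out date are illustrative assumptions; in practice this data would come from your service desk or CRM export.

```python
import pandas as pd

# Illustrative data: one row per handled customer query, with the date
# and the response time in minutes.
df = pd.read_csv("response_times.csv", parse_dates=["date"])

ROLLOUT = pd.Timestamp("2019-06-01")  # assumed go-live date of the learning project

before = df.loc[df["date"] < ROLLOUT, "response_minutes"].mean()
after = df.loc[df["date"] >= ROLLOUT, "response_minutes"].mean()
improvement = (before - after) / before

print(f"Baseline: {before:.1f} min | Post-rollout: {after:.1f} min")
print(f"Improvement: {improvement:.1%} (target: 20%)")
```

The point isn’t the code; it’s that the benchmark has to exist before roll out, or there’s nothing to compare against.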
5 ways to measure how your learning program is working
To show how your initiative is adding value, you need to track the correlation between the learning you’ve invested in and your audience’s performance. This is where data dashboards are extremely helpful. Here are some things to look out for.
Track usage – Are you engaging the number of users you’d expected? (If not – see below for other measures to help you find out why). Which pieces of content are getting the most users? Use this data alongside some performance tracking to see if you can spot any trends.
Track sessions – Are users coming back to the same pieces of content time and time again? If you’ve designed performance support resources to be used on the job, this would be a key success factor to track.
Track shares – Which pieces of content are users sharing with others? Which are instigating conversations on social platforms? If one of your goals is around employee engagement or creating a social “buzz,” this helps you spot success.
Run surveys – Ask users for feedback on whether they feel the learning resources and experiences you provide are helping them meet their goals, and what else is helping them too. Drop in a simple and quick user survey at the end of each piece of content.
Spot spikes – Can you see spikes in social activity, performance improvement data or performance evaluations – self, peer or by a manager? Do they map to moments when content was launched or got a lot of hits? What else helped drive these? For example, if two out of ten pieces of content got particularly high usage and high user ratings in a given week, and the week after saw a peak increase in performance, this would add weight to your success story.
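If you hold weekly usage counts and your agreed performance measure side by side, a quick lagged correlation check shows whether those spikes really line up. A sketch in Python, again with assumed file and column names:

```python
import pandas as pd

# Assumed weekly roll-up: content hits per week alongside the
# performance metric agreed with stakeholders (e.g. customer rating).
weekly = pd.read_csv("weekly_metrics.csv")  # columns: week, content_hits, performance

# Performance usually lags learning, so compare this week's hits
# with next week's performance.
weekly["performance_next_week"] = weekly["performance"].shift(-1)
print(weekly[["content_hits", "performance_next_week"]].corr())
```

Correlation isn’t causation, of course, but a consistent lagged relationship between usage and performance is exactly the kind of evidence that strengthens your success story.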
5 measures that help you increase the value of your live learning program
If you’re spotting wider trends and want to know why a certain element of your project is or isn’t working as expected, you need to dig a little deeper. Here are some starting ideas.
Most popular pages – Which particular activities, pages or resources are getting the most and least hits? Take a look at what they have in common. What’s helping or hindering them, do you think? Make some edits, see what happens.
Responses to questions and activities – Are users responding to questions and activities as you’d expected? Are they too hard or too easy? If so, change tack.
Take-up by location – Most dashboards will show you usage stats by location. Is a target location lagging behind another? Investigate why and intervene to boost take-up. Could it be down to your comms strategy or manager engagement, or does the content need to be localized?
Devices used – If you were expecting the content to be used on the fly, on mobiles in short sessions, and most users are accessing it via desktop (for longer periods of time), consider why. Is this negatively affecting their experience (check surveys and impact on performance) or just a surprise to you? Consider if you need to realign your design in any way.
Iterations – If surveys and other data show that improvements are needed to a certain aspect of your program, roll with it. Make some edits, make them live, and monitor your data to see if you’re now hitting the mark.
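Most dashboards surface the location and device splits above for you, but if all you have is a raw access log, each breakdown is a couple of lines. A sketch with assumed log fields:

```python
import pandas as pd

# Assumed raw access log: one row per session, with illustrative fields.
log = pd.read_csv("access_log.csv")  # user_id, location, device, session_minutes

# Take-up by location: is a target region lagging behind?
print(log.groupby("location")["user_id"].nunique().sort_values())

# Devices used: quick mobile look-ups, or long desktop sessions?
print(log.groupby("device")["session_minutes"].agg(["count", "median"]))
```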
5 measures to help you design for value, from the beginning
The most successful learning designers lead with concepts they test out and amend before developing the final solution. Help align your project with value from the beginning by running some A/B testing. Digital learning gives you huge opportunities to do this, yet few teams take advantage.
You could try A/B testing (a quick significance-check sketch follows this list):
Different lengths – of videos, audio clips, resources, pages, topics, interactive challenges and so on.
Formats – what works best? See above for ideas.
Roll out times and communication strategies – experiment with different roll out times and comms methodologies to see what gets you the most user hits.
Different platforms – find out where your content performs best. Consider LMSs, learning portals, and email drops or Yammer feeds that link to standalone content.
Surveys and user groups – again, capture user feedback to gauge what your audience say they prefer and how they feel about the different options you present, then tally this with the stats from your data dashboards. While this is different from performance evaluation, you can’t improve performance if your content isn’t engaging users in the first place. So make sure it hits the mark.
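When you do A/B test, eyeballing the dashboard isn’t quite enough: a quick two-proportion z-test tells you whether a difference in, say, completion rates between two variants is likely to be real rather than noise. A minimal sketch using only the Python standard library, with made-up numbers:

```python
from math import sqrt
from statistics import NormalDist

# Illustrative results: completions out of users shown each variant.
a_done, a_users = 132, 400  # variant A, e.g. a short video
b_done, b_users = 168, 400  # variant B, e.g. an interactive challenge

p_a, p_b = a_done / a_users, b_done / b_users
p_pool = (a_done + b_done) / (a_users + b_users)  # pooled completion rate
se = sqrt(p_pool * (1 - p_pool) * (1 / a_users + 1 / b_users))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test

print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p_value:.3f}")
```

A p-value below your chosen threshold (0.05 is the usual convention) suggests the winning variant’s lead isn’t just chance, which gives you a firmer basis for your design direction.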
Turn learning analytics into an elearning success story
Measuring learning impact and making improvements to your project is only half of the story. The other half IS the story! Shift from drowning stakeholders in learning analytics to telling a captivating success story.
Don’t wait until the end to do a big reveal. Work with stakeholders along the way to create a story together, building your team’s internal brand value as you go.
Involving managers and end users right from the start, seeking feedback, listening to their views, and reporting back on progress and what you’ve done (or not) as a result of that feedback not only boosts your project’s chances of delivering value, but means your stakeholders are all ears when it comes to playing back the “final” results. And if your results aren’t all perfectly rosy, showing yourself as a learning team that’s passionate about adding real value and proactively works with internal customers, seeks feedback and makes improvements is not a bad story to tell.
Learning and design processes are both a journey of improvement – it’s okay to bring stakeholders along for the ride
Create a communications strategy to work alongside your learning strategy, so you can shout about what you’re doing, why and what you’re doing next.
Report on key stats that are directly related to the goals you set out for the project.
Include quotes and narrative stories as well as data as part of your sharing.
Most importantly, be open about improvements you will make in the future to your project or learning strategy, and talk through how you know those are the improvements that will make a difference (because you’ve used data, gone under the bonnet to take a closer look at why an element is or isn’t performing and have talked with end users via surveys or otherwise to verify this).
Let us help you create an elearning success story with the help of learning analytics, from the start.
Book a demo of Elucidat to see our analytics dashboards, and find out how our professional services team can help you deliver real-life impact with your elearning.