Get the measure: Learning analytics

In my previous blogs I have talked about engagement and how to create high-quality learning experiences. It is essential to keep the learner's perspective in mind when designing a meaningful learning experience, but can we measure how well it is working?

What is it exactly?

Taking your organisational learning practices above and beyond is a guaranteed challenge. There are many different factors to consider, one of which is the use of learning analytics and assessment. Its relevance as a source of quality control is increasingly recognised across data-driven organisations. In effect, learning analytics are used to gauge whether the ends justify the means and to provide you with the insight to improve your learning programmes.

The question to explore is whether learning analytics can have a positive effect on learning. I have done some digging for you.
More than meets the eye

The strategic relevance of learning is mostly tied to financial cost. What initial investment is needed to set up learning interventions, how much is the upkeep, and how much time passes between training and productivity? Whilst these financial indicators justify spending, they leave us in the dark regarding learning impact.

A better impression of learning impact requires you to look at learner engagement, transfer and satisfaction. In other words, is what is learned also applied in practice? The answer may well be no, and that can be diagnosed much earlier. For instance, when there is no intention or perceived relevance to apply what is learned, transfer is always going to be low.
It's the context, stupid!

One critique of learning analytics is that they dumb the entire learning process down to a few variables. Other than financial metrics, the most common items are tied to the learning experience platform: completion times, number of log-ins, and online interaction are typical examples.

While these sorts of metrics are certainly useful for justifying the use of a digital platform, they fail to take into account the specific conditions of the learner. What is their motivation? Do their managers endorse learning? Has the right instructional method been chosen to support learning?

The problem with omitting specific learning metrics is that it can lead to surrogation: confusing what is actually intended with what is measured. For example, learners might chase the fastest completion time instead of actually learning what compliance entails.
Let's recap

To come back to the original question: how are learning analytics helpful in the approaches you are creating? Learning analytics can certainly have a positive effect on learning, but only when you design for them properly and make them actionable.

It is necessary to have financial metrics in place to justify spending. Without specific learning metrics, though, these do very little to create high-quality learning experiences or opportunities for improvement.

I encourage the use of learning analytics, especially since they can provide you with real-time information about learning progress. This allows you to grasp much more quickly whether the learning content is the right fit or needs tweaking. To measure is to make things efficient.

Need help? Get in touch!

Do you need help setting up your learning analytics effectively? Get in touch. We would love to hear from you.