
Your 5 minute guide to the perfect learner survey

Learner surveys are one of the most popular techniques used to evaluate elearning.

They can play a useful role in measuring your training as part of a wider evaluation strategy.

But too often, surveys focus on the wrong things, or miss opportunities to gather more valuable learner feedback.

We’ve compiled this quick guide to getting the most out of learner surveys.

There’s even a list of sample questions to help make your life easier.

Why run learner surveys?

Understanding elearning from the point of view of learners is essential for a number of reasons.

On a basic level, you need to know their reaction to the experience (known as Level 1 in the Kirkpatrick model).

Finding out whether the audience found the elearning user-friendly, relevant and engaging gives you useful metrics to add to the evaluation mix.

These indicators can also help you to create better learning experiences in the future.

Beyond learner reaction, surveys can contribute to the job of measuring the impact and value of the learning.

They can help you assess what people have actually learned, particularly if combined with pre-elearning surveys and other assessments.

While surveys won’t provide an objective measure of behaviour, done correctly, they can examine behavioural impact from the learners’ angle.

Asking questions about your employees’ intentions, confidence and commitment to implement what they have learned will allow you to measure their perspective against the facts later on.

The Kirkpatrick model

It’s the world’s best-known method of evaluating the effectiveness of training. Dr Donald Kirkpatrick, Professor Emeritus at the University of Wisconsin, pioneered the model in the 1950s. It uses four levels to assess the value of training to an organisation.

  • Level 1: Reaction

Examines the extent to which learners found the training favourable, engaging and relevant to their jobs.

  • Level 2: Learning

Covers the extent to which learners improve their knowledge, skills, capability or confidence following the training.

  • Level 3: Behaviour

Gauges the degree to which learners changed behaviour and applied what they learned during training to their job.

  • Level 4: Results

Looks at the impact the training has on the business in terms of tangible results, or Key Performance Indicators (KPIs).

The model has since been updated, as the New World Kirkpatrick Model, to include extra explanations and definitions at each level.

If you plan carefully, as part of a wider evaluation strategy, you can make your learner survey work harder and smarter.

What to include

Each elearning project will have specific learning objectives, so the focus for each learner survey may be slightly different. However, there are some guiding principles that will help you target the most valuable insights. Here are some things to include:

  • User experience

Increasingly, people are judging elearning by the same standards as other digital content, such as websites. Broadly speaking, this is user experience (UX): what it feels like to use the elearning.

Choose some initial questions that will help assess the learners’ experience in terms of usability, accessibility and expectation.

  • Value to the learner

Try to get to the heart of the value and impact the course will deliver for learners.

Was it useful? Was it worth the time they spent on it? How will it help them in their job? Do they think their knowledge, skills or capability has increased?

Focus on three key areas from the learners’ perspective: knowledge, confidence and perception.

Marc Rosenberg points out that by doing this you also create a useful baseline of results that can be followed up with individuals later to feed into Level 3 behavioural evaluation data.

  • Comment boxes

Where appropriate, include space for people to explain the score, rating or option they have chosen. These comments can be particularly helpful for shedding light on a trend in the data, or for gathering individual stories and examples of impact.

  • Pre and post-elearning comparison

If possible, survey learners before and after the elearning course. This will allow you to compare results on learner perception of knowledge, capability and confidence (see the first sketch after this list). It is also useful in helping to guide course design and identify areas where previous elearning had failed to deliver.

  • Recommendation

This may seem like an afterthought question, but it can reveal a lot. Used in conjunction with other question metrics, it can help to build a picture of the overall learner perception of the course (a scoring sketch follows this list).
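
To illustrate the pre- and post-course comparison above, here is a minimal sketch in Python. The questions, the 1–5 scale and the scores are hypothetical examples, not a prescribed format; it simply averages paired ratings and reports the shift per question.

```python
# Minimal sketch: compare average self-rated scores before and after a
# course. The questions, the 1-5 scale and the scores are hypothetical.

pre_scores = {
    "I am confident applying this skill in my job": [2, 3, 2, 4, 3],
    "I understand the key concepts": [3, 3, 2, 3, 4],
}
post_scores = {
    "I am confident applying this skill in my job": [4, 4, 3, 5, 4],
    "I understand the key concepts": [4, 5, 4, 4, 5],
}

def mean(values):
    return sum(values) / len(values)

for question, before in pre_scores.items():
    after = post_scores[question]
    print(f"{question}: {mean(before):.1f} -> {mean(after):.1f} "
          f"(shift {mean(after) - mean(before):+.1f})")
```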
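
For the recommendation question, one common (though not the only) way to score it is a Net Promoter-style calculation: ratings of 9–10 on a 0–10 scale count as promoters, 0–6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch with hypothetical ratings:

```python
# Net Promoter-style scoring for a 0-10 "would you recommend?" question.
# The ratings below are hypothetical examples.
ratings = [9, 7, 10, 6, 8, 9, 4, 10, 8, 9]

promoters = sum(1 for r in ratings if r >= 9)   # ratings 9-10
detractors = sum(1 for r in ratings if r <= 6)  # ratings 0-6

score = 100 * (promoters - detractors) / len(ratings)
print(f"Net Promoter-style score: {score:+.0f}")  # -100 to +100; here +30
```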

What to avoid

  • Design questions

It is tempting to include questions about things like visual design, particularly if you are close to the creative process. But asking learners whether they liked the images in your course will do little to deepen your understanding of its impact.

  • Keep it short

If your survey is too long, learners may not take the time to complete it. There is no perfect length, but try to limit it to 10-15 questions.

  • Single thought

If you are trying to keep the number of questions down, it is tempting to pack multiple elements into a single question. The danger is that your questions will become confusing. The rule of thumb is to stick to a single thought or topic per question.

You can download a learner survey example to give you inspiration for your next elearning evaluation survey. It’s a good place to start if you want to refocus the way you approach surveying.

Evaluation strategy

A learner survey is just one evaluation tool. If you really want to assess the business impact of your elearning programme, you will need to identify and use other methods of analysis.

Techniques such as learner observation, behaviour measurements, return on investment (ROI) calculations and monitoring of KPIs can all feed into the story.
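
As a reference point for the ROI technique, the standard training ROI formula expresses net programme benefits as a percentage of programme costs. A minimal sketch with hypothetical figures:

```python
# Standard training ROI formula: net programme benefits as a percentage
# of programme costs. Both figures below are hypothetical examples.
programme_benefits = 120_000  # e.g. estimated value of performance gains
programme_costs = 80_000      # e.g. development, delivery and learner time

roi_percent = 100 * (programme_benefits - programme_costs) / programme_costs
print(f"Training ROI: {roi_percent:.0f}%")  # 50% with these figures
```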

So don’t stop with a learner survey: build a whole evaluation strategy that runs as an ongoing process rather than a one-off event.