How do you evaluate skill-based training?


As a former trainer and consultant for many companies, I've observed firsthand the biggest problems they tend to encounter with training performance. Namely, a lot of money and effort gets invested in training, but it's often unclear how to demonstrate the return on investment (ROI).

Companies also expect major improvements in job performance without knowing the training path that leads to this result. And some have unrealistic expectations, like looking for training to drive significant improvements in their bottom line without aligning their training plan with their business goals in the first place.

So, where do we start? Is there a secret recipe to knowing if your corporate training is working?

Not exactly. A good trainer will tell you that business goals translated into training objectives lead to improved results — whether that's greater employee satisfaction, behavior change or increased job performance. The metrics and processes are unique to each company. But without the right objectives and a way to measure them, evaluating corporate learning programs is pure guesswork.

The solution? Aided by people, process and technology, you can set and track training objectives and measure results.

Here are four questions to ask (and ways to answer them) to evaluate if your corporate training is working:

1. Does the training program cover what employees need to know?

Ensuring that your training program covers each learning objective and key skill may seem like an obvious step — but trust me, it's very easy to leave important competencies out. This oversight can stem from different factors, including a faulty needs assessment process and pressure on instructors to prioritize certain skills over others due to time and money constraints.

An important trend toward "competency-based learning" (emphasizing the demonstration of concrete and measurable skills) can help drive balance. Companies can add their curriculums to their learning platforms and then tag each resource, module or assessment with a certain competency that it should cover.

For instance, if an employee needs to handle difficult customer support requests (the main learning objective), they must know how to respond patiently (a key competency). The training modules that teach learners how to handle such requests are then tagged with the corresponding competency.

That way, it's easy for companies to see whether all competencies related to the training program are covered and to identify gaps in training.
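
To make the coverage check concrete, here's a minimal sketch in Python of how competency tagging and gap detection might work. The competency names, module titles, and data structure are hypothetical illustrations, not the schema of any particular learning platform.

```python
# A minimal sketch of competency tagging and gap detection.
# The competencies, module titles, and data layout are hypothetical,
# not the schema of any specific learning platform.

# Competencies the training program is supposed to cover
required_competencies = {
    "patient communication",
    "product troubleshooting",
    "escalation handling",
}

# Each resource, module, or assessment is tagged with the competencies it addresses
module_tags = {
    "Module 1: Calming upset customers": {"patient communication"},
    "Module 2: Diagnosing common issues": {"product troubleshooting"},
    "Quiz 1: Difficult support requests": {"patient communication"},
}

covered = set().union(*module_tags.values())
gaps = required_competencies - covered

print("Covered competencies:", sorted(covered))
print("Training gaps:", sorted(gaps))  # -> ['escalation handling']
```

In a real learning platform, the tags would come from the platform's own reporting rather than a hand-written dictionary, but the coverage logic stays the same.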


2. Is training performance up to par?

Certain metrics, such as the number of learners who completed an online course and their assessment scores, apply to any type of training program. These are the first indicators you should look at, since low participation rates are a predictor of subpar training performance.

Combine that with a competency-based approach to see if your training covers all bases, and evaluate training performance based on how well employees have mastered key skills.

For instance, say a learner completes a module and scores above 80% on the related quiz. Since both the module and the quiz are tagged with the same competency, the instructor can easily track the learner's progress and see that they're on track.

On the other hand, if the learner scores below 50% (or another designated level), the instructor can intervene with additional learning material or activities. Instructors can also see an individual's or team's overall performance based on competencies, and decide whether the training program needs adjustment.
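
As an illustration of this kind of threshold-based tracking, here's a small Python sketch. The learner names and scores are made up, and the 80% and 50% cut-offs simply mirror the example above; a real platform would pull these figures from its own reporting.

```python
# A minimal sketch of threshold-based progress tracking, assuming quiz scores
# are available per learner and per competency. The learners and scores are
# hypothetical; the 80% / 50% thresholds mirror the example above.

ON_TRACK = 0.80
NEEDS_INTERVENTION = 0.50

quiz_scores = {
    "Ana":   {"patient communication": 0.86},
    "Bruno": {"patient communication": 0.42},
}

for learner, scores in quiz_scores.items():
    for competency, score in scores.items():
        if score >= ON_TRACK:
            status = "on track"
        elif score < NEEDS_INTERVENTION:
            status = "intervene with extra material or activities"
        else:
            status = "monitor"
        print(f"{learner} - {competency}: {score:.0%} ({status})")
```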


3. Is employee feedback positive and accurate?

This seems like an obvious question, but bear with me. At many companies, the problem isn't necessarily a lack of employee feedback on training programs; it's that companies don't ask the right questions.

For a long time, I used reactive surveys in my training programs, giving participants a questionnaire to fill out after a training session. These short feedback forms are supposed to measure learner satisfaction, but what they do, in reality, is gather impressions: How well did it go? What were a few key takeaways and suggestions for improving future sessions?

I'm not saying the subjective opinions of learners don't matter, but they tend to be superficial, since people usually want to move on to something else once a learning activity is over.

And while learners may remember a few key session takeaways, it doesn't mean they will apply them in the future. So, measuring what learners actually know through surveys sent some time after the training can help you see the bigger picture.

Follow-up quizzes are also very useful for measuring behavior change. For instance, you can send a follow-up quiz a month after the course has ended, asking employees how much of the new knowledge and skills they've used during this time and whether they still remember the important parts. If the feedback is still positive and learners feel confident about applying new skills, you'll know that your training program is doing its job.
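
Here's a rough sketch, in Python, of how you might compare end-of-course scores with a follow-up quiz a month later to get a simple retention signal. The names, scores, and the 80% retention threshold are illustrative assumptions, not a standard benchmark.

```python
# A minimal sketch comparing end-of-course quiz scores with a follow-up quiz
# sent a month later, as a rough retention signal. Names, scores, and the 80%
# retention threshold are illustrative assumptions, not a standard benchmark.

end_of_course = {"Ana": 0.88, "Bruno": 0.75, "Carla": 0.91}
one_month_later = {"Ana": 0.84, "Bruno": 0.52, "Carla": 0.89}

for learner, initial in end_of_course.items():
    follow_up = one_month_later.get(learner)
    if follow_up is None:
        print(f"{learner}: no follow-up response yet")
        continue
    retention = follow_up / initial
    note = "OK" if retention >= 0.80 else "knowledge fading, consider a refresher"
    print(f"{learner}: retained {retention:.0%} of the end-of-course score ({note})")
```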


4. Is employee engagement with training soaring?

If we look beyond measurable results, there are other, secondary benefits of training. Employees become more confident in their abilities, more satisfied with their jobs, and view the company more positively.

They're also less likely to suffer from work-related stress, since they have the resources to do their jobs well. Another key indicator is that they simply enjoy their learning journey and don't consider it an onerous task.

Companies can gauge how much learners enjoy their training based on engagement, participation rates, platform activity and even their interactions with other learners. Do they participate in forums and groups? Are they willing to leave course reviews without being asked?

Positive learner behavior during and after training means that they're finding it useful — so it's a good indicator that training is working.
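
If you want to roll several of these signals into a single indicator, a simple weighted blend is one option. The sketch below is only illustrative: the choice of signals, the caps, and the weights are assumptions you would tune to your own platform and goals.

```python
# A minimal sketch that blends a few engagement signals into one indicator.
# The signals, caps, and weights are assumptions to tune, not a standard formula.

def engagement_score(completion_rate, logins_per_week, forum_posts, reviews_left):
    """Weighted blend of participation and voluntary activity, roughly 0 to 1."""
    return round(
        0.5 * completion_rate
        + 0.2 * min(logins_per_week / 5, 1.0)  # cap the login signal at 5 per week
        + 0.2 * min(forum_posts / 10, 1.0)     # cap the forum signal at 10 posts
        + 0.1 * min(reviews_left / 2, 1.0),    # cap the review signal at 2 reviews
        2,
    )

print(engagement_score(completion_rate=0.9, logins_per_week=4,
                       forum_posts=6, reviews_left=1))  # 0.78
```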


Like many business processes today, training is also data-driven. You can use training analytics to assess your program's efficacy.

Moreover, instructors don't have to become data analysts to know what works, why it works and how to improve employee training. Even managers who are not trained instructors can understand if the program covers all the skills employees need, find skill gaps and determine the effect of learning interventions on employee engagement.

The main takeaway is that knowing if your corporate training is working or not saves you a lot of hassle, time and money in the long run — giving you confidence to tweak your training strategy, discard programs that aren't working and create even better ones.

How do you evaluate skill-based training?

Employees’ professional profiles are primarily the sum total of their knowledge, skills, and abilities – often abbreviated simply to KSA. But how do you measure and evaluate your employees’ knowledge, skills, and abilities in practice? And what tools are available to help? Here’s how to do it!

KSA explained

The abbreviation KSA stands for knowledge, skills, and abilities. They form a major part of an individual’s personal and professional profiles.

Knowledge

Knowledge is primarily theoretical in nature. If you’re knowledgeable about a certain subject, then you’ve acquired a lot of facts and mastered the concepts and theories underlying the topic in question. We usually gain knowledge from information sources such as books, journals, the internet, or traditional classroom-style courses and lectures.

Skills

Skills are more ‘practical’ in nature than knowledge. They’re rooted in knowledge, but are generally acquired through training courses and work experience. A skill is the ability to perform a certain task or role competently and relates to the application of knowledge in a particular situation or context.

Abilities

Abilities are very similar to skills in many respects. However, there are important differences. An ability is broader – a combination of knowledge, skills, attitudes, and other personal traits.

About KSA

Together, the sum total of knowledge, skills, and abilities defines a role or job title. Does a candidate or employee have the right KSA combination for a specific opening? Using the KSA model, you can see quickly and clearly whether the right person is in the right role or job.

The US government still regularly uses the KSA concept, especially at the federal level, to recruit suitable staff, scoring candidates on a scale from 0 to 100. A score of 70 is generally the minimum requirement to be eligible for a job opening or role.

Nowadays, the model is primarily used to map and analyze the success of, and the need for, a particular training program. In other words, it’s a useful tool for identifying potential skills gaps and finding concrete solutions.

How to measure knowledge, skills, and abilities

You can evaluate each of the three KSA components, provided you have the right tools and adopt the right methods. So let’s look at exactly how to evaluate each of them.

Evaluating knowledge

Knowledge is a partially abstract and somewhat fluid term. If you think of knowledge as a weighty tome full of theory, facts, and figures, then it’s highly unlikely you’ll remember every tiny piece of information.

Nonetheless, you can evaluate someone’s knowledge levels in several different ways:

  • Certification. Certification is ‘proof’ of competence and shows that someone has mastered both the theoretical and practical basics required for a certain role, task, or job. Examples include CERT and SCC certification.
  • Qualifications also demonstrate a degree of acquired knowledge, for example a bachelor’s or master’s degree in a certain subject area.
  • Workshops are a great way to evaluate knowledge levels and test to what extent an individual can apply theoretical knowledge in practice.

Evaluating skills

Skills are more ‘practical’ in nature than knowledge and are evaluated differently. For example, you can review samples of someone’s previous work. This will give you a good indication of an employee’s skill levels, the way they apply their knowledge, and their attitude to their work.

By examining results over an extended period, you also avoid evaluating an employee based solely on a one-off snapshot. This is a common problem associated with traditional evaluation systems such as performance reviews, questionnaires, and standard tests.

Practical tests are also a good method for assessing skill levels. They demonstrate how an individual applies their knowledge and experience to solve practical problems and challenges.

Besides assessing previous work and administering practical tests, skills matrices are another useful option. A skills matrix is exactly what its name suggests – a snapshot of all your employees’ skills and qualifications laid out in matrix form. A well-laid-out matrix allows you to determine knowledge levels, proficiency levels, and valid certifications at a single glance.
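
As a minimal illustration, here's how a small skills matrix could be represented and scanned for gaps in Python. The employees, skills, and 0 to 3 proficiency scale are hypothetical, and a dedicated tool or even a spreadsheet would of course present this more conveniently.

```python
# A minimal sketch of a skills matrix: employees as rows, skills as columns,
# with hypothetical proficiency levels (0 = none, 3 = expert).

employees = ["Ana", "Bruno", "Carla"]
skills = ["Forklift licence", "First aid", "Welding"]

matrix = {
    "Ana":   {"Forklift licence": 3, "First aid": 1, "Welding": 0},
    "Bruno": {"Forklift licence": 0, "First aid": 1, "Welding": 3},
    "Carla": {"Forklift licence": 2, "First aid": 0, "Welding": 1},
}

# Print the matrix for an at-a-glance overview
print(f"{'':<8}" + "".join(f"{skill:<18}" for skill in skills))
for employee in employees:
    print(f"{employee:<8}" + "".join(f"{matrix[employee][skill]:<18}" for skill in skills))

# Flag skills where nobody has reached at least level 2
gaps = [s for s in skills if max(matrix[e][s] for e in employees) < 2]
print("Skills without a proficient employee:", gaps)  # -> ['First aid']
```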

Read more about how to create a skills matrix and what the benefits of maintaining such a system could be to your organization.

Evaluating abilities

Abilities require a combination of knowledge and skills, but also a third component – certain character traits. For example, analytical problem solvers can pinpoint the essence of a problem, draw logical conclusions, and make a sound analysis. But a certain degree of inquisitiveness (a character trait) is also an important piece of the puzzle.

You can evaluate abilities in one of several ways. For example, you could rely on managers’ observations over an extended period. What have they observed and how do employees respond to feedback?

Alternatively, you can evaluate employees’ abilities based on examples they present themselves. A useful tool for this type of evaluation is the STAR method. Adopting this approach, an employee would present answers to the following questions:

  • What is the situation?
  • What task did I have to perform, i.e. what was expected of me?
  • What action did I take in this particular situation?
  • And what was the result of my action?

Other options include validated personality questionnaires, interviews, or 360° feedback. This last option involves asking several people who interact directly with the employee in question to fill out a questionnaire. The broader the cross-section of people involved, the better: for example, a close co-worker, a manager, a client, and a business partner.

The questionnaire highlights several relevant behavioral indicators, such as punctuality, decisiveness, or collegiality – each of which relates to a particular ability. An average is then calculated from all the feedback gathered. In this way, you can gain a more objective view of the extent to which the employee in question has acquired and mastered certain abilities.
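
To show what that averaging step could look like, here's a short Python sketch. The raters, behavioral indicators, and 1-to-5 scale are assumptions for illustration; real 360° tools aggregate feedback in their own way.

```python
# A minimal sketch of averaging 360° feedback per behavioral indicator.
# The raters, indicators, and 1-to-5 scale are hypothetical assumptions.

from statistics import mean

feedback = {  # rater -> indicator -> score (1 = poor, 5 = excellent)
    "co-worker":        {"punctuality": 4, "decisiveness": 3, "collegiality": 5},
    "manager":          {"punctuality": 5, "decisiveness": 4, "collegiality": 4},
    "client":           {"punctuality": 4, "decisiveness": 4, "collegiality": 5},
    "business partner": {"punctuality": 3, "decisiveness": 5, "collegiality": 4},
}

indicators = {ind for scores in feedback.values() for ind in scores}
for indicator in sorted(indicators):
    average = mean(ratings[indicator] for ratings in feedback.values())
    print(f"{indicator}: {average:.1f} / 5")
```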

Skills management software

To put this into practice, you first need to be able to pull up a complete overview of all your employees’ knowledge, skills, and abilities quickly and accurately. It’s best to organize this information in a system that isn’t too susceptible to human error. A popular solution is to use a spreadsheet such as Excel, but this is anything but foolproof.

Fortunately, alternatives to Excel exist. A prime example is AG5’s skills management software. Armed with such a tool, you can access all the vital information about your employees’ knowledge, skills, and abilities – anytime, anywhere. What’s more, all this information is stored centrally in the cloud.

For example, AG5’s software allows you to:

  • enter updates and training results in real time from the shop floor
  • link projects to specific expertise and experience
  • set notifications for employees, groups, or qualifications
  • find the best replacements for employees off sick
  • search for the most qualified employee to retool a production line
  • replicate organizational structures and link employees to qualifications using drag ’n’ drop menus

Curious about the benefits that AG5’s software could provide your KSA efforts? Feel free to get in touch or schedule a live demo.

Written by Rick van Echtelt, CEO at AG5. You'll often spot him out on the soccer field coaching talented young soccer players.