July 2024

Applying Analytics to Learning Evaluation: What We Have Learned

By Adam Hayden

Summary

Anthony Randolph, PhD, explored how HR professionals use data analytics in training evaluation. He identified three key factors influencing their adoption: performance expectancy, social influence from internal champions, and enabling organizational resources. He also described four informal roles that HR professionals take on to utilize analytics effectively: system administrator, frontline analyst, objective data analyst, and impact tracker. His findings highlight the importance of cross-functional collaboration and early evaluation planning in successful learning evaluation programs.

In this series of posts (parts one and two), we have been summarizing the key insights from our Learning Evaluation subject matter expert, Anthony Randolph, PhD, through his dissertation, Human Resources Department: Professionals’ Experience Utilizing Data Analytics in the Training Evaluation Process. In this final post, we share Anthony’s answers to the research questions and apply the theoretical work to everyday practice in Learning and Human Resources teams.

Theory Meets Praxis: Answering the Research Questions

When he scoped the study, Anthony asked two research questions. First, what factors influenced HR professionals to use HR analytics in the training evaluation process?

Anthony found three factors, rooted in the Unified Theory of Acceptance and Use of Technology (UTAUT). The first was performance expectancy: the individual and organizational expectation that a tool would be useful in carrying out training evaluation. This expectation shaped the evaluation team’s pursuit of judging training effectiveness. In other words, perceiving that a tool would help them achieve their goals led the organization to adopt a new technology solution, Metrics that Matter (MTM), a software analytics tool that integrated with the HRIS (Human Resources Information System).

The second factor that influenced the HR professionals’ use of analytics in learning evaluation was social influence, driven by an internal “influencer” on the Organization Learning and Development (OL&D) team. This role was defined as someone with analytics expertise who championed the project and program.

Enabling conditions, the internal resources the organization had set up to support the use of analytics, served as the third factor influencing the HR professionals to adopt the new technology and pursue the study’s aim of improving learning evaluation.

In short, the answer to the first research question is this: HR professionals are motivated to use HR analytics when they perceive they are using the right tool, when internal champions advocate for that tool or technology, and when proper organizational resources are in place to support them.

Next, Anthony asked, how do HR professionals utilize HR analytics in the training evaluation process? Here, a Sociomateriality perspective on technology enabled a more robust investigation of the HRD practitioners’ work. Feel free to revisit the theoretical grounding of the project in part two of the series.

Anthony found four informal roles that described how HR professionals were using analytics. We note that these roles were given to existing members of the team rather than filled through new hires (though hiring for them could be a worthwhile direction); they aligned with the natural strengths and interests of the in-house team. The following informal roles were assigned:

  1. A defined system administrator worked directly with MTM vendor consultants.
  2. A frontline analyst ensured that the evaluation survey questions met the assessment needs of a program initiative, reinforcing the standard ADDIE (Analysis, Design, Development, Implementation, Evaluation) model with a particular focus on the first step, Analysis.
  3. An objective data analyst interpreted the data in as close to a bias-free manner as possible.
  4. An impact tracker, a designated member of the OL&D team, served as the point person for tracking the impact of the training over time.

These Sociomaterial, team-based, social-behavioral roles complemented the individual user perceptions defined by UTAUT that shaped the earlier part of this post.

Study Conclusions

Anthony’s study revealed three factors that influenced HR professionals to use analytics in learning evaluation: performance expectancy, social influence, and enabling conditions. The results also revealed four informal roles that describe how HR professionals use analytics in the learning evaluation process: system administrator, frontline analyst, objective data analyst, and impact tracker, an individual dedicated to tracking the impact of the training.

This research can serve as a useful reference for organizations and HRD professionals looking to apply data analytics in their learning evaluation process. Though Anthony’s study with the OL&D staff selected the MTM program as its technology solution, these learnings are technology-agnostic and may be applied in a variety of learning evaluation settings.

An important organizational practice revealed by this study was the interconnection between HRD practitioners and internal analytics teams. This gives good reason to connect cross-functional teams, bringing HR together with more data-centric IT or analytics teams. A further implication of the study was the value of an evaluation plan in training initiatives, established early in the discovery, needs assessment, or analysis stage. Evaluation should be center stage even before training design begins.

Application to Client Work

When we consider applying these lessons to client work in the sphere of learning evaluation, we find that individual users’ feelings about an analytics tool, perceived ease of use, internal influencers, organizational support, and defined informal team roles for discrete tasks are the ingredients for success.

Considering evaluation at the very beginning of the project lifecycle is also critical to evaluating learning effectiveness. Building cross-functional relationships between internal departments or multi-vendor teams in modular environments is another key component of a successful learning evaluation program.

Anthony stresses the value of peer relationships in the completion of his study: “The most rewarding part of pursuing this doctorate was being around other scholars, learning about the global issues they were trying to solve through their research, the classroom discussions on possible solutions, and avenues of research around those issues.”

Despite the popularity of analytics, HR professionals face a research gap in the practical application of analytics to evaluating learning effectiveness. We extend our sincere thanks to Dr. Randolph, whose research helps bridge that gap.

Future Research Directions

There are several tools on the market that could assist HR professionals in applying data analytics to the evaluation process. First, future research could investigate whether a different technology would produce similar findings about HRD practitioners’ processes and practices. Second, expanding the investigation to a wider range of data analytics technologies could address the limited generalizability that comes with studying a single tool.

What lesson from Anthony’s research will influence your pursuit of more sophisticated learning analytics? Think it through, tell us in the comments, or send us a note about your analytics wins and needs!
