The Complete Guide to Learning Analytics in 2026
- Learning analytics exists on four levels — most teams only measure the first two
- The xAPI standard allows learning data to travel between systems and be correlated with performance data
- Behavioral signals (time on task, skip rates, reflection quality) are better predictors of learning than assessment scores
- Privacy-by-design matters: people learn differently when they know they're being watched
Learning analytics is one of the most talked-about capabilities in L&D and one of the most poorly understood. Most teams think they're doing learning analytics because they can pull a completion rate from their LMS. They're not — they're counting clicks. Real learning analytics connects what people are learning to how they're performing, and uses that connection to continuously improve programs.
Level One: Activity Metrics
These are the metrics most LMS platforms surface by default. Enrollment counts, completion rates, time spent, assessment scores. They're easy to collect and easy to report — which is why they're so common. They're also largely uninformative about learning outcomes.
A 95% completion rate tells you that 95% of enrolled learners clicked through to the end. It tells you nothing about retention, application, or behavioral change.
Level Two: Learning Metrics
These measure actual knowledge acquisition — the gap between what learners knew before and what they know after. Pre/post assessments, skill self-ratings before and after a program, and performance on simulations or case studies all sit at this level.
Level two metrics are better, but they still measure knowledge acquisition rather than application. Someone can know how to give effective feedback immediately after a workshop and still give the same kind of feedback they always did.
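One common way to put a number on that pre/post gap is Hake's normalized gain, which scores improvement relative to how much room a learner had left, so a high scorer's small raw gain isn't undervalued. A minimal sketch in Python (the scores and 100-point scale are illustrative):

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake's normalized gain: the fraction of available headroom
    (max_score - pre) that the learner actually gained."""
    if max_score <= pre:
        return 0.0  # no headroom left to measure
    return (post - pre) / (max_score - pre)

# Two learners with the same +10-point raw gain, very different gains
# relative to what they had left to learn:
print(round(normalized_gain(50, 60), 2))  # 0.2  (10 of 50 available points)
print(round(normalized_gain(85, 95), 2))  # 0.67 (10 of 15 available points)
```

The point of normalizing is comparability: raw score deltas systematically flatter learners who started low.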
Level Three: Behavioral Metrics
This is where learning analytics starts earning its name. Behavioral metrics measure whether learning has changed what people actually do. They require getting outside the LMS: 360-degree feedback, manager assessments, performance review data, operational metrics tied to the skills being developed.
For a program on customer empathy: do learners' customer satisfaction scores improve after the program? For a manager development program: do their direct reports' engagement scores change?
Level Four: Impact Metrics
The highest level correlates learning programs with business results at an organizational level: does higher learning engagement correlate with lower attrition? Do teams with higher average skill scores outperform on revenue targets?
This level of analysis requires joining learning data with HR and business data — technically and organizationally challenging, but increasingly feasible with modern data infrastructure.
Building Your Analytics Stack
The technical foundation is data portability. If your learning data is locked inside an LMS and can't be joined with your HRIS or performance data, you're limited to level one analytics no matter how sophisticated your analysis.
xAPI (the Experience API) is the open standard that lets learning events from any system be recorded as uniform statements in a shared data store (a Learning Record Store, or LRS) and correlated with data from other systems.
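Every xAPI event is a statement built from the same actor / verb / object grammar, which is what makes records from different systems joinable. A minimal sketch (the learner, course ID, and score below are hypothetical; real statements are POSTed to an LRS over HTTP with an `X-Experience-API-Version` header):

```python
import json

# A minimal xAPI statement: actor, verb, object, result.
# IDs below are illustrative; verb IRIs should come from a published
# vocabulary, such as ADL's (http://adlnet.gov/expapi/verbs/).
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://lms.example.com/courses/feedback-101",
        "definition": {"name": {"en-US": "Giving Effective Feedback"}},
    },
    "result": {"score": {"scaled": 0.85}, "completion": True},
}

# An LRS receives this as JSON; here we just serialize it.
print(json.dumps(statement, indent=2))
```

Because every source system speaks the same grammar, a workshop completion, a simulation attempt, and a coaching session all land in the LRS as comparable rows rather than locked-in vendor formats.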
A Note on Privacy
Learning analytics done poorly creates a surveillance problem. If learners know every click is tracked and reported to their manager, they'll engage defensively — skipping content they find challenging, rushing to completion. The data becomes polluted by the observation.
Design learning analytics with privacy as a constraint. Individual-level data should serve the learner first: show them their own progress, their skill growth, what their engagement patterns suggest about how they learn best. Aggregate data serves the L&D function. The data flows up in aggregate — it doesn't flow down as surveillance.
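One simple way to enforce "up in aggregate, never down as surveillance" is a minimum group size: any roll-up covering fewer than some threshold of learners is suppressed rather than reported, so no aggregate can expose an individual. A sketch, with illustrative team names, scores, and a threshold of five:

```python
from collections import defaultdict

MIN_GROUP_SIZE = 5  # suppress aggregates covering fewer than 5 learners

def aggregate_by_team(records, min_n=MIN_GROUP_SIZE):
    """Roll individual skill scores up to team averages, suppressing
    teams too small to hide any single learner's data."""
    groups = defaultdict(list)
    for team, score in records:
        groups[team].append(score)
    return {
        team: round(sum(scores) / len(scores), 1)
        for team, scores in groups.items()
        if len(scores) >= min_n  # small teams are dropped, not reported
    }

records = [("support", s) for s in (72, 80, 65, 90, 77)] + \
          [("eng", 88), ("eng", 91)]
print(aggregate_by_team(records))  # {'support': 76.8} — 'eng' (n=2) suppressed
```

The threshold is a policy choice, not a magic number; the design point is that suppression happens in the pipeline, before any report reaches a manager.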