As you read this post, remember that learning technology is in a good position: enterprise giants and many social media platforms use activity streams and a format similar to the Actor, Verb, Object structure of the Experience API (xAPI) for evaluation. This means that gathering and using activity stream or interaction data in Learning & Development (L&D) so that we can evaluate, react to, and improve training is not a new or futuristic concept. We have wanted to do this since the beginning of digital learning. Google, Facebook, Amazon, retailers, credit card companies, and more have been utilizing activity stream data for some time, and we can stand on the shoulders of these giants. This big data technology is part of everyday processes. Those who came before us have made this relatively easy, and we even have a specification (xAPI/LRS) that was designed for learning activity streams.
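To make the Actor, Verb, Object structure concrete, here is a minimal xAPI statement sketched as a Python dictionary. The learner and course IRI are hypothetical placeholders; the verb IRI is one of the ADL-published xAPI verbs.

```python
import json

# A minimal Experience API (xAPI) statement: every statement is an
# Actor-Verb-Object triple, the same shape used by social activity streams.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",  # hypothetical learner
        "name": "Pat Learner",
        "objectType": "Agent",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",  # ADL-published verb
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/safety-101",  # hypothetical activity
        "definition": {"name": {"en-US": "Safety 101"}},
        "objectType": "Activity",
    },
}

print(json.dumps(statement, indent=2))
```

Read aloud, this statement says "Pat Learner completed Safety 101" — the same sentence shape an activity stream on a social platform would record.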
In part 1 of “Better Inform Your Training Evaluation” we discussed the value of gathering data to better inform Levels 1 and 2 of the Kirkpatrick model. We determined that once you can capture the data to better inform Levels 1 and 2, you “kick open the door” to better informing all of the levels. The majority of Talent Development does not currently evaluate training beyond tracking completions, but we can now use data-driven training evaluation. Remember, these posts are not strictly about applying data to Kirkpatrick: any method you use to evaluate your training will be improved with contextual learning activity streams. This is how you gain real insights into your learners. Better informing your evaluation starts with better learning data.
Level 3 of Kirkpatrick's model is “Behavior,” and we can define this stage as beginning to connect training to performance. Behavior can be measured in a variety of ways, and the training that was supposed to affect the behavior can be measured as well. Measurements for behavior can include follow-up surveys, subjective feedback, observation, action plan monitoring, and systems that collect performance data. All of these methods can, and should, be done digitally, and they could be reporting to an LRS. One good reason to get all the data into an LRS is to limit data transformation and complexity if you want to automate or react to the data streams in real time. These measurement tactics are already in use, and they could be automated in a way that leads to instant adaptation. If you refer to the Kirkpatrick usage data from ATD's research study “Evaluating Learning,” it is Levels 3-5 that are not being used as much as practitioners would like. What if we could get performance data into the same data format as the contextual learning interaction data? Specifically, if we have an LRS database of learning activity data from each piece of learning content, mapped to the behavior or skill it is trying to improve, and we have the performance data, then we limit data transformation, help future-proof the ecosystem, and gain a holistic view to map training to performance goals.
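As a sketch of what “performance data in the same format” could look like, the statement below records a hypothetical on-the-job observation. The observation and skill IRIs and the score are illustrative; “scored” is one of the ADL-published xAPI verbs, and `result` and `context` are the standard xAPI properties for carrying a measurement and tying it back to a skill.

```python
# Hypothetical performance observation expressed as an xAPI statement, so
# Level 3 behavior data lands in the same LRS format as the learning data.
performance_statement = {
    "actor": {"mbox": "mailto:learner@example.com", "objectType": "Agent"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/scored",  # ADL-published verb
        "display": {"en-US": "scored"},
    },
    "object": {
        "id": "http://example.com/observations/forklift-inspection",  # hypothetical
        "objectType": "Activity",
    },
    # "result" carries the measurement; xAPI scaled scores run from -1 to 1.
    "result": {"score": {"scaled": 0.92}},
    # "context" ties the observation back to the skill the training targets,
    # which is what lets you correlate training activity with performance.
    "context": {
        "contextActivities": {
            "category": [{"id": "http://example.com/skills/forklift-safety"}]
        }
    },
}
```

Because this lives in the same LRS as the learning activity statements, correlating the two is a query rather than an export-and-transform project.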
The general consensus is that business effectiveness skyrockets when Levels 4 and 5 can be utilized, yet these two levels are the LEAST used in learning and development today. How do we get “there” from “here”? Big data concepts can be used to tell an up-to-the-minute data story about learning and performance insights, and an LRS is a big data concept. How did your training impact the business? The chart below is from ATD's research study “Evaluating Learning.” Not surprisingly, the perceived organizational value rises with each level. What the study showed is that the learning professionals who were polled want to get more out of the higher levels of evaluation but are unable to.
Source: ATD; Evaluating Learning: Getting to Measurements That Matter, 2016
Within the Riptide Learning Division, we follow four simple steps when approaching LRS work and implementing our Riptide Elements learning products with our customers.
Riptide, as a software company, has spent over 20 years adapting to and working with each “technology wave.” We have a company-wide philosophy for dealing with the complexity of enterprise ecosystems: Crawl, Walk, Run. You will see us propose to work this way in nearly every proposal (RFP), regardless of size, because it is how we succeed. This is how we approach our analysis, planning, and execution. What we mean by Crawl, Walk, Run is this: you do not have to implement the entire solution right away. You can get there incrementally, lay the groundwork, and work toward the final goal, and you will make it there. Please refer to the graphic below.
- Design, Enable, and Apply Data Reporting - Enable contextual learning and performance activity reporting. This is where you use instructional design principles to identify the learning objectives, learning goals, measurements, and data “profiles” in context. If you are using distance learning courseware or authoring tools that produce HTML, this is easy. We share in open source the type of default activity reporting you can add to any HTML training, and it will not disrupt the way the content currently reports to your LMS. We have easily gotten gaming technology and proprietary training system data to report to an LRS. If you are using Instructor-Led Training (ILT) or a blended curriculum, you want to find ways to incorporate digital technology into the process so you can capture activity data. Can you track attendance with mobile check-in? Can you deliver the assessments and surveys via HTML? Subjective analysis of performance via text input, video, and audio can be stored in an LRS with context and meaning. If digital data is available from performance systems, you can easily enable them to report to the LRS if desired.
- Collect and Store Data - An LRS database collects the learning activity stream and performance data. It is really important that you use an xAPI-conformant LRS if you want to continue to gain the benefits of the technology and remain interoperable. This LRS endpoint can receive and share data with training and performance systems, business intelligence systems and analysts, and other LRSs in the enterprise. Notice the double arrows in the above diagram showing data flowing in and out of systems. This method allows you to evaluate all of the data in aggregate and puts you on the road to having historical training and performance information.
- Report, Distribute, and Evaluate Data - Now that you have the data, you can create meaningful reports and dashboards, other systems can subscribe to activity streams, and more. This allows you to better inform multiple audiences: the learner, the educator, managers, researchers, etc. The ability to provide actionable data views or visualizations to multiple audiences within your organization is essential to telling the story and “connecting the dots.” Riptide Elements Storepoints LRS includes the ability to directly query the database, build visualizations, construct dashboards, and distribute those dashboards as live data feeds or static reports delivered to any audience that is important to your organization. In business training we define “Results” as “Business Impact,” while the educator sees learned “competencies” as “Results”; there are many such parallels between education and training. This is where you answer how your training is impacting the overall business goals, or how you can verify that a person who earned a Biology degree is a biologist. Examples of successful evaluation of Business Impact might be improved inventory control, quantified cost savings, increased sales performance, reduced accidents, or more efficient production attributable to your training. If you can make performance data and training evaluation data available, you can better inform the evaluation insights that are valuable to your organization or discipline.
- Adapt to Data - The insights gained in Step 3 come from learning content that maps to the behavior or skill it is trying to affect, and from correlating the learning activity data with the performance data. Manually adapting to the data can improve your process greatly; a first step might be to simply refine the assessments, or change a scenario that is not effectively teaching a concept. In the last article I showed a diagram that validated the effectiveness of an assessment question pool. Based on the Step 3 report, I can see anomalies in the assessments and clear indications of where change, or adapting to the data, is needed. Step 4 is where you can really get into big data adaptive learning concepts. It is possible to create rules, take action, automate, and adapt to patterns in data streams. The arrows in the above diagram that point in both directions indicate that the LRS and other systems may inform each other based on rules and patterns in activity.
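The reporting enabled in Step 1 ultimately boils down to an HTTP POST of a statement to the LRS's statements endpoint. Below is a sketch in Python; the endpoint and credentials are hypothetical, the `X-Experience-API-Version` header is required by the xAPI specification on every request, and most LRSs accept HTTP Basic auth. The network call itself is left commented out so the sketch runs without a live LRS.

```python
import base64
import json
import urllib.request

LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"  # hypothetical LRS
USERNAME, PASSWORD = "client_key", "client_secret"        # hypothetical credentials

# A quiz interaction from a piece of HTML training (hypothetical IRIs).
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "objectType": "Agent"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/answered",
             "display": {"en-US": "answered"}},
    "object": {"id": "http://example.com/quiz/q1", "objectType": "Activity"},
}

auth = base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()
request = urllib.request.Request(
    LRS_ENDPOINT,
    data=json.dumps(statement).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Basic {auth}",
        # Required by the xAPI specification on every request.
        "X-Experience-API-Version": "1.0.3",
    },
    method="POST",
)
# urllib.request.urlopen(request) would send it; omitted here so the
# sketch runs without a live LRS.
```

In real HTML courseware this same POST is typically made from JavaScript, but the request shape is identical.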
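Once statements are in a conformant LRS (Step 2), other systems can pull them back out through the same Statements API using standard filters. The filter parameter names below (`verb`, `activity`, `since`, `limit`) come from the xAPI specification; the endpoint and IRIs are hypothetical.

```python
from urllib.parse import urlencode

LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"  # hypothetical LRS

# Standard xAPI Statements API filters: "give me up to 50 'completed'
# statements for this activity since the start of 2017."
params = {
    "verb": "http://adlnet.gov/expapi/verbs/completed",
    "activity": "http://example.com/courses/safety-101",  # hypothetical
    "since": "2017-01-01T00:00:00Z",
    "limit": 50,
}
query_url = f"{LRS_ENDPOINT}?{urlencode(params)}"
print(query_url)
```

It is this shared, specified query surface that lets business intelligence tools, other LRSs, and reporting systems interoperate without custom data transformation.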
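As a toy illustration of the reporting in Step 3, this sketch aggregates a handful of hypothetical statements (trimmed to the two fields the report needs) into a per-activity pass rate, the kind of figure a dashboard tile would show.

```python
from collections import defaultdict

# Hypothetical statements pulled from the LRS, reduced to verb and activity.
statements = [
    {"verb": "http://adlnet.gov/expapi/verbs/passed",
     "object": "http://example.com/courses/safety-101"},
    {"verb": "http://adlnet.gov/expapi/verbs/failed",
     "object": "http://example.com/courses/safety-101"},
    {"verb": "http://adlnet.gov/expapi/verbs/passed",
     "object": "http://example.com/courses/safety-101"},
]

def pass_rates(statements):
    """Per-activity pass rate computed from passed/failed verbs."""
    totals = defaultdict(lambda: {"passed": 0, "attempts": 0})
    for s in statements:
        outcome = s["verb"].rsplit("/", 1)[-1]
        if outcome in ("passed", "failed"):
            totals[s["object"]]["attempts"] += 1
            if outcome == "passed":
                totals[s["object"]]["passed"] += 1
    return {act: t["passed"] / t["attempts"] for act, t in totals.items()}

print(pass_rates(statements))  # two of three attempts passed
```

A real report would slice the same aggregation by cohort, date range, or question pool, but the principle is the same: the raw activity stream is the single source each view is computed from.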
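And as a minimal stand-in for the Step 4 rules/automation layer, the sketch below watches a statement stream for repeated failures on the same activity and emits a remediation action. The verb IRIs are the ADL passed/failed verbs; the threshold and the remediation payload are hypothetical.

```python
def adapt(stream, fail_threshold=2):
    """Emit a remediation action after repeated failures on one activity."""
    fail_counts = {}
    actions = []
    for stmt in stream:
        key = (stmt["actor"], stmt["object"])
        if stmt["verb"].endswith("/failed"):
            fail_counts[key] = fail_counts.get(key, 0) + 1
            if fail_counts[key] == fail_threshold:
                actions.append({
                    "assign": "remedial-module",  # hypothetical action payload
                    "to": stmt["actor"],
                    "for": stmt["object"],
                })
        elif stmt["verb"].endswith("/passed"):
            fail_counts.pop(key, None)  # a pass resets the failure count

    return actions

# Hypothetical stream: the same learner fails the same quiz twice.
stream = [
    {"actor": "mailto:learner@example.com",
     "verb": "http://adlnet.gov/expapi/verbs/failed",
     "object": "http://example.com/quiz/q1"},
    {"actor": "mailto:learner@example.com",
     "verb": "http://adlnet.gov/expapi/verbs/failed",
     "object": "http://example.com/quiz/q1"},
]
print(adapt(stream))
```

In production this rule would run against the live stream and its action would itself be reported back to the LRS, which is what the double-headed arrows in the diagram represent.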
Remember that each of the four steps can be iterative, and you can Crawl, Walk, Run within each. Best practice dictates that you do evaluation, analysis, and planning, and we recommend that you shoot for the moon in terms of where you would like to end up. It is also good to understand that you may encounter unknowns. We have learned to welcome unknowns as part of the nature of working with technology and data; many of the learning insights we will gain are currently unknown.