Reflections: HPT Learning Goals

As this semester comes to a close, I would like to reflect on how my client work and project team experience related to the Instructional Design learning goals set forth for the course. There are many core tenets of the OPWL Instructional Design process, and as practitioners it is important that we follow them in order to provide the most value to our client, their organizational goals, and the community at large.

Conduct the HPT process in a way that is systematic

A systematic approach refers to constructing and following a step-by-step plan, similar to climbing a ladder one step at a time or following a road map, until reaching the desired performance level.
— Chyung (2005)
A systematic approach to instructional design means that the conditions, performances, and criteria in the critical job tasks that the ID team identified in the task analysis remain consistent throughout the subsequent design phase deliverables…
— ID Course Handbook (2021)

Without a systematic approach, practitioners risk missing vital analysis that informs the final deliverables or skipping steps that ensure the findings of the analysis are properly applied. The Learning and Performance Support (LeaPS) model lays out clear steps that ensure no information or relevant analysis is missed. My team referenced this model throughout the semester, and at times even revisited and revised our previous work, in order to maintain a systematic approach.

Conduct the HPT process in a way that is systemic

A systemic approach refers to considering all the necessary components that are mutually influential on one another by identifying often complex linkages among them.
— Chyung (2005)

Again, we consulted the LeaPS model to ensure our project team was considering all aspects of the organization, the learners, and the community that could influence our process or end result.

Conduct the HPT process in a way that is consistent with established professional ethics

Following established professional ethics protects the practitioners, the learners, the organization, and the community. At every stage in the project, it is vital that practitioners conduct themselves in a manner consistent with ethical standards and extend those same standards to the project work. The International Society for Performance Improvement (ISPI) outlines six principles that guide behavior in our profession.

Conduct the HPT process in a way that is consistent with established professional standards

Similar to its professional ethics, ISPI outlines ten performance standards for practitioners to follow. Following such standards ensures that our profession as a whole offers value and is highly regarded by clients and learners. We honor ourselves, our peers, and our profession when we uphold these standards.

Align performance improvement solutions with strategic organizational goals

As HPT practitioners, we are adept at offering value to our clients while also keeping the community at large in focus. Both of these aspects should be in alignment in order to follow ethical standards and to increase value to organizations. As part of the upfront analysis, practitioners should understand how and why any intervention directly supports organizational goals. This is part of the systemic process that ensures interventions are appropriate and valuable across the organization.

Make recommendations that are designed to produce valued results

Practitioners are adept at understanding the ask and the organization thoroughly enough to make tailored suggestions that produce desired results. From the beginning, we should focus on the end-user and analyze their needs, environment, and characteristics. From there, we can utilize our systematic processes and models to arrive at an intervention that produces highly valued results backed by evidence-based practices and proven solutions.

Collaborate effectively with others, in person and virtually

In a new virtual world, connecting with peers and clients is more difficult. Our team and client were spread across three time zones, so from the beginning we emphasized establishing consistent and effective communication. Because of the challenges of this format, it was important that we use multiple forms of communication and keep our client's preferred channel in mind. This demanded that we be flexible, available, and considerate. A practitioner should, at all times, accommodate the client's preferences to ease their workload.

Communicate effectively in written, verbal, and visual forms

Similar to the previous goal, practitioners should be able to communicate effectively regardless of the format. This comes from practice and from following the successful examples of others. I talk a little about my practices for communicating with clients here, and I carry those practices into communicating with my team as well. I focus on concise but thorough communication and try to be as available and accommodating as possible.

Use evidence-based practices

Savvy IDs can defend the decisions they made in creating, implementing, and maintaining training.
— ID Course Handbook (2021)

The value in HPT comes from practitioners' ability to translate evidence-based practices into results for clients. We increase our own credibility and that of our field when we can communicate and show evidence supporting our decisions and recommendations. This is also why staying up to date on new findings and innovations is so important for practitioners, and building a community of collaborative individuals helps support the field.

Instructional Designers as Business Partners

Before I entered the field of instructional design, I had never heard of it. I have a theory that unless the instructional design is bad (or nonexistent), it goes unnoticed. Everyone can attest to a training they've sat through that was A. boring, B. ineffective, and/or C. a waste of resources. Luckily, trained instructional designers can offer an objective perspective when addressing performance opportunities. We have the expertise to follow a systematic and systemic model that allows us to identify and analyze opportunities, and to design, develop, implement, and evaluate interventions.

I love the field of instructional design because it gives structure and guidance to a subtle and widely-applicable specialization. We’ve been trained to study the learner, anticipate their needs, and spark their motivations. We can triangulate data to identify organizational performance gaps, suggest innovative and effective interventions, and deliver evidence-based results to the client.

Given that we can offer such unique perspectives and are trained to deliver cost-effective, evidence-based, learner-focused results, you should look into contacting an ID to be part of your team!

eLearning Object Review: My Original eLearning Object

Over the course of the semester, we have been designing and developing our own original eLearning objects. The objects are scenario-based and allow learners to work through multiple branches based on their decisions to arrive at different outcomes. I decided to create a scenario in which an employee is asked to use behavioral interviewing techniques to interview a candidate and assess their eligibility for a position. Scenario-based eLearning was the perfect opportunity to simulate what an interview could look like while preserving a safe, risk-free environment in which to learn from your mistakes.

There are eight learning domains within workplace training: interpersonal skills; compliance; diagnosis and repair; research, analysis, and rationale; tradeoffs; operational decisions and actions; design; and team coordination (Clark, 2013, p. 23). This object falls within the learning domains of

  • Interpersonal skills: the learner needs to communicate effectively with the candidate to achieve the desired outcome

  • Research, Analysis, Rationale: the learner is required to make a recommendation about the candidate’s eligibility based on the information learned during the interview

  • Team coordination: the learner must communicate effectively with the interviewing team to achieve the desired outcome

Introducing: Wait for the Great!

Screenshot of the first decision in Wait for the Great! eLearning

We were asked to develop a high-level design document and a storyboard that describes at least a few of the paths and outcomes that a learner could encounter.

Click here to preview the design document.

Screenshot of high-level design document.

Click here to preview the storyboard.

Screenshot of the storyboard document.

Here, I expand on three evidence-based principles and describe how they are applied in this object:

  • Workplace trigger event: A critical component of scenario-based eLearning is setting up a trigger event. This is the event that the learner will encounter in the workplace that demands the application of previously learned knowledge. Not only does it create a need for the learner, but it also sets a realistic stage and gives them the context they need to address the demand. In this object, the learner is asked to be the interviewer for a candidate and to use behavioral interviewing to make a recommendation about the candidate. Within the object, the learner is provided resources, such as a guide to behavioral interviewing and the candidate's resume, to help build out the context of the situation and give them everything they need to prepare.

  • Closed response options: At each decision point, the learner is able to select one question to ask the candidate. While this does not emulate how a real-life conversation would work, it does emulate how an interviewer would prepare questions beforehand. Closed response options also limit the object's complexity and therefore manage cognitive load. And while the multiple-choice options don't always reflect the breadth of the conversation that could take place during an interview, they provide a good demonstration of a basic conversation within a defined environment.

  • Intrinsic and instructional feedback: Learners receive immediate intrinsic feedback (changes in the candidate's physical reaction) as they progress, along with both immediate and delayed instructional feedback: after each decision and at the conclusion of the scenario. In a real-life conversation, a learner would respond to their counterpart's physical as well as verbal communication, so intrinsic feedback was an important feature to build into this object; the learner can gauge their progress in part by the candidate's physical response. The instructional feedback, delivered as the learner moves through the object and again at the end, corrects missteps and provides scaffolding where needed. Clark (2013) notes that "incorrect responses, if not immediately corrected, can embed the wrong knowledge and skills in memory" (p. 104). The prior experience of the learners varies, so providing more specific feedback when the learner selects the "OK" or "Bad" options compensates for a lack of experience or prior knowledge. (A rough sketch of how one of these decision points might be structured appears after this list.)
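To make the structure behind these principles a bit more concrete, below is a minimal sketch of how one branching decision point, with closed response options and both intrinsic and instructional feedback, could be modeled. The names (DecisionPoint, ResponseOption) and the sample dialogue are my own illustration and do not reflect the actual structure of my authoring tool or storyboard.

```python
# Illustrative sketch only: a hypothetical data model for one branching
# decision point in a scenario-based eLearning object.
from dataclasses import dataclass, field


@dataclass
class ResponseOption:
    """One closed response the learner can choose at a decision point."""
    text: str                    # the question the interviewer asks
    quality: str                 # "Good", "OK", or "Bad"
    intrinsic_feedback: str      # change in the candidate's reaction
    instructional_feedback: str  # coaching shown after the choice
    next_screen: str             # id of the screen this choice branches to


@dataclass
class DecisionPoint:
    """A dialogue/decision screen and its closed response options."""
    screen_id: str
    candidate_dialogue: str
    options: list[ResponseOption] = field(default_factory=list)


# Example decision point: the learner picks a prepared behavioral question.
opening = DecisionPoint(
    screen_id="q1",
    candidate_dialogue="Thanks for meeting with me today.",
    options=[
        ResponseOption(
            text="Tell me about a time you resolved a conflict on your team.",
            quality="Good",
            intrinsic_feedback="The candidate relaxes and gives a detailed answer.",
            instructional_feedback="Behavioral questions ask for specific past examples.",
            next_screen="q2_good",
        ),
        ResponseOption(
            text="Do you consider yourself a team player?",
            quality="Bad",
            intrinsic_feedback="The candidate gives a short yes/no answer.",
            instructional_feedback="Closed questions rarely surface past behavior; "
                                   "ask for a concrete example instead.",
            next_screen="q2_retry",
        ),
    ],
)

if __name__ == "__main__":
    # Print each branch so the flow is easy to trace while storyboarding.
    for option in opening.options:
        print(f"[{option.quality}] {option.text} -> {option.next_screen}")
```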

As much as I tried to stay true to Tim Slade’s 5-step Storyboarding Process, I ended up adapting the process based on my previous experience and my strengths:

  • My Ah-Ha! moment: Unsurprisingly, my favorite part of planning the eLearning was working with the flowchart. Because this is a conversation, it can go in many directions and I wanted it to be as organic as possible. Working with LucidChart allowed me to be very flexible with my planning of the learner paths. Doing most of my planning within the flowchart allowed me to keep the project within scope (not include too many tangents) and re-use screens and responses easily.

  • My Oh No! moment: I decided on this topic because it was one that I had previously built training on. And having sat through multiple behavioral interviews, I felt comfortable crafting a realistic conversation. This ended up being much more difficult than I anticipated. I had a stock of questions in mind but had trouble coming up with realistic candidate responses. I also narrowed down the conversation a bit to keep the project in scope. In reality, there are many directions the conversation could go in based on candidate responses that include more nuances than a 40-screen eLearning object could capture.

  • Areas for improvement: The goal of this scenario-based eLearning object is to allow learners to practice behavioral interviewing in a low-risk setting. If I were to redesign this object, I would want to make it more similar to the realistic setting in which learners would apply this knowledge. That would involve using audio for the candidate responses (while allowing the learner to replay and/or review a transcript) and video to capture more realistic responses and non-verbal communication.

I really enjoyed creating this original eLearning object and having the chance to share it. Feel free to leave comments or suggestions!

Performance-based Training Design and Development

Performance requirements describe the intended outcome of instruction, i.e., what learners should be able to do upon completing the instruction (OPWL, 2019, p. 175). By specifying performance requirements, IDs practice designing with the end in mind. Performance requirements are used to create performance assessment instruments and instructor guides during the design and development phase of the LeaPS Model (p. 173). While crafting them, IDs pull from the task analysis and from the SME-provided information associated with each critical task. This knowledge gathered during the analysis phase determines the conditions, performances, and criteria of the job task, which should stay consistent throughout all deliverables. Performance requirements also help determine what should be included in or excluded from training and should be referred to during the creation of future deliverables. For each critical task, a performance requirement uses a three-part format that describes: (1) the learner's on-the-job performance; (2) the conditions under which individuals will be expected to perform; and (3) the criteria that define acceptable performance.

Example:
“Given a list of competitors, the sales representative will be able to identify the names of the four major competitors with 100% accuracy.”
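To make the three parts easy to see at a glance, here is a minimal sketch that assembles a requirement statement from its condition, performance, and criterion. The class and field names are my own illustration, not something prescribed by the handbook or the LeaPS Model.

```python
# Illustrative sketch only: the 3-part performance requirement format
# expressed as a small, hypothetical data structure.
from dataclasses import dataclass


@dataclass
class PerformanceRequirement:
    condition: str    # the circumstances under which the task is performed
    performance: str  # the observable on-the-job behavior
    criterion: str    # the standard that defines acceptable performance

    def statement(self) -> str:
        """Combine the three parts into a single requirement statement."""
        return f"{self.condition}, {self.performance} {self.criterion}."


# Rebuilding the handbook-style example above from its three parts.
example = PerformanceRequirement(
    condition="Given a list of competitors",
    performance="the sales representative will be able to identify "
                "the names of the four major competitors",
    criterion="with 100% accuracy",
)
print(example.statement())
```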

Job aids can be used to eliminate or reduce training time or to provide post-training performance support in the workplace (p. 152). They can serve as standalone performance interventions or accompany training. IDs can use job aids to serve the client and provide the most value in a cost-effective way. During the analysis phase, we have to ask questions to determine whether a job aid is feasible. These questions consider the learner, the environment in which the job aid would be used, and the modality; a small sketch of how this checklist could be tracked follows the list below.

Work situations to consider when deciding whether a job aid could be appropriate (each answered Yes, No, or Need more information):

Situations where a job aid could be appropriate:

  • Is sequence critical for task success?
  • Could a job aid enhance performer confidence?
  • Are the consequences of workplace error high?
  • Is the task performed infrequently?
  • Is the task easy to get wrong?
  • Does the task performance depend on frequently changing information?
  • Can complex task performance be described in detail?
  • Does task performance require the use of a large body of information?

Situations where a job aid could be inappropriate:

  • Could use of a job aid damage credibility or customer confidence?
  • Would use of a job aid slow or degrade performance?
  • Does the workplace environment lend itself to a job aid?
  • Is performer memory a better option?
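Since these questions tend to come up on every project, here is a rough sketch of how the checklist above could be kept as a small, reusable helper that tracks which questions still need investigation during analysis. The list structure, function name, and sample answers are my own illustration and are not part of the LeaPS Model or the course handbook.

```python
# Illustrative sketch only: the job aid feasibility questions kept as a
# simple checklist, with a helper that reports what still needs an answer.

JOB_AID_QUESTIONS = [
    # Situations where a job aid could be appropriate
    "Is sequence critical for task success?",
    "Could a job aid enhance performer confidence?",
    "Are the consequences of workplace error high?",
    "Is the task performed infrequently?",
    "Is the task easy to get wrong?",
    "Does the task performance depend on frequently changing information?",
    "Can complex task performance be described in detail?",
    "Does task performance require the use of a large body of information?",
    # Situations where a job aid could be inappropriate
    "Could use of a job aid damage credibility or customer confidence?",
    "Would use of a job aid slow or degrade performance?",
    "Does the workplace environment lend itself to a job aid?",
    "Is performer memory a better option?",
]


def open_questions(responses: dict[str, str]) -> list[str]:
    """Return the questions that still need a definite Yes/No answer.

    `responses` maps a question to "Yes", "No", or "Need more information";
    any question left unanswered is treated as needing more information.
    """
    return [
        question
        for question in JOB_AID_QUESTIONS
        if responses.get(question, "Need more information") not in ("Yes", "No")
    ]


if __name__ == "__main__":
    answered_so_far = {
        "Is the task performed infrequently?": "Yes",
        "Would use of a job aid slow or degrade performance?": "No",
    }
    for question in open_questions(answered_so_far):
        print("Still to investigate:", question)
```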

Performance assessment instruments are used to measure mastery of the objectives you have created, and they are built during the design and development phase of the LeaPS Model (p. 192). This is again an example of designing with the end in mind. Putting performance assessment instruments in place early ensures that excessive or unnecessary training is not created. Again, you will use information from the task analysis and the objectives to create an assessment that asks the learner to perform the task under the job conditions and judges their performance against the job standards. It tells learners what they're doing well, where they could improve, and how they can make those improvements (p. 195). Proper learning objectives help IDs create authentic assessment instruments because they can match the performance, the conditions, and the criteria. Note that if you are creating a stand-alone job aid with no training, no performance assessment instrument is needed.

A Chain Linking Job Tasks to Instructional Materials, from ID Course Handbook (2019)

I appreciate so much that the LeaPS Model builds upon solid analysis. The analysis phase is my favorite part of any ID model, as that is where I get to ask the questions, get in touch with the learners, and organize and triangulate data. We spend much of our time on analysis and information gathering so that our trainings are concise and yield the maximum benefit. Unfortunately, because we are a training team, we are constantly asked to build just eLearning modules. We take the time to offer alternatives, such as job aids, that could save time and resources, but clients rarely pivot away from a training module. Because of this, we also have very specific performance assessment instruments that appear as test-out quizzes after the training. This is a massive missed opportunity both for us and for our organization. We maintain our expertise and improve on our practices when we can, but it is a slow struggle.

References

OPWL 537 ID Project Handbook. (2019). Training Requirements Analysis (TRA) Template.

Organizational Performance and Workplace Learning. (2019). Instructional Design Course Handbook (4th edition).

eLearning Object Review #4: Reverse Engineering Storyboard

Last week, I had the opportunity to deep dive into understanding a found eLearning Object, Car Sale Dialogue. This week, I continue that analysis by building out a visual storyboard based on the final content shown in the module. For more information about storyboards, scripts, and templates, check out this previous blog post.

For this storyboard, I updated a template from OPWL 551. I considered making my own since I was familiar with the content of the module, but decided to use this template as it allowed me to fill in more detail and has space for reviewer notes. Building out a storyboard for this module was not nearly as intense as deciphering its wireframe and all the potential pathways. This module contains no audio and no multimedia beyond images, so a visual storyboard was easy to reverse engineer.

There are only three main screen types in this module: a dialogue/decision screen, a feedback screen, and an outcome screen. The only graphics that change from screen to screen are the customer's image and her mood meter. Her dialogue always appears in the same spot, as do all the responses the learner can choose from. And while there are no explicit guidance techniques, the module draws attention to clickable objects by coloring them in a contrasting blue and adding hover states.

If I were to improve on this module, I would offer an optional help screen or some forced navigation directions to orient the learner. I am also a huge fan of consistency when it comes to designing interactive objects, so I would have preferred that all the buttons share the same color and hover state throughout. The images below show the buttons found in the module. While both contain the same blue color that signals they are interactive, they don't share a consistent style that would help the learner immediately recognize them as interactive objects.

Overall, I think this module accomplishes what it set out to do. Its strength lies in its detailed and lengthy branches, which give the learner a sense of how an interaction with a customer could go. And while I am a fan of fancy multimedia, I don't think it is necessary here. The subject matter is not emotionally driven, so video and audio aren't missed. The use of simple graphics and a simple interface makes development easier and more cost-effective.

References

Cannon and Harding (2017)

Slade (2016)