Sub-Badge 1: Evaluate instructional and non-instructional interventions
Challenge 1: Implement formative evaluation plans
Criteria for successful completion of this challenge: Evidence of implementing a formative evaluation plan to provide information that can be used to make adjustments and improvements in the design. Evidence must show a formative evaluation plan (expert review, one-to-one evaluation, small group, and field trial). Reflection must address: Which phase(s) of formative evaluation did you conduct? Which data did you collect (e.g., clarity and accuracy of instruction, general attitudes, procedural issues, etc.)? What were the results of formative evaluation and how did it affect your design?
Examples: Evaluation Plan (EDCI 528), Design Documents (EDCI 572), Learning Module (EDCI 575), eLearning Project (EDCI 569), artifacts showing strategies for implementation of an evaluation plan (design, performance, workplace, educational, other).
Reflection
I submit an evaluation plan I created as part of my coursework in EDCI 528, “Human Performance Technology,” during the summer of 2023. This artifact showcases my ability to select appropriate interventions based on a thorough analysis of a performance issue and to develop an evaluation plan built on the principles of the Kirkpatrick four-level model. One of the primary learning goals of this course was to demonstrate the need for ID professionals to develop a deep understanding of human performance issues. For this assignment, I relied on experts to provide the feedback that guided my development of the evaluation plan.
The impetus for my evaluation plan was rooted in a real-world problem I identified at my place of work. For some time, I had been aware that the support and training structure I played a significant part in creating was not leading to consistent, high-quality outcomes. As I investigated the performance issues and identified the likely root causes, I developed a series of interventions to remedy the current system's shortcomings. Once I had established my proposed plan, I conducted two types of formative evaluation to solicit feedback from learners and experts. First, I conducted one-on-one interviews with a sample of the PT instructors that these interventions would affect. I received confirmation during this phase that the problems I had identified were valid. Second, I asked one of my peers to provide feedback on the proposed changes. He confirmed that the existing systems had room for improvement and agreed that my plan was feasible.
I frequently reach out to my peers for feedback on the clarity, accuracy, and general feasibility of the training interventions I develop. I am currently working with a workgroup of subject matter experts (SMEs) on a project significantly outside my expertise, and I rely heavily on them for formative feedback, modifying the design as their input indicates.
Working with SMEs is a skill set I am still developing. I look forward to continuing to refine my ability to solicit meaningful feedback from small group interactions and field tests before deploying new training interventions.
Artifact -- Evaluation Plan
Challenge 2: Implement summative evaluation plans
Criteria for successful completion of this challenge: Evidence of implementing a summative evaluation plan to evaluate the effectiveness of the instruction and decide whether to continue using the instruction. Evidence must show an evaluation plan (e.g., Kirkpatrick’s Four Levels of evaluation). Reflection must address: Whether the implementation of the summative evaluation met your expectations. What were the results of the summative evaluation (did you continue with the program/instruction, did you cancel it, did you modify it)?
Examples: The following assignments are applicable if implemented: Evaluation Plan (EDCI 528), Evaluation Plan (EDCI 577), artifacts showing implementation of an evaluation plan (design, performance, workplace, educational, other).
Reflection
I submit an evaluation plan I created as part of my coursework in EDCI 528, “Human Performance Technology,” during the summer of 2023. This artifact includes both formative and summative evaluation items that provide the feedback needed to judge the program's effectiveness. I followed the Kirkpatrick four-level model as a framework to identify evaluation methods for each level.
The primary concern in developing this evaluation plan was to ensure congruence between the training interventions and the purpose of the training program. The root cause of the performance problem this training seeks to solve is that PT instructors do not receive sufficient support or mentoring to perform optimally. I followed the Dick and Carey method of designing and conducting a summative evaluation to maintain that congruence while evaluating the training. I considered the organization's needs, the availability of resources, and the existing training materials, and I discussed organizational needs with an expert, the training director. After the changes outlined in my training plan were implemented, I reached out to the learners to assess how they perceived the relevance and transferability of the training content. This feedback formed a closed loop so that I could make changes where indicated. The final step, which is currently ongoing, is to measure the outcome data of the training intervention; I plan to collect this data through observations and performance ratings. Once this last piece of data is assembled, I will reevaluate the training program's current state and make any necessary changes.
My experience conducting summative evaluations of training interventions has largely been limited to making “gut” decisions based on my own perception of success. I am beginning to understand the value of collecting and collating data that can be analyzed and formulated into an action plan. This is especially important when communicating to stakeholders how the training went and where we can adjust to optimize future training.
I tend toward a data-driven mindset when making evaluations. Though not every evaluation point can be measured objectively, I am interested in expanding my ability to collect both subjective and objective evaluation data and to synthesize the results so that I can better understand the efficacy of current training interventions with an eye toward future improvements.
Artifact -- Evaluation Plan