Overview
My curriculum vitae (CV) is a comprehensive, detailed description of my professional, scholarly, and academic credentials and achievements. Click the button below to download and view my CV.
Practical Application Essay of IDD&E Knowledge, Skills, and Attitudes
Amber A. Walton
12 June 2019
Scenario
I am employed as a curriculum specialist in the role of Director of the Clinical Performance Center (CPC) at Eleanor Roosevelt College of Nursing (ERCON). As director, I am responsible for administering the CPC; managing a team of 10 full-time CPC employees; coordinating and organizing quality and performance improvement activities for students and staff at ERCON; teaching and coaching performance quality initiatives; and overseeing the Simulation Center, which features mannequin-based simulation, web-based simulation, the Simulated Patient Program, and the Hybrid Simulation Program, which includes augmented- and virtual-reality technologies and partial task trainers.
Much of my work (outside of research) is focused on coordinating with content experts at ERCON to apply an instructional design approach to the clinical performance curricula within the College of Nursing. Analyzing current student performance, designing and developing instruction that may or may not include CPC technologies (depending on the performance gap to be closed), overseeing the implementation of programs, and conducting in-depth evaluation of educational programs are all ways I bring an ID focus to clinical education at this college.
Identifying the Performance Issue
Registered Nurse (RN) students enrolled in NUR 102, “Holistic Health Assessment,” spend 22.5 clinical hours actively engaged in patient care under the supervision of a Clinical Instructor (CI). Patient care activities include conducting a patient health interview and performing a physical examination, among other responsibilities. RN students are required to complete at least 3 patient interviews/examinations per day for each of the 7 days spent in the clinic. CIs observe student-patient interactions and complete an assessment checklist that includes written feedback. At the end of the week, CIs meet with each student individually to review their assessment checklists and engage in a feedback dialogue. Students also meet with their academic instructor(s) to brief and debrief each week's clinical duty hours.
This semester, the CIs' assessment checklists indicated very low levels of competent performance in one area: entering patient data into the computerized Electronic Medical Records (EMR) system while simultaneously maintaining eye contact and building rapport with the patient. The course director for NUR 102 has approached me to determine what intervention would best close this gap in knowledge, skills, and attitudes. The Dean for Curriculum at ERCON has requested that I personally work with this course director, rather than engaging one of the other educational specialists at the CPC, because the course director is rather new to the college and does not feel confident in her course design/redesign skills.
Drawing upon my training as an instructional designer, I formulate a process to understand, correct, and evaluate this performance problem following the ADDIE instructional design model: Analyze, Design, Develop, Implement, and Evaluate. To help the course director gain confidence in her ID skills, I make every step of the process transparent and schedule semiweekly meetings with her.
Analyze
Using an updated version of Gilbert's Behavior Engineering Model (Chevalier, 2003), we attempt to understand the performance problem and to determine whether it can be solved via instruction; in other words, whether the problem is due to a gap in knowledge, skills, or attitudes, since only such gaps can be closed through instruction.
Gilbert's classic model separates performance issues into external (Environmental Supports) and internal (Person's Repertory of Behavior) factors, and further organizes each into Information, Instrumentation, and Motivation factors. The updated model renames the resulting six cells as 1. Information, 2. Resources, 3. Incentives, 4. Knowledge/Skills, 5. Capacity, and 6. Motives. Using this chart, we can organize the front-end analysis questions that we should begin to ask.
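Laid out as a chart, the updated model has two rows of three cells each:

Environmental Supports: 1. Information | 2. Resources | 3. Incentives
Person's Repertory of Behavior: 4. Knowledge/Skills | 5. Capacity | 6. Motives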
Several sources may be used to collect the data necessary to answer these questions (a brief coverage sketch follows the list):
1) An extant data review can answer questions in the 1. Information, 2. Resources, 4. Knowledge/Skills, and 5. Capacity domains.
2) Surveys may be used to assess student and CI information in all 6 domains.
3) Focus groups may be used to assess students’ 3. Incentives, 4. Knowledge/Skills, and 6. Motives.
4) Targeted interviews may be conducted to assess CIs' responses to questions in all domains. One-on-one interviews may be used to follow up with students after the focus groups.
5) Observations of actual student-patient interactions (and student-CI interactions) would be very valuable, but most likely restricted due to patient confidentiality rights.
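To keep track of which of the six BEM cells each source is expected to inform, a minimal sketch along the following lines could be used. It is written in Python; the source names and cell numbers come from the list above, while the cells assigned to the observations are an assumption, since the list leaves them unspecified.

```python
# Minimal sketch: map each planned data source to the updated BEM cells (1-6)
# it is expected to inform, then check that all six cells are covered.

BEM_CELLS = {
    1: "Information", 2: "Resources", 3: "Incentives",
    4: "Knowledge/Skills", 5: "Capacity", 6: "Motives",
}

sources = {
    "extant data review":     {1, 2, 4, 5},
    "student/CI surveys":     {1, 2, 3, 4, 5, 6},
    "student focus groups":   {3, 4, 6},
    "targeted CI interviews": {1, 2, 3, 4, 5, 6},
    "clinic observations":    {4, 6},  # assumption: limited, behavior-focused
}

covered = set().union(*sources.values())
missing = sorted(set(BEM_CELLS) - covered)
print("Cells covered:", [BEM_CELLS[c] for c in sorted(covered)])
print("Cells missing:", [BEM_CELLS[c] for c in missing] or "none")
```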
For the purpose of this scenario, let us assume that the following results were identified during front-end analysis:
1) Students, CIs, and the course director all agreed that the instruction about the EMR system, building patient rapport, and maintaining appropriate eye contact with the patient (6 hours of instructional time) was adequate.
2) Students strongly felt that inadequate time was allotted to practice simultaneous EMR data entry and patient communication, including rapport building and eye contact, which led to their lack of confidence. Student-to-student practice was helpful but unrealistic.
Thus we may state that the performance problem concerns skills and attitudes; an instructional intervention is an appropriate way to close this gap. The course director and I may now proceed to design that intervention.
Design
The first step in the design process is to identify learning outcomes for this revised training. The course syllabus already states course learning goals and unit learning objectives, so any newly created learning objectives will need to align with those existing goals and objectives. I encourage the course director to draft the new learning objectives using the ABCD method of writing learning objectives (Ferguson, 1998): A stands for Audience, B stands for Behavior, C stands for Condition, and D stands for Degree. With some support, she creates the following learning objectives (the first is parsed into its ABCD components just after the list):
1) NUR 102 students will maintain eye contact with the patient at least 50% of the time while entering the patient’s medical interview data into the EMR system.
2) NUR 102 students will engage in rapport building dialogue with the patient at least 50% of the time while entering the patient’s medical interview data into the EMR system.
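Parsed against the ABCD components, the first objective reads as follows: Audience = NUR 102 students; Behavior = maintain eye contact with the patient; Condition = while entering the patient's medical interview data into the EMR system; Degree = at least 50% of the time.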
The next step in the process is to align learning assessments with learning objectives. The course director and I agree that formative and summative assessments will be most appropriate to measure student performance (change in skills and attitudes), since students identified that inadequate time for practice decreased their confidence in performing these skills in the clinic. The same assessment form will be used for both the formative (practice) sessions and the summative (real patient encounter) sessions.
Design of the targeted educational intervention includes:
1) A 1-hour workshop for students on specific tips and tricks for entering EMR data while simultaneously building rapport and maintaining eye contact.
2) A 2-case Simulated Patient (SP) activity for formative assessment. Each student will conduct a medical interview with an actor portraying a simple illness (allergies in one case, acid reflux in the other) while entering patient data into the EMR system for each of the two SPs. SPs will use a silent clicker device in their right hand to track each time the student makes eye contact, and a similar clicker in their left hand to track each time the student uses rapport-building dialogue, while the student enters data into the EMR (one possible convention for scoring the clicker data against the 50% criteria is sketched after this list). The SP sessions will be video recorded, and students will watch their videos, tracking eye contact and rapport dialogue in their recorded performance. SPs will complete formative assessment checklists and will provide written feedback from the patient's perspective. Students will self-reflect after each encounter and after the video review. A group debrief will be conducted by the course director and will include all students.
3) As a summative assessment, CIs will be trained on the clicker system and will record student behaviors during real patient interviews in the clinic. CIs will complete an assessment checklist and will provide verbal and written feedback to the student during the end-of-week CI meeting.
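Because the learning objectives are written as a percentage of time while the clickers record discrete events, some scoring convention will be needed when the clicker logs are analyzed. The sketch below is purely illustrative Python: the 30-second interval, the function and field names, and the mapping from clicks to "percent of time" are assumptions made for the sketch, not the CPC's actual scoring rules.

```python
import math
from dataclasses import dataclass

@dataclass
class EncounterLog:
    """Clicker log for one student-patient encounter (illustrative only)."""
    entry_seconds: float             # time the student spent entering EMR data
    eye_contact_clicks: list[float]  # timestamps (s) of right-hand clicks
    rapport_clicks: list[float]      # timestamps (s) of left-hand clicks

def percent_of_intervals(clicks: list[float], total_seconds: float,
                         interval: float = 30.0) -> float:
    """Assumed convention: divide the data-entry period into fixed intervals
    and report the percentage of intervals containing at least one click."""
    n_intervals = max(1, math.ceil(total_seconds / interval))
    hit = {int(t // interval) for t in clicks if t < total_seconds}
    return 100.0 * len(hit) / n_intervals

def meets_objectives(log: EncounterLog, threshold: float = 50.0) -> dict[str, bool]:
    """Check both learning objectives against the 50%-of-the-time criterion."""
    eye = percent_of_intervals(log.eye_contact_clicks, log.entry_seconds)
    rapport = percent_of_intervals(log.rapport_clicks, log.entry_seconds)
    return {"eye_contact": eye >= threshold, "rapport": rapport >= threshold}

# Example: a 10-minute EMR-entry period (values are made up for the sketch)
log = EncounterLog(
    entry_seconds=600.0,
    eye_contact_clicks=[15, 45, 70, 140, 170, 200, 290, 330, 410, 440, 480, 550, 580],
    rapport_clicks=[30, 250, 520],
)
print(meets_objectives(log))  # {'eye_contact': True, 'rapport': False}
```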
Develop
The course director and I will work to develop the instructional materials required for the intervention. These will include:
1) The materials for the 1-hour workshop (PowerPoint slide show, handouts, an introduction to the clickers, and mock-ups of the EMR system for in-class practice)
2) The SP case scenarios, including SP training materials, SP checklists/comment forms, student self-reflection forms, and debriefer outlines
3) Revised summative assessment checklists and feedback forms
4) Student and CI evaluation forms to evaluate the instructional intervention
Other activities undertaken during the development phase include: recruiting, scheduling, training, and briefing the simulated patients; scheduling the Clinical Performance Center; setting up video recording and playback technologies; devising the project management plan; and ensuring all content and activities align with the learning objectives.
Implement
Following the project management plan, implementation should unfold in three phases: the didactic workshop, the SP simulations, and the CI-supervised patient interactions in the clinic. I will serve as the project manager and will share weekly status reports with the course director. Administration of the assessment and evaluation forms will be coordinated through the CPC, since checklist analysis software and staff support are included as client benefits for course directors.
Evaluate
Student performance analysis will be conducted using the completed assessment checklists, feedback forms, debriefing notes, and self-assessment forms. Qualitative and quantitative analysis of both student performance and the instructional program as a whole are standard job responsibilities in my role as director of the CPC. I will complete a comprehensive final report and will share it with the course director first, for client input and revision, after which the report will be submitted to the Dean for Curriculum. Students and CIs will have access to the final report via the Office of Curriculum website (student and CI identifying information will be blinded). Input from stakeholders about the final report will be solicited and collected through the Office of Curriculum. At the end of the semester, a final meeting will be conducted to debrief the project and plan for future curricular revisions.
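As a rough illustration of the quantitative side of this analysis only: the CPC's checklist analysis software and its data format are not described here, so the field names and values below are placeholders rather than real student data. A simple roll-up of formative versus summative eye-contact scores might look like this:

```python
from statistics import mean

# Illustrative records: one row per student, with the eye-contact percentage
# scored during the formative SP session and the summative clinic encounter.
# Field names and values are placeholders for the sketch, not real CPC data.
records = [
    {"student": "S01", "formative_eye_pct": 42.0, "summative_eye_pct": 61.0},
    {"student": "S02", "formative_eye_pct": 55.0, "summative_eye_pct": 58.0},
    {"student": "S03", "formative_eye_pct": 38.0, "summative_eye_pct": 47.0},
]

THRESHOLD = 50.0  # "at least 50% of the time" criterion from the learning objectives

pass_rate_formative = mean(r["formative_eye_pct"] >= THRESHOLD for r in records)
pass_rate_summative = mean(r["summative_eye_pct"] >= THRESHOLD for r in records)
mean_change = mean(r["summative_eye_pct"] - r["formative_eye_pct"] for r in records)

print(f"Formative pass rate: {pass_rate_formative:.0%}")
print(f"Summative pass rate: {pass_rate_summative:.0%}")
print(f"Mean change in eye-contact %: {mean_change:+.1f} points")
```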
Reflection
Of the 22 ibstpi Instructional Designer Competencies (Koszalka et al., 2013), the four discussed below are the most important to this ERCON scenario. A brief rationale is given for each, along with my self-assessment of my current competency level.
Competency #6, needs assessment, is crucial for this project because, though we have CI-completed student performance assessments which clearly indicate the poor performance on eye contact/rapport-building dialogue while entering EMR data, we cannot rely on one measure alone to determine if the problem is, in fact, performance-related and what the exact problem is. Analyzing extant data, building a deep understanding of stakeholder experiences (e.g., student attitudes about this skill), acknowledging environmental barriers and resources – these are some of the important considerations required to truly understand the performance problem. We can only proceed to the Design stage of the process after a deep, accurate understanding of the problem has been achieved.
My skills as a needs assessment analyst are still in development, though I do have experience conducting informal needs assessments from my work as an SP Educator (SPE). IDE 712 provided excellent information and multiple chances to practice developing a front-end analysis plan. Additional real-world experience, such as the ERCON scenario, will increase my confidence with these analysis skills. Right now, I would give myself a B+ in this competency.
Competency #12, design instructional interventions, is also very important in this scenario. We build upon the discoveries from the front-end analysis to pinpoint learning objectives, their associated assessment activities, and a project plan to implement instruction to close the performance gap.
I feel fairly confident in my instructional design skills and would rate myself an A-. This ERCON scenario, in particular, features SP encounters; my 12 years of experience as an SPE make me feel confident about these tasks. It is also necessary in the design phase of the project to collaborate with the content expert to ensure the accuracy of the instruction and assessment. Almost all of my course work in IDD&E has helped me develop greater confidence in this competency.
Competency #17, evaluation, is important in understanding the success of the project and how this may impact future instruction in NUR 102. Program evaluation, in my experience, is often overlooked, minimized, or refused outright; this is a lost opportunity, in my opinion. Many visual models of ADDIE (Taylor, 2013) feature “Evaluation” as the central through line, connecting the A, D, D, and I phases through constant, iterative evaluation. In this scenario, evaluation is used at the end of the Implementation phase to review the project as a whole, include multiple points of view from myriad stakeholders, and set intentions for future course changes based on this project.
IDE 641 included a real-life evaluation project; working in a real context with a real client has helped me feel fairly confident about my evaluation skills. I would still rate myself as a novice evaluator and would score my current competency as a B+.
Finally, Competency #21, managing partnerships, is essential for almost every ID project, including the ERCON scenario. Respecting the Dean's wishes, coordinating and communicating with and coaching the course director throughout the project, building rapport with students and CIs during front-end analysis interviews, and overseeing the staff at the CPC – these relationship-building and relationship-maintaining responsibilities are essential for the role of instructional designer, as well as for the Director of the CPC.
I would assess my competency in partnership management as an A. This was a daily task as an SPE, and I learned and refined my skills on the job. Course work in IDD&E reinforced the importance of this skill, particularly during group projects. Striving to maintain partnerships and collaborative relationships will be ongoing professional development work, even as I feel confident in my skills.
References
Chevalier, R. (2003). Updating the behavior engineering model. Performance Improvement, 42(5), 8-14. doi:10.1002/pfi.4930420504
Ferguson, L. M. (1998). Writing learning objectives. Journal of Nursing Staff Development, 14(2), 87-94.
Koszalka, T. A., Russ-Eft, D. F., & Reiser, R. (with Senior-Canela, F., Grabowski, B., & Wallington, C. J.). (2013). Instructional design competencies: The standards (4th ed.). Charlotte, NC: Information Age Publishing.
Taylor, D. (2013). The ADDIE model for online course design. Retrieved from https://elearningmasters.blogspot.com/2013/08/the-addie-model-for-online-course-design.html