SSH Journal Publishes ASPE Abstracts

By Val Fulmer

Every year, after the ASPE abstract review sub-committee has pored over hundreds of poster, workshop, and research abstracts to select those most suitable for the ASPE annual conference, a selection of the best abstracts is forwarded to Rachel Yudkowsky, MD, for yet another review.

Dr. Yudkowsky, of the University of Illinois at Chicago, is a member of the editorial board of Simulation in Healthcare, the journal of the Society for Simulation in Healthcare (SSH), and a longstanding member and mentor of the ASPE community. She received the ASPE Educator of the Year Award in 2009 (http://aspeducators.org/node/91), and her widely recognized research and publications in SP/simulation methodology continue to open new paths for other SP educators to explore.

Once the selected abstracts arrive, the process of inviting a number of them for formal publication in Simulation in Healthcare begins. Dr. Yudkowsky first convenes a review committee of past chairs of the ASPE Grants and Research Committee; the committee then selects abstracts that advance the field of SP methodology by describing well-evaluated, innovative uses of SPs and rigorously conducted research studies. These abstracts represent ASPE to the larger simulation world and reflect the breadth and depth of our creative and scholarly activities.

This year, in the June 2013 edition of the journal, eight abstracts were published.

A brief summary follows the title and author list of each abstract below. The complete abstracts may be found in Vol. 8, Number 3, June 2013 of Simulation in Healthcare, the Journal of the Society for Simulation in Healthcare: http://journals.lww.com/simulationinhealthcare/Fulltext/2013/06000/Association_of_Standardized_Patient_Educators__.11.aspx

INTERPROFESSIONAL ERROR DISCLOSURE SIMULATION BENEFITS BOTH STUDENTS AND FACULTY

Carla Dyer, MD1, Gretchen Gregory, MS, RN2, Erica Ottis3, Dena Higbee, MS1, Les Hall, MD1

1UNIVERSITY OF MISSOURI SCHOOL OF MEDICINE, COLUMBIA MO, USA 2UNIVERSITY OF MISSOURI SINCLAIR SCHOOL OF NURSING, COLUMBIA MO, USA 3UNIVERSITY OF MISSOURI-KANSAS CITY SCHOOL OF PHARMACY, KANSAS CITY MO, USA

Abstract Summary:

Error disclosure is a challenging part of clinical practice. An interprofessional error disclosure program was adapted to include standardized family members. The goal was to demonstrate adoptability of the program in addition to evaluating improvements in student self-reported knowledge while promoting interprofessional collaboration.

183 health professional students (medical, PharmD, and nursing) participated in teams of three to four students. Following a lecture on disclosure techniques, each team disclosed an error to a standardized family member who reacted uniquely to each group. Interprofessional faculty facilitated each group interaction. Students and faculty members completed pre- and post-activity qualitative and quantitative evaluations. Results demonstrated significant improvement in self-reported knowledge of disclosure and comfort with the skill across all professional groups.

AN INTERPROFESSIONAL ROUNDING SIMULATION WITH STUDENTS OF MEDICINE, NURSING, AND PHARMACY

Lee Ann Miller, EdD1, David Wilks, MD2, Jay Martello, PharmD3, Charles Ponte, PharmD3, Jason Oreskovich, DO2, Daniel Summers, RN, BSN, CEN, EMT-P1, Gail Van Voorhis, MSN, RNC4, Rebecca Kromar, DNP, MBA, RN4, Jon Wietholter, PharmD3, Lena Maynor, PharmD3, and Gina Baugh, PharmD3

1WEST VIRGINIA SIMULATION TRAINING AND EDUCATION FOR PATIENT SAFETY (WV STEPS) 2WEST VIRGINIA UNIVERSITY SCHOOL OF MEDICINE 3WEST VIRGINIA UNIVERSITY SCHOOL OF PHARMACY 4WEST VIRGINIA UNIVERSITY SCHOOL OF NURSING

Abstract Summary:

Interprofessional education occurs when two or more health professions learn with, from, and about each other to improve collaboration and quality of care. The inclusion of both high-fidelity simulators and trained standardized patients allowed the students to experience a diverse set of patient medical conditions. Faculty members from pharmacy, nursing, and medicine constructed an authentic two-station rounding experience for small working groups consisting of one medical student, two nursing students, and two pharmacy students. An SP with a bee sting with cellulitis and a manikin with an ischemic stroke requiring ventilation were presented to each group. Encounters were video recorded for subsequent review and evaluation. Pre- and post-tests of knowledge, with questions drawn from all three disciplines, were administered. Results of this project suggested that students who learn via simulation in interprofessional groups are challenged, yet positive about the experience. Gains in basic knowledge scores suggested there was sharing of information in the small groups.

USE OF STANDARDIZED PATIENTS TO EVALUATE MEDICAL STUDENT CLINICAL SKILLS EVALUATION POST ENCOUNTER NOTES

Rhonda A Sparks, MD, Michelle D Wallace, BS, and Britta M Thompson, PhD, CLINICAL SKILLS EDUCATION AND TESTING CENTER, UNIVERSITY OF OKLAHOMA COLLEGE OF MEDICINE, OKLAHOMA CITY OK, USA

Abstract Summary:

Multiple-station Clinical Skills Evaluations (CSEs) typically include a student-produced standard subjective, objective, assessment, plan (SOAP) note after simulated encounters. These notes are usually scored by faculty in order to provide accurate student evaluation and feedback. The large number of notes generated can be a significant burden on clinical faculty time and usually demands a quick turnaround. To facilitate prompt evaluation of notes, standardized patients (SPs) with a clinical background were employed to assist, using a standard checklist format. 161 fourth-year medical students completed an eight-station CSE complete with a SOAP note. Prior to the CSE, a group of eight faculty members determined the content of the checklist to be used by the SPs to grade the student notes. SPs were trained and graded two cases; a clinician was available to answer questions. A clinical faculty member graded a random sample of notes for each case to validate the evaluations. Analysis of the data indicated almost perfect agreement overall between SP note graders and the clinical faculty grader, indicating that SPs with previous experience in a healthcare field can accurately evaluate medical student CSE SOAP notes using a carefully developed checklist.

ADEQUATE REPRESENTATION OF SOCIO-CULTURAL ISSUES IN OUR STANDARDIZED PATIENT SCENARIOS?

Karen Szauter, MD1, Valerie Fulmer, BS2, Dehra Glueck, MD3

1THE UNIVERSITY OF TEXAS MEDICAL BRANCH, GALVESTON TX, USA 2UNIVERSITY OF PITTSBURGH SCHOOL OF MEDICINE, PITTSBURGH PA, USA 3WASHINGTON UNIVERSITY SCHOOL OF MEDICINE, ST. LOUIS MO, USA

Abstract Summary:

Healthcare workers interact with people from diverse backgrounds and must be prepared to engage with them. Additionally, awareness of personal socio-cultural bias is critical to ensure optimal patient care. Standardized patient (SP) experiences provide important opportunities for students’ clinical skill development and an ideal environment for personal reflection following an encounter. This study was performed to evaluate the representation of demographic, social, and cultural variables in SP scenarios used in medical education.

SP training materials from three universities were reviewed. In addition to the learning objectives and presenting problem, case details such as age, gender, ethnicity, educational background, employment, sexual orientation, life details, and substance use were gathered. The study identified an imbalance in representation of socio-cultural diversity and a need for comprehensive review of SP case libraries to ensure broad inclusion of socio-cultural content. Opportunities exist to enrich SP scenarios to better represent the diverse populations served by the medical community.

COMMUNICATING THE DIAGNOSES: ARE WE ALL ON THE SAME PAGE?

Karen Szauter, MD, Lori Kusnerik, AAS, Anita Mercado, MD, Michael Ainsworth, MD, THE UNIVERSITY OF TEXAS MEDICAL BRANCH, GALVESTON TX, USA

Abstract Summary:

Synthesizing patient information and communicating diagnostic impressions are complex skills. What patients are told, what they comprehend, and what is documented in the medical record ideally should align. This study examined the alignment of diagnostic information between what is said (by students), what is comprehended (by patients), and what is written (by students) in patient notes. Two scenarios from a Clinical Skills Assessment (CSA) were studied. Four standardized patients (SPs), trained to portray and score the cases, were asked to document the content and clarity of diagnoses provided by the students. Video-recorded encounters were transcribed and reviewed by two investigators to identify the diagnoses that had been communicated to the patients. Three sources for diagnoses were compared: what students said (from transcriptions), what patients heard (SP recall/documentation), and what students wrote (patient notes). Descriptive analysis was performed. Comparison of “heard” to “written” information revealed notable content variation. In over half of the encounters, diagnoses written in student notes were not identified by SPs as having been “heard”; upon review, the majority of these were not discussed during the encounter. This work demonstrated that SPs can accurately recall diagnoses they were told, but the diagnoses documented were often different. This difference requires further investigation.

USING STANDARDIZED PATIENTS TO PREPARE “SUPER USERS” FOR AN EMR ROLLOUT

Jeanette Wong, RN, MPA, Celeste Villanueva, CRNA, MS HEALTH SCIENCES SIMULATION CENTER, SAMUEL MERRITT UNIVERSITY, OAKLAND CA, USA

Abstract Summary:

The rollout of an electronic medical record (EMR) system is a significant event in a healthcare system. Traditional EMR preparatory training includes a combination of in-classroom and online training modules. Standardized patients (SPs) were included to add realism to the training, to allow learners to experience the potential obstacles to providing patient-focused care, and to develop implementation strategies. A group of interprofessional healthcare providers volunteered to go through the training in order to assist their colleagues during later implementation. These “Super Users” encountered a standardized patient portraying a case that required a nursing assessment, admission orders, a respiratory treatment, and medication delivery. These interventions allowed an interdisciplinary team to interact with the patient and chart in the EMR system. The encounters were videotaped and debriefed with all involved. The feedback from the more than 90 participants was overwhelmingly positive, and learners clearly understood the difficulties of providing patient-focused care while working with an EMR. The feedback from this simulation experience provided the team with strategies to share with their colleagues during the implementation phase.

FLYING HIGH: INTEGRATING HYBRID STANDARDIZED PATIENT SIMULATION MODALITIES IN TRAINING PROGRAMS FOR FLIGHT MEDICS AND OTHER CRITICAL CARE TRANSPORT SPECIALISTS

Jorge D Yarzebski1, BA, EMTP, Wendy L Gammon1, MA, MEd, Adam Darnobid2, MD, William Tollefsen2, MD, Angela Talbot3, RN

1INTERPROFESSIONAL CENTER FOR EXPERIENTIAL LEARNING AND SIMULATION (ICELS), OFFICE OF CONTINUING MEDICAL EDUCATION, OFFICE OF MEDICAL EDUCATION, STANDARDIZED PATIENT PROGRAM, OFFICE OF EDUCATIONAL AFFAIRS, UNIVERSITY OF MASSACHUSETTS MEDICAL SCHOOL, WORCESTER MA, USA 2EMERGENCY MEDICINE, UMASS MEMORIAL HEALTHCARE, UMASS MEDICAL SCHOOL, WORCESTER MA, USA 3EMERGENCY MEDICINE, UMASS MEMORIAL EMS/LIFEFLIGHT, DEPARTMENT OF NURSING, UMASS MEMORIAL HEALTHCARE, WORCESTER MA, USA

Abstract Summary:

Standardized patient (SP) modalities are not as commonplace in allied health curricula as they are in graduate medical education. This program developed SP curricula to train pre-hospital emergency providers (PEPs), utilizing an Objective Structured Clinical Exam (OSCE) to prepare the paramedics for work in critical care transport (CCT). The Office of Continuing Medical Education and the SP program created a critical care transport specialist test that focused on communication and simple crisis mitigation based on established principles. After a four-month didactic and practical orientation, the OSCE was implemented to test the paramedics’ ability to manage clinical competence, difficult patient hand-offs, professionalism, and discord among team members in CCT. Performance was measured using checklists and performance scales, and learners rated the OSCE pre- and post-encounter on a Likert scale. Positive evaluations highlighted opportunities to learn and practice skills in a non-threatening and supportive environment. This program demonstrates that the OSCE model is achievable and affordable for paramedic training institutions.

COMPARING THE VALIDITY OF CLINICALLY DISCRIMINATING VS TRADITIONAL “THOROUGHNESS” CHECKLISTS

Rachel Yudkowsky, MD, MHPE1, Yoon Soo Park, PhD1, Janet Riddle, MD1, Catherine Palladino2, and Georges Bordage, MD, PhD1

1DEPARTMENT OF MEDICAL EDUCATION, UNIVERSITY OF ILLINOIS AT CHICAGO COLLEGE OF MEDICINE, CHICAGO IL, USA 2UNIVERSITY OF ILLINOIS COLLEGE OF PHARMACY, CHICAGO IL, USA

Abstract Summary:

High-quality checklists are essential to the validity of performance tests. A previous study found that physical exam checklists that clinically discriminated between competing diagnoses provided more generalizable scores than thoroughness checklists. The purpose of this study was to compare validity evidence for clinically discriminating checklists vs. traditional thoroughness checklists, hypothesizing that the evidence would favor clinically discriminating checklists. Faculty developed six SP cases for a fourth-year summative Clinical Skills Exam (CSE), with case-specific history and physical exam checklists of about 20 items. Four clinician experts independently identified a subset of items that discriminated between the competing diagnoses of each case. Half of the SPs for each case were trained to complete the traditional (“long”) checklist, and half to complete the clinically discriminating (“short”) checklist. Video review was conducted to ensure score accuracy. Validity evidence for CSE scores based on the “long” checklist was compared to the evidence for scores based on the “short” checklist. Validity evidence favored the “short” checklist in two areas: response process, reflected in higher SP checklist accuracy, and internal checklist structure. There were no significant differences overall in relevance ratings, difficulty, or cut scores of the short vs. long checklist items. These results indicate that limiting checklist items to those that affect the diagnostic decision improved accuracy and psychometric indices. “Thoroughness” items that are performed without thinking do not reflect students’ clinical reasoning ability and may contribute “noise” to the score.
