CME: ACCP Evidence-Based Educational Guidelines

Lessons for Continuing Medical Education From Simulation Research in Undergraduate and Graduate Medical Education: Effectiveness of Continuing Medical Education: American College of Chest Physicians Evidence-Based Educational Guidelines

William C. McGaghie, PhD; Viva J. Siddall, MA; Paul E. Mazmanian, PhD; Janet Myers, MD, FCCP

*From the Feinberg School of Medicine (Dr. McGaghie), Northwestern University, Chicago, IL; American College of Chest Physicians (Ms. Siddall), Northbrook, IL; Virginia Commonwealth University (Dr. Mazmanian), Richmond, VA; and Uniformed Services University of the Health Sciences (Dr. Myers), Bethesda, MD.

Correspondence to: William C. McGaghie, PhD, Office of Medical Education and Faculty Development, Northwestern University, Feinberg School of Medicine, 1–003 Ward Building, 303 East Chicago Ave, Chicago, IL 60611-3008; e-mail: wcmc@northwestern.edu


Reproduction of this article is prohibited without written permission from the American College of Chest Physicians (www.chestjournal.org/misc/reprints.shtml).


Chest. 2009;135(3_suppl):62S-68S. doi:10.1378/chest.08-2521

Background:  Simulation technology is widely used in undergraduate and graduate medical education as well as for personnel training and evaluation in other healthcare professions. Simulation provides safe and effective opportunities for learners at all levels to practice and acquire clinical skills needed for patient care. A growing body of research evidence documents the utility of simulation technology for educating healthcare professionals. However, simulation has not been widely endorsed or used for continuing medical education (CME).

Methods:  This article reviews and evaluates evidence from studies on simulation technology in undergraduate and graduate medical education and addresses its implications for CME.

Results:  The Agency for Healthcare Research and Quality Evidence Report suggests that simulation training is effective, especially for psychomotor and communication skills, but that the strength of the evidence is low. In another review, the Best Evidence Medical Education collaboration supported the use of simulation technology, focusing on high-fidelity medical simulations under specific conditions. Other studies enumerate best practices that include mastery learning, deliberate practice, and recognition and attention to cultural barriers within the medical profession that present obstacles to wider use of this technology.

Conclusions:  Simulation technology is a powerful tool for the education of physicians and other healthcare professionals at all levels. Its educational effectiveness depends on informed use with trainees, including providing feedback, engaging learners in deliberate practice, and integrating simulation into an overall curriculum, as well as on the instruction and competence of faculty in its use. Medical simulation complements, but does not replace, educational activities based on real patient-care experiences.

This article has four sections. The first defines medical simulation and summarizes presumptive findings about simulation-based continuing medical education (CME) from the Agency for Healthcare Research and Quality (AHRQ) Evidence Report.1 The report aims to synthesize the results of nine literature reviews about “the effectiveness of simulation methods in medical education outside of CME.”1 It also serves as a foundation for other sections of this article. The second section amplifies the findings from best evidence medical education (BEME) in one of the nine reviews.2 The BEME review warrants special attention because it receives superficial coverage in the AHRQ Evidence Report,1 addresses a different question about simulation in medical education beyond its effectiveness, is not confined to a single medical specialty or medical simulator, and offers practical advice about simulation-based medical education program planning and operation. The third section distills and presents lessons learned about best educational practices drawn from both the AHRQ report and the BEME review as well as from two graduate medical education (GME) education and research programs and other published sources. The fourth section presents implications for CME grounded in the preceding narrative, the changing focus and professional character of CME, and observations about education for other learned professions.

Medical education using some form of simulation generally has been aimed at the junior trainee, both in undergraduate medical education (UME) and in GME. The benefits of simulation derive from its standardization and reproducibility, in contrast with the traditional apprenticeship approach to teaching, in which medical students and residents learn through practice with real patients in the clinic or hospital. With rising numbers of hospitalized patients, shorter hospital stays, limits on trainee work hours, and an emphasis on patient safety, simulation has received greater attention at the UME and GME levels. However, simulation seldom is discussed in the context of CME.

This article reviews the use of simulation education in baseline assessment of knowledge and skills, education grounded in learning objectives, intended outcomes expressed in metrics, deliberate practice with feedback, rigorous outcome evaluation, and professional accountability. These constructs are addressed thoroughly in a call for CME reform in the United States.3 This article urges the physician-learner to participate in CME activities that include deliberate practice and that allow him or her to work toward mastery of CME objectives. Physician-teachers should design CME activities that use teaching techniques supporting mastery learning and deliberate practice, embrace outcome measurement, and address cultural barriers to incorporating these educational approaches. Simulation is a teaching technique the physician-teacher can use to achieve these goals.

Medical simulation is defined as “a person, device, or set of conditions which attempts to present [education and] evaluation problems authentically. The student or trainee is required to respond to the problems as he or she would under natural circumstances. Frequently the trainee receives performance feedback as if he or she were in the real situation.”4 “Simulation procedures for evaluation and teaching have the following common characteristics:

  • Trainees see cues and consequences very much like those in the real environment.

  • Trainees can be placed in complex situations.

  • Trainees act as they would in the real environment.

  • The fidelity (exactness of duplication) of a simulation is never completely isomorphic with the real thing. The reasons are obvious: cost, [limits of] engineering technology, avoidance of danger, ethics, psychometric requirements, time constraints.

  • Simulations can take many forms. For example, they can be static, as in an anatomical model [for task training]. Simulations can be automated, using advanced computer technology. Some are individual, prompting solitary performance while others are interactive, involving groups of people. Simulations can be playful or deadly serious. In personnel evaluation settings they can be used for high-stakes, low-stakes, or no-stakes decisions.”4

Medical simulations are located on a continuum of fidelity, ranging from detached, multiple-choice examination questions;5 to more engaging task trainers (eg, arms for phlebotomy practice); to full-body, computer-driven mannequins with sophisticated physiologic features that respond to pharmacologic and mechanical interventions.6 Simulations also include standardized patients: live persons trained and calibrated to portray patients with a variety of presenting complaints and pathologies. Decades of experience and research demonstrate that standardized patients are highly effective for medical education and evaluation.7 Standardized examinees (students) also have been used as a way to calibrate and improve clinical skills examinations.8,9 Medical educators have recently combined these modalities, integrating standardized patients, inanimate models, and medical equipment to evaluate trainees' technical, communication, and other professional skills simultaneously.10

The AHRQ Evidence Report included a review of nine systematic reviews published between 1990 and 2006 that sought to evaluate the effectiveness of simulation methods in medical education outside of CME. The investigators abstracted data about the study characteristics, educational objectives, learning outcomes, summary of results, conclusions, and quality of each review, and they graded the evidence according to each educational objective and outcome related to participant knowledge, attitudes, skills, practice behaviors, and clinical outcomes. The quality of each review was established using criteria derived from the Quality of Reporting of Meta-analyses statement,11 which is intended for use only as a guide for preparing reports on quantitative metaanalyses that include randomized controlled trials.

The AHRQ report has several limitations, the most important of which is its review methodology, which failed to find two eligible reports.12,13 This and other issues make the report's findings about simulation difficult to interpret unequivocally. Nevertheless, the AHRQ report1 argues that the overall “direction of evidence points to the effectiveness of simulation training, especially for psychomotor skills (eg, procedures or physical examination techniques) and communication skills,” despite the low strength of the evidence “due to the small number of appropriate studies and scarcity of [reliable] quantitative data.” We add to these deficits the narrow focus of eight of the nine included reviews (ie, single medical specialty, single simulation method) and the weakness of most primary studies covered in the reviews. These limitations are attributed, in part, to the lack of consensus about standardized methods to quantify clinical competence, a persistent problem in medical education research. The AHRQ authors also speculate that other limitations may include difficulty of establishing “clinical realism [high-fidelity] for participants” and “other features that may be responsible for inadequate quality of evidence in this field.”1

One of the nine literature reviews cited in the AHRQ report but not explained in depth is a systematic review done under the auspices of the BEME collaboration.14 The collaboration “involves an international group of individuals, universities, and organizations (eg, AMEE [Association for Medical Education in Europe], AAMC [Association of American Medical Colleges], ABIM [American Board of Internal Medicine]) committed to moving the medical profession from opinion-based education to evidence-based education. The goal is to provide medical teachers and administrators with the latest findings from scientifically grounded educational research.”2

The BEME systematic review2 is broader in scope than a narrow comparison of the effectiveness of simulation with other educational techniques: it addressed best educational practices, reviewing 670 journal articles published between 1969 and 2003. Despite the original intent to conduct a quantitative metaanalysis, the studies were so heterogeneous and methodologically weak that the investigators resorted to a qualitative, narrative synthesis. The primary outcome of the BEME review is an inventory of 10 features and uses of high-fidelity simulations that lead to effective learning. These features are listed in order of reported frequency (percent) among the final BEME pool of 109 articles, and the report concluded that “the weight of the best available evidence suggests that high-fidelity medical simulations facilitate learning under the right conditions.”2 The 10 conditions2 are as follows:

  1. Feedback is provided during learning experiences (47%).

  2. Learners engage in repetitive practice (39%).

  3. Simulation is integrated into an overall curriculum (25%).

  4. Learners practice tasks with increasing levels of difficulty (14%).

  5. Simulation is adaptable to multiple learning strategies (10%).

  6. Clinical variation is built into simulation experiences (10%).

  7. Simulation events occur in a controlled environment (9%).

  8. Individualized learning is an option (9%).

  9. Outcomes or benchmarks are clearly defined or measured (6%).

  10. The simulation is a valid representation of clinical practice (3%).

These findings, chiefly from UME and GME, are a guide for defining a research agenda on simulations as educational technologies. In contrast with the eight other literature reviews covered in the AHRQ Evidence Report, the BEME review spans a wide variety of medical specialties and simulation technologies across a long time frame. In addition, its emphasis on the features and uses of medical simulation that lead to effective learning (eg, feedback, repetitive practice, curriculum integration), not just comparative effectiveness, sets a standard for understanding the benefits of simulation for medical education and personnel evaluation. The BEME review2 on high-fidelity medical simulations also included a call for increased rigor in original simulation-based medical education research and improved journal-reporting conventions. In particular, the authors suggested that journal editors insist that all reports of primary research include basic descriptive statistics (eg, means, SDs, effect sizes, number of cases per group) that will permit quantitative synthesis in subsequent metaanalyses.

A subset of 31 journal articles reporting 32 research studies within the 109 articles was found to contain enough empirical data to permit a quantitative metaanalysis. The studies were framed to address the question, “Is there an association between hours of simulation-based practice and standardized learning outcomes?”15 Measured outcomes from these studies were cast on a standardized metric termed average weighted effect size. Hours of simulation-based practice in each study were grouped in the following five categories: none reported, 0 to 1.0, 1.1 to 3.0, 3.1 to 8.0, and 8.1+. Data analysis revealed a highly significant “dose-response” relationship between practice and achievement, with more practice producing higher outcome gains. These results are presented in a subsequent report15 that demonstrated a direct relationship between hours of simulator practice and standardized learning outcomes.
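To make the analytic approach concrete, the following sketch (in Python, with invented numbers; the actual data are in the cited report15) shows how an average weighted effect size can be computed per practice-hours category, here weighting each study's effect size by its sample size. A dose-response relationship would appear as weighted effect sizes that rise across the categories.

```python
from collections import defaultdict

# Illustrative only: hypothetical study data, not the BEME subset's values.
# Each study contributes (practice-hours category, effect size, sample size);
# the categories mirror the five used in the review.
studies = [
    ("none reported", 0.20, 24),
    ("0-1.0",         0.35, 30),
    ("1.1-3.0",       0.48, 18),
    ("3.1-8.0",       0.61, 42),
    ("8.1+",          0.83, 26),
    ("8.1+",          0.74, 33),
]

groups = defaultdict(list)
for category, effect, n in studies:
    groups[category].append((effect, n))

# Average weighted effect size per category: each study's effect size is
# weighted by its sample size, so larger studies count more.
for category, pairs in groups.items():
    total_n = sum(n for _, n in pairs)
    weighted = sum(effect * n for effect, n in pairs) / total_n
    print(f"{category}: average weighted effect size = {weighted:.2f}")
```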

The scholarship of the AHRQ report and BEME review is amplified by at least two other reviews12,13 about simulation-based medical education that also can inform CME practices. One way to highlight advances in simulation-based UME and GME is to focus on exemplary education and research programs and identify their special features. Work completed by two medical simulation education and research programs16-27 is illustrative: the programs are thematic, sustained, and cumulative and are of special interest for chest physicians. Additional studies inform academic standard setting28 and mastery learning of clinical skills in advanced cardiac life support29 and thoracentesis.30

A more recent report31 discussed the “scope of simulation-based healthcare education,” pointing out that the best simulation-based medical education is a multiplicative product of simulation technology (eg, devices, standardized patients), teachers prepared to use the technology to maximum educational advantage, and curriculum integration. It argued that the major flaws in current simulation-based medical education stem from a lack of prepared teachers and curriculum isolation, not from technological problems or deficits.

The design of educational activities useful to practicing physicians assumes that CME program directors are knowledgeable about “what works” from scholarly reviews2,12,13,15 and from individual studies having strong research designs, such as randomized trials,24 mastery learning research,29,30 and cohort studies.32 Program directors also should be informed about the latest scholarship on technology in medical education.33 The key lesson is that medical simulation and other educational technologies are best used to complement, not replace, education grounded in patient care. CME in any form should be based on scientific best evidence rather than on opinion or habit.14

We endorse the position that CME best practices reside in educational programs that have the following three features: mastery learning, deliberate practice, and recognition and attention to cultural barriers within the medical profession that frustrate better CME programs. In particular, mastery learning and deliberate practice are ideally suited for simulation-based medical education. They also conform to accepted principles of adult education independent of teaching modalities.34

Mastery Learning

Essential elements of the mastery learning model have been described in earlier publications.35-37 In brief, mastery learning has the following seven complementary features (a minimal code sketch of the model as a procedure follows the list):

  1. Baseline, or diagnostic testing;

  2. Clear learning objectives, sequenced as units in increasing difficulty;

  3. Engagement in educational activities (eg, skills practice, data interpretation, reading) focused on reaching the objectives;

  4. A set minimum passing standard (eg, test score) for each educational unit;

  5. Formative testing to gauge unit completion at a preset minimum passing standard for mastery;

  6. Advancement to the next educational unit given measured achievement at or above the mastery standard; and

  7. Continued practice or study on an educational unit until the mastery standard is reached.
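Read as a procedure, the model is a gated loop: the learner cycles through practice and formative testing within a unit and advances only at or above the unit's minimum passing standard. The following minimal sketch (in Python; the unit names, standards, baseline scores, and score-gain model are invented for illustration, not drawn from the cited studies) makes the loop concrete:

```python
import random

# A minimal sketch of the mastery learning loop; all numbers are hypothetical.
units = [
    ("ACLS rhythm recognition", 80.0),   # (unit, minimum passing standard)
    ("thoracentesis checklist", 90.0),   # units sequenced by difficulty
]

for name, mps in units:
    score = random.uniform(40, 70)        # baseline (diagnostic) test
    cycles = 0
    while score < mps:                    # no advancement below the standard
        cycles += 1                       # continued practice or study...
        score += random.uniform(3, 10)    # ...modeled as a small score gain
    print(f"{name}: mastered after {cycles} practice cycles (score {score:.0f})")
```

The sketch makes the same point as the paragraph that follows: the outcome (mastery) is held constant, while the number of practice cycles, and therefore the time needed, varies by learner.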

The goal in mastery learning is to ensure that all learners accomplish all educational objectives with little or no variation in outcome. The amount of time needed to reach mastery standards for a unit's educational objectives varies among the learners. To illustrate, in mastery learning studies on acquiring advanced cardiac life support29 and thoracentesis30 skills, approximately 20% of the internal medicine resident trainees needed more time beyond the minimum allocation to reach mastery standards. The extra time needed was usually < 1 h.

The mastery learning model also includes other options in simulation-based education. For example, mastery learning can address learning objectives beyond skill acquisition to include knowledge gains; affective qualities, such as self-efficacy; or features of medical professionalism. Mastery learning requires a standardized curriculum for all learners, with uniform outcomes assessed by rigorous measurements and standards.28,38,39

Deliberate Practice

Deliberate practice is an educational variable associated with delivery of strong and consistent educational treatments as part of the mastery learning model.40-43 Although demanding of learners, deliberate practice is grounded in information processing and behavioral theories of skill acquisition and maintenance.40-42 It has at least nine requirements that can inform CME (a sketch of the resulting practice cycle follows the list):

  1. Highly motivated learners with good concentration;

  2. Engagement with a well-defined learning objective or task;

  3. Appropriate level of difficulty;

  4. Focused, repetitive practice;

  5. Rigorous, precise measurements;

  6. Informative feedback from educational sources (eg, simulators or teachers);

  7. Monitoring, correction of errors, and more deliberate practice;

  8. Evaluation to reach a mastery standard; and

  9. Advancement to another task or unit.
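Taken together, these requirements describe a measurement-and-feedback cycle on a single well-defined task. A sketch of that cycle (in Python; the checklist length, mastery standard, and error-correction model are invented for illustration) follows:

```python
import random

# Hypothetical within-task deliberate practice cycle: each repetition is
# measured, errors are fed back and corrected, and practice continues until
# a preset mastery standard is met. All numbers are invented.
CHECKLIST_ITEMS = 20      # rigorous, precise measurement: items scored per repetition
MASTERY_STANDARD = 19     # eg, at least 19 of 20 checklist items performed correctly

errors = 6                # errors observed on the baseline attempt
repetitions = 0
while CHECKLIST_ITEMS - errors < MASTERY_STANDARD:
    repetitions += 1
    # Informative feedback (from a simulator or teacher) targets the observed
    # errors; correction is modeled as removing one or two errors per repetition.
    errors = max(0, errors - random.randint(1, 2))

print(f"Mastery standard reached after {repetitions} monitored repetitions;")
print("only now does the learner advance to another task or unit.")
```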

The goal of deliberate practice in a CME mastery-learning context is to require constant improvement of skill and knowledge rather than maintenance of a minimal level. Ericsson40-42 cites data that underscore a “4/10 rule” about the development of expertise in any field: it takes 4 h of deliberate practice every day for 10 years (roughly 4 × 365 × 10 ≈ 14,600 h in total) to become a world-class performer like an Olympic athlete, cutting-edge scientist, chess master, patient-care provider, or writer. Even Michael Jordan took 500 free throws every day throughout his professional basketball career to maintain and improve his professional edge.44

Deliberate practice using medical simulation has been shown to improve medical performance in several medical and surgical specialties.45 In a small study of pulmonary fellows trained with virtual bronchoscopy,46 trainees demonstrated equal or better facility with an airway training model compared with skilled colleagues with several years of experience. This cascaded approach to education shapes “deliberate practice pedagogy.”47 Joined with a mastery-learning program structure, the combination creates the potential for a more standard way to assess knowledge and competence.48

Cultural Barriers

It is unlikely that educational innovations involving simulation-based mastery learning and deliberate practice will receive a warm reception in the medical community. Innovations will be used effectively only after barriers in medical culture are acknowledged, addressed, and breached. Adoption of innovation in medical education at all levels has been slowed by inertia, habit, absence of audit and accountability, and other cited barriers. Medical education culture needs to embrace innovation and its diffusion as energizing opportunities that will boost professional competence, morale, and patient care. Specifically, barriers in contemporary medical culture that inhibit advancement of simulation-based CME are as follows:

  1. A group of scholars stated in 1978 that the most powerful force in medical education is inertia.37 This statement probably is still true, and the CME community must change habits and embrace evidence-based, outcomes-focused educational models to replace approaches featuring passive lectures, seat time, and continuing education units.

  2. Clinical medical education has a traditional patient-centered focus, grounded in Osler's writing49 about medical education in the 19th century. In the medical simulation environment, the focus shifts entirely to the learner's education and skill acquisition rather than to the care of the patient.

  3. Evaluation apprehension is a pervasive fear among physicians about being identified as lacking knowledge, judgment, or clinical skill, especially in a public setting, and it is endemic in medical culture.50 CME practices that include rigorous diagnostic assessments of physicians' baseline skill and knowledge probably will meet stiff resistance unless privacy of the learners is protected. Such assessments usually reveal professional deficits, even among advanced clinical learners.24,25,29,51

  4. There is a widespread belief in medical education that seniority and clinical experience are proxies for clinical competence. This belief contrasts with research findings that clinical experience alone is not associated with better performance at basic skills, like cardiac auscultation,52 responses to acute intraoperative events in anesthesia,19 or the wider competency of delivering quality health care.53

  5. Medical education rarely has had rigorous assessment of its short- or long-term impact on patient care or clinical outcomes, such as those obtained through audit, accountability, and performance improvement.54

  6. The medical profession has been insular, rarely looking to other professions for educational models, ideas, or approaches to education and personnel assessment. For example, medicine has much to learn about team training from aviation55 and from nuclear power plant operation.56,57 Professional education for clergy roles also offers ideas to medicine about career formation and shaping a public-service identity.58

  7. Faculty development, especially about the effective use of simulation technology to promote learner achievement, must become a priority training goal.33 Simple or sophisticated simulation technology will be ineffective or misused unless faculty members, including physicians and other health professionals, are prepared as simulation educators.

Simulation technology as an educational tool could lead to significant changes in medical education, including a new emphasis on skill and knowledge acquisition and maintenance, integration of the technique into a comprehensive clinical curriculum that includes certification and recertification, adoption of mastery learning and deliberate practice, and increased competence and outcome measurement. Research should focus on valid and reliable tools for more systematic outcome measurements, with the ultimate goal of improving the quality of patient care. Policies that inform physician performance and govern the privilege to practice not only need to endorse the effective educational use of simulation technology, but also tackle sources of cultural resistance to its adoption.

Simulation will never replace the situational context and complex interactions learned through caring for real patients. Expert mentors will always be needed not only to verify trainee performance in real situations, but also to judge the simulators' in vivo fidelity. Nevertheless, cultural barriers should not hinder the adoption and informed use of simulation technology as a powerful and effective educational tool to maximize physician and other health professional training and, ultimately, to improve patient care.

Abbreviations: AHRQ = Agency for Healthcare Research and Quality; BEME = best evidence medical education; CME = continuing medical education; GME = graduate medical education; UME = undergraduate medical education

Dr. McGaghie has twice served as a paid speaker at Simulation Users Network (SUN) events sponsored by Laerdal Medical Corp.

Ms. Siddall has no conflicts of interest.

Dr. Mazmanian has no conflicts of interest.

Dr. Myers has no conflicts of interest.

References

1. Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evidence Report/Technology Assessment No. 149. Rockville, MD: Agency for Healthcare Research and Quality; 2007.
2. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27:10-28.
3. Spivey BE. Continuing medical education in the United States: why it needs reform and how we propose to accomplish it. J Contin Educ Health Prof. 2005;25:134-143.
4. McGaghie WC. Simulation in professional competence assessment: basic considerations. In: Tekian A, McGuire CH, McGaghie WC, eds. Innovative simulations for assessing professional competence. Chicago, IL: Department of Medical Education, University of Illinois at Chicago; 1999:7-22.
5. Boulet JR, Swanson DB. Psychometric challenges of using simulations for high-stakes assessment. In: Dunn WF, ed. Simulators in critical care education and beyond. Des Plaines, IL: Society of Critical Care Medicine; 2004:119-130.
6. Issenberg SB, McGaghie WC. Assessing knowledge and skills in the health professions: a continuum of simulation fidelity. In: Tekian A, McGuire CH, McGaghie WC, eds. Innovative simulations for assessing professional competence. Chicago, IL: Department of Medical Education, University of Illinois at Chicago; 1999:125-146.
7. Petrusa ER. Clinical performance assessments. In: Norman GR, van der Vleuten CPM, Newble DI, eds. International handbook of research in medical education: part two. Dordrecht, the Netherlands: Kluwer Academic Publishers; 2002:673-709.
8. Pangaro LN, Worth-Dickstein H, MacMillan MK, et al. Performance of “standardized examinees” in a standardized-patient examination of clinical skills. Acad Med. 1997;72:1008-1011.
9. Worth-Dickstein H, Pangaro LN, MacMillan MK, et al. Use of “standardized examinees” to screen for standardized-patient scoring bias in a clinical skills examination. Teach Learn Med. 2005;17:9-13.
10. Kneebone R, Nestel D, Yadollahi F, et al. Assessing procedural skills in context: exploring the feasibility of an Integrated Procedural Performance Instrument (IPPI). Med Educ. 2006;40:1105-1114.
11. Moher D, Cook DJ, Eastwood S, et al. Improving the quality of reports of meta-analyses of randomized controlled trials: the QUOROM statement. Lancet. 1999;354:1896-1900.
12. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861-866.
13. Levine AI. Simulation for continuing health professional education. In: Loyd GE, Lake CL, Greenberg RB, eds. Practical health care simulations. Philadelphia, PA: Elsevier; 2004:527-541.
14. Harden RM, Grant J, Buckley EG, et al. BEME guide No. 1: best evidence medical education. Med Teach. 1999;21:553-562.
15. McGaghie WC, Issenberg SB, Petrusa ER, et al. Effect of practice on standardized learning outcomes in simulation-based medical education. Med Educ. 2006;40:792-797.
16. Boulet JR, Murray D, Kras J, et al. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003;99:1270-1280.
17. Murray DJ, Boulet JR, Kras JF, et al. Acute care skills in anesthesia practice. Anesthesiology. 2004;101:1084-1095.
18. Murray DJ, Boulet JR, Kras JF, et al. A simulation-based acute skills performance assessment for anesthesia training. Anesth Analg. 2005;101:1127-1134.
19. Murray DJ, Boulet JR, Avidan M, et al. Performance of residents and anesthesiologists in a simulation-based skill assessment. Anesthesiology. 2007;107:705-713.
20. McGaghie WC, Pugh CM, Wayne DB. Fundamentals of educational research using clinical simulation. In: Kyle R, Murray WB, eds. Clinical simulation: operations, engineering, and management. Burlington, MA: Academic Press; 2008:517-526.
21. Stufflebeam DL. Guidelines for developing evaluation checklists: the checklists development checklist (CDC). Available at: http://www.wmich.edu/evalctr/checklists. Accessed January 12, 2009.
22. Cummins RO. ACLS provider manual. Dallas, TX: American Heart Association; 2001.
23. American Heart Association. 2005 American Heart Association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Circulation. 2005;112:IV1-IV203.
24. Wayne DB, Butter J, Siddall VJ, et al. Simulation-based training of internal medicine residents in advanced cardiac life support protocols: a randomized trial. Teach Learn Med. 2005;17:210-216.
25. Wayne DB, Butter J, Siddall VJ, et al. Graduating internal medicine residents' self-assessment and performance of advanced cardiac life support skills. Med Teach. 2006;28:365-369.
26. Wayne DB, Siddall VJ, Butter J, et al. Longitudinal study of internal medicine residents' retention of advanced cardiac life support skills. Acad Med. 2006;81(suppl):S9-S12.
27. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest. 2008;133:56-61.
28. Wayne DB, Fudala MJ, Butter J, et al. Comparison of two standard setting methods for advanced cardiac life support skills training. Acad Med. 2005;80(suppl):S63-S66.
29. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251-256.
30. Wayne DB, Barsuk JH, O'Leary K, et al. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48-54.
31. Issenberg SB. The scope of simulation-based healthcare education. Sim Healthcare. 2006;1:203-208.
32. Issenberg SB, McGaghie WC, Gordon DL, et al. Effectiveness of a cardiology review course for internal medicine residents using simulation technology and deliberate practice. Teach Learn Med. 2002;14:223-228.
33. Association of American Medical Colleges. AAMC Colloquium on Educational Technology: effective use of educational technology in medical education; summary report. Washington, DC: Association of American Medical Colleges; 2007.
34. Knox AB. Helping adults learn. San Francisco, CA: Jossey-Bass; 1986.
35. Block JH. Mastery learning: theory and practice. New York, NY: Holt, Rinehart and Winston; 1971.
36. Bloom BS. Time and learning. Am Psychol. 1974;29:682-688.
37. McGaghie WC, Miller GE, Sajid A, Telder TV. Competency-based curriculum development in medical education. Public Health Paper No. 68. Geneva, Switzerland: World Health Organization; 1978.
38. Adler M, Trainor JL, Siddall VJ, et al. Development and evaluation of high-fidelity case scenarios for pediatric resident education. Ambul Pediatr. 2007;7:182-186.
39. Issenberg SB, McGaghie WC, Brown DD, et al. Development of multimedia computer-based measures of clinical skills in bedside cardiology. In: Melnick DE, ed. The Eighth International Ottawa Conference on Medical Education and Assessment Proceedings: evolving assessment; protecting the human dimension. Philadelphia, PA: National Board of Medical Examiners; 2000:821-829.
40. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(suppl):S70-S81.
41. Ericsson KA, Charness N. Expert performance: its structure and acquisition. Am Psychol. 1994;49:725-747.
42. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363-406.
43. Cordray DS, Pion GM. Treatment strength and integrity: models and methods. In: Bootzin RR, McKnight PE, eds. Strengthening research methodology: psychological measurement and evaluation. Washington, DC: American Psychological Association; 2006:103-124.
44. Jordan M. For the love of the game. New York, NY: Crown Publishers; 1998.
45. Ericsson KA. An expert-performance perspective of research on medical expertise: the study of clinical performance. Med Educ. 2007;41:1124-1130.
46. Colt HG, Crawford SW, Galbraith O. Virtual reality bronchoscopy simulation: a revolution in procedural training. Chest. 2001;120:1333-1339.
47. Farmer LC, Williams GR. The rigorous application of deliberate practice methods in skills courses. Paper presented at: the UCLA/IALS Sixth International Clinical Conference on Enriching Clinical Education; October 27-30, 2005; Los Angeles, CA.
48. Ericsson KA. The road to excellence: the acquisition of expert performance in the arts and sciences, sports and games. Mahwah, NJ: Lawrence Erlbaum Associates; 1996.
49. Osler W. The hospital as a college. In: Osler W. Aequanimitas. 2nd ed. Philadelphia, PA: P. Blakiston's Son & Co; 1906:329-342.
50. Good M-JD. American medicine: the quest for competence. Berkeley, CA: University of California Press; 1995.
51. Eisen LA, Berger JS, Hegde A, et al. Competency in chest radiography: a comparison of medical students, residents, and fellows. J Gen Intern Med. 2006;21:460-465.
52. Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees: a comparison of diagnostic proficiency. JAMA. 1997;278:717-722.
53. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260-273.
54. Reed D, Price EG, Windish DM, et al. Challenges in systematic reviews of educational intervention studies. Ann Intern Med. 2005;142:1080-1089.
55. Weiner EL, Kanki BG, Helmreich RL. Cockpit resource management. San Diego, CA: Academic Press; 1993.
56. Mathieu JE, Day DV. Assessing processes within and between organizational teams: a nuclear power plant example. In: Brannick BT, Salas E, Prince C, eds. Team performance assessment and measurement. Mahwah, NJ: Lawrence Erlbaum Associates; 1997:173-195.
57. Touquam JL, Macaulay JL, Westra CD, et al. Assessment of nuclear power plant crew performance variability. In: Brannick BT, Salas E, Prince C, eds. Team performance assessment and measurement. Mahwah, NJ: Lawrence Erlbaum Associates; 1997:253-287.
58. Foster CR, Dahill LE, Golemon LA, et al. Educating clergy: teaching practices and pastoral imagination. San Francisco, CA: Jossey-Bass; 2006.
 

 