
Continuing Medical Education Effect on Physician Knowledge Application and Psychomotor Skills: Effectiveness of Continuing Medical Education: American College of Chest Physicians Evidence-Based Educational Guidelines

Kevin M. O'Neil, MD, FCCP; Doreen J. Addrizzo-Harris, MD, FCCP
Author and Funding Information

From Wilmington Health Associates (Dr. O'Neil), Wilmington, NC; and New York University School of Medicine (Dr. Addrizzo-Harris), New York, NY.

Correspondence to: Kevin M. O'Neil, MD, FCCP, Wilmington Health Associates, 1227 Medical Center Dr, Wilmington, NC 28401; e-mail: koneil@wilmingtonhealth.com


Reproduction of this article is prohibited without written permission from the American College of Chest Physicians (www.chestjournal.org/misc/reprints.shtml).


Chest. 2009;135(3_suppl):37S-41S. doi:10.1378/chest.08-2516

Background:  Recommendations for optimizing continuing medical education (CME) effectiveness in improving physician application of knowledge and psychomotor skills are needed to guide the development of processes that effect physician change and improve patient care.

Methods:  The guideline panel reviewed evidence tables and a comprehensive review of the effectiveness of CME developed by The Johns Hopkins Evidence-based Practice Center for the Agency for Healthcare Research and Quality (AHRQ Evidence Report). The panel considered studies relevant to the effect of CME on physician knowledge application and psychomotor skill development. From the 136 studies identified in the systematic review, 15 articles, 12 addressing physician application of knowledge and 3 addressing psychomotor skills, were identified and reviewed. Recommendations for optimizing CME were developed using the American College of Chest Physicians guideline grading system.

Results:  The preponderance of evidence demonstrated improvement in physician application of knowledge with CME. The quality of evidence did not allow specific recommendations regarding optimal media or educational techniques or the effectiveness of CME in improving psychomotor skills.

Conclusions:  CME is effective in improving physician application of knowledge. Multiple exposures and longer durations of CME are recommended to optimize educational outcomes.

  1. General: We recommend that CME activities be used to improve physician application of knowledge (Grade 1C).

  2. Frequency of exposure: We suggest that multiple CME exposures be used in place of a single exposure to maximize retention and improve physician application of knowledge (Grade 2C).

Despite strong evidence that didactic lectures and unsolicited, mailed, printed material do not produce physician behavioral change or improve patient care,1-4 they remain popular methods for providing continuing medical education (CME).1,5 As a result, there is strong pressure on several fronts to reform CME into a process that produces physician change and improves patient care.1,6 There is a clear need for guidance regarding best practices to inform both providers and consumers of CME so that these programs can change in ways that improve patient outcomes. As an example, psychomotor skills training is critically important in modern medicine. More than two-thirds of a medical career is spent in a post-graduate medical education (GME) environment,1 but recent developments in laparoscopic and minimally invasive surgery (both with significant learning curves) call for training large numbers of physicians who are no longer in GME programs.7-9 Although limited research on psychomotor training in GME exists and is evolving with the increased use of simulation and virtual reality training,10-14 it is not clear that these data apply to physicians outside of GME. There is scant information on the effectiveness of CME for teaching psychomotor skills8 or guidance for introducing new techniques and procedures to the post-GME physician in a way that improves patient outcomes. This article reviews the available literature and provides recommendations for the use of CME in improving physician application of knowledge and teaching psychomotor skills.

The guideline panel reviewed evidence tables and a comprehensive review of the effectiveness of CME developed by The Johns Hopkins Evidence-based Practice Center at the request of the Agency for Healthcare Research and Quality (AHRQ) [AHRQ Evidence Report].15 The processes for developing the comprehensive review are listed in the Methods article.15a For this section, the guideline panel considered studies relevant to the effect of CME on physician knowledge application and psychomotor skill development. In the AHRQ Evidence Report,15 the term skills was used to refer to both psychomotor skills, such as joint injection or the performance of a physical examination, and cognitive skills, such as assessment of depression, the application of patient management, and the use of critical appraisal skills for assessing the medical literature. The panel chose to substitute the term knowledge application in place of cognitive skills to better differentiate this from knowledge acquisition, which is addressed in the Physician Knowledge article.17a In addition, application of communication techniques was considered a cognitive skill in the evidence report, and studies addressing this are considered with knowledge application in this article. From the 136 studies reviewed in the AHRQ Evidence Report, those addressing the impact of CME on knowledge application or psychomotor skills were identified and retrieved for review. The recommendations listed herein were developed using the American College of Chest Physicians guideline grading system, which is outlined in detail in the Methods article.15a

Overall, the data for making decisions related to CME and physician knowledge application are scant. The structured review15 identified only 15 articles16-30 that addressed either knowledge application or psychomotor skills training. Many of these articles had significant methodological flaws, and the overall quality of evidence is low.

Knowledge application (cognitive and communication skills) was addressed in 12 articles16-22,24,26,27,29,30 and included the ability to calculate correct medication doses for pediatric patients; communication skills; diagnostic accuracy for psychiatric patients; critical appraisal skills for assessing medical literature and applying evidence in medical care; diagnostic accuracy in the evaluation of skin cancer, headache, and COPD; application of cancer-control skills; and correct utilization of oral antibiotics. These articles studied only primary care providers (family practice, pediatrics, internal medicine); no data were available for specialists or surgeons. Eleven of the 12 studies16-22,24,26,29,30 showed improvement in knowledge application over the short term. Only six studies17-21,26,30 provided assessments beyond 30 days, and all showed that improvement was maintained. The only study that did not show an improvement in knowledge application was a controlled study27 of academic internists in which the intervention was electronic delivery of a weekly structured summary of research selected from core medical journals. After a 3-month intervention, there was no increase in self-reported use of evidence in patient care compared to a control group; however, the internists reported incorporating evidence into 60% of patient care at baseline. The experimental group increased its reading efficiency compared with the controls and indicated that 20% of the selected articles would have been missed without the summaries.

Recommendation

  1. We recommend that CME activities be used to improve physician application of knowledge (Grade 1C).

A variety of instructional media were employed in the studies cited previously. Four studies16,18,24,30 used videos or CD-ROMs; four studies24,26,29,30 used print media; four used computer-based information (e-mail,27 Web-based programs,20,21 and Listservs26); two studies17,22 used audio methods; and eight studies16-18,22,24,26,29,30 used live media. Eight16-18,22,24,26,29,30 of the 12 studies16-22,24,26,27,29,30 used multiple media. In all three studies19,29,30 that compared two or more experimental groups, the same media were used in each intervention. Given the limited data and the lack of direct comparisons among media, no specific recommendations can be made for optimal media use in CME to effect change in physician knowledge application.

Multiple educational techniques also were employed in the 12 studies that evaluated physician knowledge application. Eight studies16-19,22,24,29,30 used lecture, four studies18,22,29,30 used role playing, five studies18,24,27,29 used readings, six studies17,18,22,26,29,30 used discussion groups, three studies21,22,30 used feedback, and three studies17,22,29 used audiotaped patient encounters. Listservs,26 problem-based learning,19 and case-based learning16,20,30 were other techniques studied. Only two studies19,30 directly compared two or more educational techniques, and the one study19 that demonstrated a difference between techniques had methodological flaws. Eight16-18,21,22,24,27,29,30 of the 12 studies used multiple techniques in the study groups, and several studies26,27,30 used similar techniques in both the control and study groups. Based on the limited data and the lack of direct comparisons among methods, no conclusions can be made regarding preferred educational techniques for optimizing physician knowledge application.

The majority of studies provided multiple exposures to the CME material, often with differing media and educational techniques. Eight studies16-19,22,26,29,30 specified available exposure times, although in some,20,21,24,26,27 the utilization of CME was incompletely specified. Reported CME exposure times ranged from 2 h19 to 48 h.26 Three16,18,19 of the studies specified a single exposure involving a minimum of 2 h of CME for at least one group in the study. Two of the studies with single exposures16,18 used multiple media and educational techniques in sessions lasting 3.5 h16 and 8 h,18 respectively, raising the question of what constitutes a single educational exposure. Only one study19 compared multiple exposures to a single exposure. In that study, multiple exposures were associated with better outcomes, but there were significant differences between the two groups in the techniques employed and the time devoted to CME, and the groups were not randomly assigned. Despite these limitations, the guideline panel believed that the most convincing data demonstrating improvement in physician application of knowledge came from studies using multiple exposures. The consensus of the panel was that both longer duration and multiple exposures appeared to provide greater benefit in improving physician application of knowledge, although the available evidence was not believed sufficient to support a recommendation for longer duration.

Recommendation

  2. We suggest that multiple CME exposures be used in place of a single exposure to maximize retention and improve physician application of knowledge (Grade 2C).

Psychomotor skills were addressed in only three studies23,25,28 identified in the structured review. Hergenroeder et al23 investigated two methods for teaching pediatricians to perform ankle and knee physical examinations. Seventy-five pediatricians were randomly assigned to either a group that received videotape and written instruction or a group that received videotape, written instruction, and a skills-building workshop. When tested with standardized patients at 4 to 5 months after training, both groups improved compared to baseline, but the group with hands-on experience in the skills-building session improved to a greater degree.

Rodney and Albers28 compared two methods of teaching primary care providers to perform flexible sigmoidoscopy. One method used a large-group lecture format supplemented with videotapes and 3 h of mannequin practice; the other used a small-group format with less lecture time, a more intensive workshop session, more faculty involvement, and a requirement that participants demonstrate proficiency with the sigmoidoscope. The groups were not randomly assigned, and there was a substantial difference in CME time between them. The outcome assessment, conducted 12 to 16 months after completion of the courses, used self-reported data about the learning curve, defined as the time to complete a procedure and the depth scoped as a function of the number of procedures performed. The small-group participants reported shorter procedure times for their first 10 procedures, but no other differences between the groups were noted. The authors also observed, but did not discuss, that the small-group participants appeared less likely to purchase a sigmoidoscope and performed fewer procedures per provider than the large-group participants. No comparison to a non-CME group was made, although the authors did compare results to published complication rates for colonoscopy.

Leopold et al25 studied three methods for teaching primary care providers how to perform a knee injection. Ninety-three participants in a CME workshop were randomized to one of three groups after attending a 15-min lecture on knee landmarks and injection techniques. One group received written instruction in knee injection, the second watched a videotape illustrating the procedure, and the third received hands-on instruction with supervised practice and feedback. All three groups improved compared to the baseline assessment, which was a simulated injection using a knee model, and no difference among groups was demonstrated.

All methods evaluated in the three studies23,25,28 provided some evidence of improved psychomotor skills, but the data are too limited to support specific recommendations. Only simple outpatient procedures were studied, and all three studies included only primary care providers. All provided training using multiple instructional techniques and multiple media. Thus, no recommendation can be made regarding preferred modalities, media, frequency, or duration of CME for procedural skills training.

The available data show that CME can be effective in improving physician application of knowledge, and the guideline panel recommended that CME be used for this purpose. With only 15 studies addressing this topic, the data are too limited to allow recommendations regarding optimal media or educational techniques. The available studies employed a variety of media and techniques, often in the same groups, and direct comparisons of competing media or educational techniques are infrequent and often marred by study design flaws. Although issues also existed with the data addressing exposure, most of the positive studies employed multiple exposures and longer-duration activities, which is in keeping with previously published reviews of CME effectiveness.31 Thus, the guideline panel recommended that CME programs employ multiple exposures to improve physician knowledge application. The data on CME effectiveness in psychomotor skills training comprise only three studies and are too limited to merit a recommendation.

A better understanding of what works in CME is required if CME is to be effective in improving physician application of knowledge, psychomotor skills, and, ultimately, patient outcomes. Well-designed studies directly comparing different media and educational techniques are needed. Defining the appropriate duration and number of exposures needed to improve knowledge and skills is also critical at a time when the costs, effectiveness, and sponsorship of this $2 billion-a-year industry are increasingly called into question.5 Broadening study populations beyond primary care practitioners and conducting additional studies of psychomotor skills are also critically needed, and it would be important to learn whether the principles developed for medical school and GME also apply to CME, as some8,32 have suggested. Until such data are available, our recommendations provide the best guidance for CME providers and physicians looking to optimize CME effectiveness with regard to physician knowledge application, and they serve as a starting point for additional research.

Dr. O'Neil has no conflicts of interest to disclose.

Dr. Addrizzo-Harris has no conflicts of interest to disclose.

References

1. Spivey BE. Continuing medical education in the United States: why it needs reform and how we propose to accomplish it. J Contin Educ Health Prof. 2005;25:134-143.
2. Davis NL, Willis CE. A new metric for continuing medical education credit. J Contin Educ Health Prof. 2004;24:139-144.
3. Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: guide to the evidence. JAMA. 2002;288:1057-1060.
4. Davis DA, Thomson MA, Oxman AD, et al. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700-705.
5. McDonald WJ. Council of Medical Specialty Societies: committed to continuing medical education reform. J Contin Educ Health Prof. 2005;25:144-150.
6. Mazmanian PE. Reform of continuing medical education in the United States. J Contin Educ Health Prof. 2005;25:132-133.
7. Frumovitz M, Ramirez PT, Greer M, et al. Laparoscopic training and practice in gynecologic oncology among Society of Gynecologic Oncologists members and fellows-in-training. Gynecol Oncol. 2004;94:746-753.
8. Rogers DA, Elstein AS, Bordage G. Improving continuing medical education for surgical techniques: applying the lessons learned in the first decade of minimal access surgery. Ann Surg. 2001;233:159-166.
9. Wallace T, Birch DW. A needs-assessment study for continuing professional development in advanced minimally invasive surgery. Am J Surg. 2007;193:593-595.
10. Aggarwal R, Grantcharov T, Moorthy K, et al. A competency-based virtual reality training curriculum for the acquisition of laparoscopic psychomotor skill. Am J Surg. 2006;191:128-133.
11. Grantcharov TP, Kristiansen VB, Bendix J, et al. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg. 2004;91:146-150.
12. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med. 2005;80:549-553.
13. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458-463.
14. Rosser JC Jr, Rosser LE, Savalgi RS. Objective evaluation of a laparoscopic surgical skill program for residents and senior surgeons. Arch Surg. 1998;133:657-661.
15. Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education: evidence report/technology assessment No. 149. Rockville, MD: Agency for Healthcare Research and Quality; 2007 (prepared by the Johns Hopkins Evidence-based Practice Center under Contract No. 290-02-0018). AHRQ Publication No. 07-E006.
15a. Marinopoulos SS, Baumann MH. Methods and definitions of terms: effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. Chest. 2009;135(3_suppl):17S-28S.
16. Andersen SM, Harthorn BH. Changing the psychiatric knowledge of primary care physicians: the effects of a brief intervention on clinical diagnosis and treatment. Gen Hosp Psychiatry. 1990;12:177-190.
17. Brown JB, Boles M, Mullooly JP, et al. Effect of clinician communication skills training on patient satisfaction: a randomized, controlled trial. Ann Intern Med. 1999;131:822-829.
17a. Bordage G, Carlin B, Mazmanian PE. Continuing medical education effect on physician knowledge: effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. Chest. 2009;135(3_suppl):29S-36S.
18. Carney PA, Dietrich AJ, Freeman DH Jr, et al. A standardized-patient assessment of a continuing medical education program to improve physicians' cancer-control clinical skills. Acad Med. 1995;70:52-58.
19. Doucet MD, Purdy RA, Kaufman DM, et al. Comparison of problem-based learning and lecture format in continuing medical education on headache diagnosis and management. Med Educ. 1998;32:590-596.
20. Frush K, Hohenhaus S, Luo X, et al. Evaluation of a Web-based education program on reducing medication dosing error: a multicenter, randomized controlled trial. Pediatr Emerg Care. 2006;22:62-70.
21. Gerbert B, Bronstone A, Maurer T, et al. The effectiveness of an Internet-based tutorial in improving primary care physicians' skin cancer triage skills. J Cancer Educ. 2002;17:7-11.
22. Gerrity MS, Cole SA, Dietrich AJ, et al. Improving the recognition and management of depression: is there a role for physician education? J Fam Pract. 1999;48:949-957.
23. Hergenroeder AC, Chorley JN, Laufman L, et al. Two educational interventions to improve pediatricians' knowledge and skills in performing ankle and knee physical examinations. Arch Pediatr Adolesc Med. 2002;156:225-229.
24. Kiang KM, Kieke BA, Como-Sabetti K, et al. Clinician knowledge and beliefs after statewide program to promote appropriate antimicrobial drug use. Emerg Infect Dis. 2005;11:904-911.
25. Leopold SS, Morgan HD, Kadel NJ, et al. Impact of educational intervention on confidence and competence in the performance of a simple surgical task. J Bone Joint Surg Am. 2005;87:1031-1037.
26. Macrae HM, Regehr G, McKenzie M, et al. Teaching practicing surgeons critical appraisal skills with an internet-based journal club: a randomized, controlled trial. Surgery. 2004;136:641-646.
27. Mukohara K, Schwartz MD. Electronic delivery of research summaries for academic generalist doctors: a randomised trial of an educational intervention. Med Educ. 2005;39:402-409.
28. Rodney WM, Albers G. Flexible sigmoidoscopy: primary care outcomes after two types of continuing medical education. Am J Gastroenterol. 1986;81:133-137.
29. Roter DL, Hall JA, Kern DE, et al. Improving physicians' interviewing skills and reducing patients' emotional distress: a randomized clinical trial. Arch Intern Med. 1995;155:1877-1884.
30. Terry PB, Wang VL, Flynn BS, et al. A continuing medical education program in chronic obstructive pulmonary diseases: design and outcome. Am Rev Respir Dis. 1981;123:42-46.
31. Davis DA, Thomson MA, Oxman AD, et al. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA. 1992;268:1111-1117.
32. Rosser JC Jr, Rosser LE, Savalgi RS. Objective evaluation of a laparoscopic surgical skill program for residents and senior surgeons. Arch Surg. 1998;133:657-661.
 
