CME: ACCP Evidence-Based Educational Guidelines

Continuing Medical Education Effect on Physician Knowledge: Effectiveness of Continuing Medical Education: American College of Chest Physicians Evidence-Based Educational Guidelines

Georges Bordage, MD, PhD; Brian Carlin, MD, FCCP; Paul E. Mazmanian, PhD
Author and Funding Information

From the University of Illinois at Chicago (Dr. Bordage), Chicago, IL; Allegheny General Hospital (Dr. Carlin), Pittsburgh, PA; and Virginia Commonwealth University (Dr. Mazmanian), Richmond, VA.

Correspondence to: Georges Bordage, MD, PhD, Professor, Department of Medical Education, College of Medicine, University of Illinois at Chicago, M/C 591, 808 South Wood, Chicago, IL 60612-7309; e-mail: bordage@uic.edu


Reproduction of this article is prohibited without written permission from the American College of Chest Physicians (www.chestjournal.org/misc/reprints.shtml).


Chest. 2009;135(3_suppl):29S-36S. doi:10.1378/chest.08-2515

Background:  Physicians continually engage in continuing medical education (CME) activities. Whether CME activities actually improve physician knowledge, and whether multiple media, instructional techniques, and exposures are better than single experiences, remain open questions.

Methods:  The Johns Hopkins Evidence-based Practice Center conducted a systematic review of the effectiveness of CME for the Agency for Healthcare Research and Quality (AHRQ Evidence Report), from which the guideline panel used 28 (± 2) studies to answer these questions about improvements in knowledge. The studies were selected, based on the presence of an adequate control group, from an initial pool of 136 studies on CME.

Results:  Despite the heterogeneity of the studies reviewed and the low quality of the evidence, the results from the majority of the studies (79%) showed that CME activities were associated with improvements in physician knowledge.

Conclusions:  The evidence gathered about the use of media and instructional techniques and the frequency of exposure suggests that multimedia, multiple instructional techniques, and multiple exposures be used whenever possible in CME. Future studies of CME should include assessment of applied knowledge, and should incorporate programmatic and collaborative studies of CME.

  1. General: We recommend that continuing medical education (CME) activities be used to improve physician knowledge (Grade 1C).

  2. Instructional media: We suggest the use of multimedia CME interventions in preference to single-medium interventions to improve physician knowledge (Grade 2C).

  3. Instructional techniques: We suggest the use of CME interventions with multiple instructional techniques in preference to a single technique to improve physician knowledge (Grade 2C).

  4. Frequency of exposure: We suggest the use of multiple exposures (sessions) to CME content in preference to a single exposure to improve physician knowledge (Grade 2C).

There are important associations between physician knowledge and practice outcomes. For example, a positive association exists between knowledge-based certification examination results in internal medicine and actual clinical performance,1 with disattenuated correlation coefficients between 0.55 and 0.59. There is also a relationship between the knowledge-based certification status of internists and cardiologists and the mortality rate of their patients following an acute myocardial infarction, with 19% lower mortality rates for the patients of certified specialists.2 Patients of surgeons certified by the American Board of Surgery had lower mortality and morbidity rates following segmental colon resection than those of physicians who were not certified.3 Similar improvements in outcomes related to board certification status have been found in studies of obstetrical care,4 surgical mortality,5 outcomes after abdominal aortic aneurysm rupture,6 malpractice claims,7 and disciplinary actions.8 Thus, assessing physician knowledge immediately or some time after attending a continuing medical education (CME) activity is a legitimate endeavor. Knowledge acquisition and retention represent two of the many outcomes worthy of assessment in CME, along with participant satisfaction; transfer of skills into practice; and ultimately, patient and population outcomes.
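The "disattenuated" correlation mentioned above is the Spearman correction for attenuation, which estimates the true correlation between two measures after removing measurement error in each. A minimal sketch follows; the reliability and correlation values are illustrative assumptions, not figures from the cited study:

```python
def disattenuate(r_xy, rel_x, rel_y):
    """Spearman correction for attenuation: the estimated true
    correlation is the observed correlation divided by the
    geometric mean of the two measures' reliabilities."""
    return r_xy / (rel_x * rel_y) ** 0.5

# Illustrative values only (not taken from Ramsey et al): an observed
# correlation of 0.45 between examination scores and performance
# ratings, with reliabilities of 0.80 and 0.75, disattenuates to a
# value in the 0.55-0.59 range cited in the text.
print(round(disattenuate(0.45, 0.80, 0.75), 2))
```

Because reliabilities are less than 1, the corrected coefficient is always at least as large as the observed one, which is why disattenuated values are reported when comparing examination scores with noisy performance ratings.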

In CME, the assessment of physician knowledge in the context of clinical practice can serve the following two main purposes: to assess learning needs; and to assess physicians' ability to apply their knowledge in explaining and managing clinical problems. The assessment of the knowledge needs of practicing physicians, both perceived and nonperceived, constitutes one of the elements used to determine the content and format of CME interventions. The importance of educational needs is highlighted in the Accreditation Council for Continuing Medical Education accreditation process.9 The chain of outcomes in CME accreditation criteria goes from physician competence to physician performance and patient outcomes (compliance criterion No. 3). Viewed in the context of the pyramid of Miller10 for assessing clinical competence, the lowest level of assessment is factual knowledge (knows), followed by competence or applied knowledge (knows how), performance (shows how), and action (does) [Fig 1]. The real test of competence (knows how) is in the clinician's ability to understand underlying concepts and principles, to think through problems, and to make decisions and explain findings and mechanisms, the second level in the Miller pyramid.10

Four key questions arise from CME activities that offer knowledge objectives. Do these CME activities improve physician knowledge? Are multimedia interventions (eg, live, computer based, Internet based, use of video, audio, or print) in CME preferable to single-medium interventions to improve physician knowledge? Are multiple instructional techniques (eg, academic detailing, case-based learning, demonstrations, discussion groups, lectures, mentoring, readings, or simulations) preferable to single-technique CME interventions to improve physician knowledge? Are multiple exposures to CME content preferable to a single exposure to improve physician knowledge? A systematic review was undertaken to answer these questions.

For the physician-learner, this portion of the review suggests that participating in CME activities that do not use multiple methods of learning will have minimal impact. CME activities that require the application of knowledge are a strong indicator of expert learning and competence. Physician-learners gain more knowledge when asked to gather and analyze history and physical examination findings (presented on paper or through a simulated or live patient), make a diagnosis, explain the underlying pathophysiology, and recommend how best to manage the situation. From the physician-teacher perspective, a CME activity should be designed to include multiple educational media and techniques and, where possible, multiple exposures.

As detailed in the “Methods” article10a, the Johns Hopkins Evidence-based Practice Center conducted a systematic review of the effectiveness of CME.11 Of the 136 studies in the systematic review, 39 studies12–50 addressed a total of 41 knowledge objectives. Of these 39 studies, 11 studies20,28,30–32,34,35,37,44,46,50 were excluded from further analyses because, although they used a comparison group, they did not have a control group; thus, the effectiveness of the intervention compared to a nonintervention, control-group baseline could not be determined. Consequently, 28 studies with an adequate control group were included in the analyses, addressing a total of 29 knowledge objectives. The data from each study were classified as to whether the knowledge objectives were met, were not met, or resulted in mixed results (see Methods article10a for definitions).11 Thus, the knowledge gain was measured using the control group as the base level. To assess the duration of knowledge retention, the extent of time between the CME activity and the test of knowledge was classified as short term (< 30 days), long term (≥ 30 days), or not clearly reported. The heterogeneity of the content, the designs of the studies, and the weaknesses in the study designs warranted an overall grading of C (low) for the quality of the evidence by the guideline panel, using the American College of Chest Physicians grading system.51

Twenty-two of the 28 studies (79%)12–14,17–19,21,24,26,27,29,33,36,38,39,41–43,45,47–49 showed improvements in knowledge. Four studies16,22,23,25 (14%) failed to show improvements in knowledge, and two studies15,40 (7%) had mixed results. None showed a decrement in knowledge.

The results from 19 of the 28 studies came from long-term assessment of knowledge retention, as follows: 15 studies (of 22) from the improvement group, 3 studies (of 4) from the no-improvement group, and 1 study (of 2) from the mixed-results group. The knowledge retention from the remaining nine studies was too sparse or heterogeneous to conduct a meaningful analysis.

In summary, most studies (79%) showed that CME activities were associated with improvements in knowledge. The majority (15 of 22 studies; 68%) showed long-term knowledge retention.
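As a quick check on the percentages reported in this section, the tallies can be reproduced with simple arithmetic; a minimal sketch using the counts stated above:

```python
# Outcome counts for the 28 controlled studies reported above.
improved, no_improvement, mixed = 22, 4, 2
total = improved + no_improvement + mixed
assert total == 28  # the 28 studies with an adequate control group

# Percentages as reported, rounded to whole numbers.
print(round(100 * improved / total))      # share of studies showing improvement
print(round(100 * 15 / improved))         # improvement studies with long-term retention
```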

Recommendation

  1. We recommend that CME activities be used to improve physician knowledge (Grade 1C).

The CME activities from the 136 studies included in the Agency for Healthcare Research and Quality Evidence report11 were classified according to the following three facets of CME activities: the instructional media, the instructional techniques, and the frequency of exposure to a given CME program. Overall, multimedia (eg, video, audio, or print), multiple instructional techniques (eg, discussion groups or case-based learning), and multiple exposures (sessions) were used most often. (Methods article10a, Tables 3 and 4 provide detailed descriptions of the media, technique, and exposure characteristics.)11

Instructional Media: Are Multimedia Interventions in CME Preferable to Single-Medium Interventions To Improve Physician Knowledge?

Instructional media were classified according to eight different types (Methods article10a, Table 3).11 Studies also were classified as having used a single-medium intervention or multimedia intervention or as comparative studies of single vs multimedia interventions.

Four of the 28 studies included in the analyses compared single-medium vs multimedia CME interventions. Of these, three studies13,38,43 showed that multimedia interventions had a greater benefit than single-medium interventions, including a print-based, single-medium intervention group.

Of the nine studies using a single-medium intervention, seven21,26,29,36,45,47,49 (78%) showed improvements in knowledge. Of the 15 studies using multimedia interventions, 12 studies12,14,17–19,24,27,33,39,41,42,48 (80%) showed improvements in knowledge.

In the four studies that did not show improvements in knowledge, one23 used a single-print medium, whereas the other three16,22,25 used multimedia interventions (live, audio, and print; live and print; and live intervention vs live Internet with non-real-time reading material). In the two studies with mixed results, one40 used a single-medium intervention (live), and one15 used a multimedia intervention (live Internet with printed material).

The overall effectiveness of both single-medium and multimedia interventions was similar. However, more studies evaluated multimedia formats, and three of the four comparative studies favored multimedia intervention.

Recommendation

  2. We suggest the use of multimedia CME interventions in preference to single-medium interventions to improve physician knowledge (Grade 2C).

Instructional Technique: Are Multiple Instructional Techniques Preferable to a Single Technique To Improve Physician Knowledge?

Instructional techniques were classified according to 17 different types (Methods article10a, Table 4).11 Studies also were classified as having used a single-technique intervention or multiple-technique intervention or as comparative studies of single vs multiple technique interventions.

Thirty studies with an adequate control group addressed this question, with a total of 31 knowledge objectives. Two studies28,30 not previously used were added for this question because they contained a concurrent comparison group that used a different instructional technique. Overall, 22 studies (73%) showed improvements in knowledge, including 2 with single-technique interventions and 20 with multiple-technique interventions. Six studies did not show improvements, and two contained mixed results.

Five of the 30 studies compared single-technique vs multiple-technique CME interventions. Of these five comparative studies, two29,38 showed that multiple-technique interventions had a greater benefit on knowledge improvement than single-technique interventions, and one21 showed the opposite, favoring a single-technique problem-based learning intervention. Of the two studies using single-technique interventions, one17 showed improvements in knowledge, and one46 did not. Of the 23 studies using multiple techniques, 18 studies12–14,18,19,24,26,27,29,33,36,39,41,42,45,47–49 (78%) showed improvements in knowledge.

The 22 studies showing improvements in knowledge used a variety of combinations of instructional techniques, including, for example, case-based learning (most commonly combined with discussion group and readings); lecture with readings, standardized patient, or team-based learning; or readings with discussion group. In the six studies that did not show improvements in knowledge, only one23 used a single-technique intervention (reading). The remaining five studies16,22,25,28,30 used multiple-technique interventions that combined two to four of the following instructional techniques (ie, three interventions with two techniques, one with three techniques, and two with four techniques): case-based readings, discussion group, feedback, lecture, point of care, readings, and role play. The two studies15,40 reporting mixed results used multiple-technique interventions (case-based learning with discussion group, and lecture with discussion group). Finally, of the studies for which the time of knowledge assessment (short or long term) could be clearly established, the majority (88%) were associated with long-term assessment of multiple-technique interventions.

Given the heterogeneity of combinations of instructional techniques used in the studies reviewed and the array of results reported across techniques, no firm conclusions can be made. However, because a majority of the studies showing improvement in physician knowledge used more than one technique, the Johns Hopkins Evidence-based Practice Center11 concluded that “multiple techniques that most commonly include case-based learning seem to be more associated with improvements in knowledge.”

Recommendation

  3. We suggest the use of CME interventions with multiple instructional techniques in preference to a single technique to improve physician knowledge (Grade 2C).

Frequency of Exposure: Are Multiple Exposures to CME Content Preferable to a Single Exposure To Improve Physician Knowledge?

The frequency of exposure to CME content in a given CME program (number of CME sessions) was classified in each study according to the following three categories: single exposure (once); multiple exposures (multiple times); and single vs multiple exposures. Twenty-seven studies with an adequate control group addressed this question, with 28 knowledge objectives. One study43 was excluded because the description of the exposure frequency was not sufficiently clear to classify. Overall, 21 of the 27 studies (78%) showed improvements in knowledge, as follows: 5 studies12,13,26,36,49 (of 5) from single-exposure CME interventions and 16 from multiple-exposure interventions (ie, 12 [of 17] from multiple-exposure studies and 4 [of 5] from comparative [single vs multiple] studies). Four studies did not show improvements in knowledge, and two reported mixed results.

The five studies12,13,26,36,49 of single-exposure CME interventions all led to improvements in knowledge, four with long-term assessments of knowledge and one with the time of knowledge assessment not clearly specified. The four comparative studies19,21,24,38 with improvements in knowledge used multiple-exposure interventions (vs single exposure), three with long-term assessments of knowledge and one with the time of knowledge assessment not clearly specified. Of the 17 multiple-exposure studies, 12 studies14,17,18,27,29,33,39,41,42,45,47,48 showed improvements in knowledge, with knowledge assessed at different times (1 with short-term assessment, 8 with long-term assessments, and 3 with the time of assessment not clearly specified).

In the four studies16,22,23,25 with no improvements in knowledge, none came from single-exposure studies; one16 came from a comparative study with long-term assessment of knowledge. Three studies22,23,25 came from multiple-exposure studies, two22,23 with long-term knowledge assessment and one25 with the time of assessment not clearly specified.

In summary, despite the heterogeneity of the studies reviewed and the fact that all five studies that used a single-exposure CME intervention showed improvements in knowledge, the other results, with head-to-head comparisons, “imply that when possible multiple exposures [to CME content] produces better knowledge gains.”11

Recommendation

  4. We suggest the use of multiple exposures (sessions) to CME content in preference to a single exposure to improve physician knowledge (Grade 2C).

The overall quality of the evidence from the studies reviewed to assess the impact of CME interventions on short-term and long-term knowledge acquisition and retention was low. In addition, the heterogeneity of the studies in terms of study designs, content areas, educational methods, and frequency of exposure precludes firm conclusions. Nevertheless, the results from the majority (79%) of those studies showed that CME activities were associated with improvements in knowledge, with both short-term and long-term gains, thus warranting a recommendation to use CME activities to improve physician knowledge.

The type of knowledge assessed was not reviewed in detail in this study but warrants further comment. Some educators divide knowledge into the following two broad categories: factual knowledge (to define, describe, list, name, recall) and applied knowledge, mirroring the “knows” and “knows how” in Miller's pyramid of clinical competence.10,52 Applied knowledge calls on greater cognitive skill, as follows from the taxonomy of mental skills by Krathwohl et al53: comprehension (to explain), application (to solve), analysis (to compare and contrast), synthesis (to create, summarize), and evaluation (to justify, defend). Rote memorization of facts is a poor predictor of problem-solving ability, whereas applied knowledge best predicts deeper learning and expert performance.54–56 It was the French scientist Poincaré57 who drew the following analogy to distinguish facts from applied knowledge: “Science is built up with facts, as a house is with stones. But a collection of facts is no more a science than a heap of stones is a house.” National testing agencies, such as the United States Medical Licensing Examination, recognize the importance of applied knowledge and require that their test items assess applied knowledge related to key concepts and principles essential for clinicians to understand.52 This is the difference, for example, between asking a clinician to describe Korsakoff syndrome and asking a clinician to gather and analyze the history and physical examination findings from a postoperative patient who is agitated (presented on paper or through a simulated or live patient), make a diagnosis, explain its mechanism, and manage the situation. If one cannot apply what one knows, then that knowledge is not useful.

Implied in the Accreditation Council for Continuing Medical Education accreditation criteria9 is the assessment not only of knowledge, factual or applied, as part of the needs assessment process, but also of applied knowledge (competence) as an outcome measure of physician competence. The assessment of applied knowledge with written or performance tests is useful to evaluate that dimension of CME outcomes because it predicts actual performance in practice.13,58

Although it is of low quality, the evidence concerning the use of media and instructional techniques and the frequency of exposure suggests that multimedia, multiple instructional techniques, and multiple exposures be used whenever possible in CME. Although all three sets of results point to the conclusion that multiple modalities are most effective, reality actually is more complex. The interaction of each variation of each dimension is an important consideration for CME providers, teachers, and attendees. For example, CME providers must address multiple audiences with different needs and preferences. On the other hand, attendees may reject a single format because it does not fit their preferences. No single solution to these issues is possible; rather, the instructional technique should be chosen to best engage the attendees (eg, group discussions, demonstrations) while the medium helps to present the content (eg, printed materials, video presentations). Academic detailing is a good example of the importance of considering instructional media and instructional techniques as synergistic. Some physicians may prefer demonstrations with printed materials, whereas others may prefer group discussions with videos. Some teachers may be better at leading groups and others at mentoring. The frequency of exposure further increases the possible interactions among these variables. Elucidating what works to improve an individual physician's competence and performance involves the consideration of complex interactions among intervention characteristics, physician characteristics, practice setting characteristics, and patient characteristics.59,60

Physician-learners progress at their own rates, depending on motivation, knowledge of a problem, or the perception of a gap between their current knowledge and skills and those needed.61 Knowledge is necessary, but it is not an end in itself and does not by itself lead to change in physician behavior or patient outcomes.62 When barriers to change are addressed, or gaps are demonstrated and resources deployed to help the learner, change may be expected to occur.62,63 Consistent with prior studies, combinations of instructional techniques with interactive components, such as case-based discussions or role playing, were shown to effect change in knowledge.

These interactions and the variety of possible combinations, while beneficial for learners, create major study design problems and potential confounders when conducting research on the effectiveness of any single dimension or sets of dimensions. Was knowledge gained because of multimedia interventions or single vs multiple exposures or because multimedia may simply represent multiple exposures to the same content? Cook64 discussed these design issues in the context of conducting research on computer-based instruction. Using a slightly different taxonomy, he identified the following four dimensions of educational interventions that need to be controlled across experimental and control groups when comparing different instructional interventions: the medium (eg, classroom, audiotapes, Web based), the configuration (eg, discussion boards, tutorials, simulations), the instructional method (eg, practice, feedback), and the presentation (eg, color, sound, fidelity). To understand the unique contribution of each dimension (or specific sets of dimensions), the researcher must vary only one dimension (or set) at a time, holding the other dimensions constant. Comparisons must be made within dimensions and not between dimensions to avoid confounding variables. Even a perfectly designed and executed randomized clinical trial is still subject to confounding variables if the interventions being compared differ in multiple ways. Two of the main problems encountered in the studies reviewed were the heterogeneity of the dimensions across studies and the presence of confounding variables. A more systematic approach is needed to better assess the unique and combined contributions of media, techniques, and frequency of exposure and their impact on outcomes, including physician knowledge and performance and patient outcomes following CME interventions. 

Much of the research conducted in CME, and medical education in general, is done in isolation and on an opportunistic basis because of either available funding or an investigator's passing interest. Programmatic and collaborative research is needed to better assess the unique and combined contributions of the various dimensions of CME,65 such as media, techniques, and frequency of exposure.

Dr. Bordage has no conflicts of interest.

Dr. Carlin has no conflicts of interest.

Dr. Mazmanian has no conflicts of interest.

References

1. Ramsey PG, Carline JD, Inui TS, et al. Predictive validity of certification by the American Board of Internal Medicine. Ann Intern Med. 1989;110:719-726.

2. Norcini JJ, Lipner RS, Kimball HR. Certifying examination performance and patient outcomes following acute myocardial infarction. Med Educ. 2002;36:853-859.

3. Prystowsky JB, Bordage G, Feinglass JM. Patient outcomes for segmental colon resection according to surgeon's training, certification, and experience. Surgery. 2002;132:663-670.

4. Haas JS, Orav EJ, Goldman L. The relationship between physicians' qualifications and experience and the adequacy of prenatal care and low birthweight. Am J Public Health. 1995;85:1087-1091.

5. Kelly JV, Hellinger FJ. Physician and hospital factors associated with mortality of surgical patients. Med Care. 1986;24:785-800.

6. Rutledge R, Oller DW, Meyer AA, et al. A statewide, population-based time-series analysis of the outcome of ruptured abdominal aortic aneurysm. Ann Surg. 1996;223:492-502.

7. Adamson TE, Baldwin DC Jr, Sheehan TJ, et al. Characteristics of surgeons with high and low malpractice claims rates. West J Med. 1997;166:37-44.

8. Morrison J, Wickersham P. Physicians disciplined by a state medical board. JAMA. 1998;279:1889-1893.

9. Accreditation Council for Continuing Medical Education. Decision-making criteria relevant to the essential areas and elements: 2006 update. Available at: http://www.accme.org/dir_docs/doc_upload/b03aa5cc-b0174395-a41f8d5d89ac31ca_uploaddocument.pdf. Accessed January 12, 2009.

10. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65:S63-S67.

10a. Marinopoulos SS, Baumann MH. Methods and definitions of terms: effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. Chest. 2009;135(3 suppl):17S-28S.

11. Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep). 2007:1-69.

12. Andersen SM, Harthorn BH. Changing the psychiatric knowledge of primary care physicians: the effects of a brief intervention on clinical diagnosis and treatment. Gen Hosp Psychiatry. 1990;12:177-190.

13. Beaulieu M, Choquette D, Rahme E, et al. CURATA: a patient health management program for the treatment of osteoarthritis in Quebec; an integrated approach to improving the appropriate utilization of anti-inflammatory/analgesic medications. Am J Manag Care. 2004;10:569-575.

14. Block L, Banspach SW, Gans K, et al. Impact of public education and continuing medical education on physician attitudes and behavior concerning cholesterol. Am J Prev Med. 1988;4:255-260.

15. Chodosh J, Berry E, Lee M, et al. Effect of a dementia care management intervention on primary care provider knowledge, attitudes, and perceptions of quality of care. J Am Geriatr Soc. 2006;54:311-317.

16. Chung S, Mandl KD, Shannon M, et al. Efficacy of an educational Web site for educating physicians about bioterrorism. Acad Emerg Med. 2004;11:143-148.

17. Cohn BA, Wingard DL, Patterson RC, et al. The National DES Education Program: effectiveness of the California Health Provider Intervention. J Cancer Educ. 2002;17:40-45.

18. Costanza ME, Zapka JG, Harris DR, et al. Impact of a physician intervention program to increase breast cancer screening. Cancer Epidemiol Biomarkers Prev. 1992;1:581-589.

19. Curran VR, Hoekman T, Gulliver W, et al. Web-based continuing medical education: (II). Evaluation study of computer-mediated continuing medical education. J Contin Educ Health Prof. 2000;20:106-119.

20. Des Marchais JE, Jean P, Castonguay LG. Training psychiatrists and family doctors in evaluating interpersonal skills. Med Educ. 1990;24:376-381.

21. Doucet MD, Purdy RA, Kaufman DM, et al. Comparison of problem-based learning and lecture format in continuing medical education on headache diagnosis and management. Med Educ. 1998;32:590-596.

22. Elliott TE, Murray DM, Oken MM, et al. Improving cancer pain management in communities: main results from a randomized controlled trial. J Pain Symptom Manage. 1997;13:191-203.

23. Evans CE, Haynes RB, Birkett NJ, et al. Does a mailed continuing education program improve physician performance? Results of a randomized trial in antihypertensive care. JAMA. 1986;255:501-504.

24. Fordis M, King JE, Ballantyne CM, et al. Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005;294:1043-1051.

25. Gerrity MS, Cole SA, Dietrich AJ, et al. Improving the recognition and management of depression: is there a role for physician education? J Fam Pract. 1999;48:949-957.

26. Gerstein HC, Reddy SS, Dawson KG, et al. A controlled evaluation of a national continuing medical education programme designed to improve family physicians' implementation of diabetes-specific clinical practice guidelines. Diabet Med. 1999;16:964-969.

27. Gifford DR, Mittman BS, Fink A, et al. Can a specialty society educate its members to think differently about clinical decisions? Results of a randomized trial. J Gen Intern Med. 1996;11:664-672.

28. Greenberg LW, Jewett LS. The impact of two teaching techniques on physicians' knowledge and performance. J Med Educ. 1985;60:390-396.

29. Harris JM Jr, Kutob RM, Surprenant ZJ, et al. Can Internet-based education improve physician confidence in dealing with domestic violence? Fam Med. 2002;34:287-292.

30. Heale J, Davis D, Norman G, et al. A randomized controlled trial assessing the impact of problem-based versus didactic teaching methods in CME. Proc Annu Conf Res Med Educ. 1988;27:72-77.

31. Hergenroeder AC, Chorley JN, Laufman L, et al. Two educational interventions to improve pediatricians' knowledge and skills in performing ankle and knee physical examinations. Arch Pediatr Adolesc Med. 2002;156:225-229.

32. Kemper KJ, Gardiner P, Gobble J, et al. Randomized controlled trial comparing four strategies for delivering e-curriculum to health care professionals. BMC Med Educ. 2006;6:2.

33. Kiang KM, Kieke BA, Como-Sabetti K, et al. Clinician knowledge and beliefs after statewide program to promote appropriate antimicrobial drug use. Emerg Infect Dis. 2005;11:904-911.

34. Kutcher SP, Lauria-Horner BA, MacLaren CM, et al. Evaluating the impact of an educational program on practice patterns of Canadian family physicians interested in depression treatment. Prim Care Companion J Clin Psychiatry. 2002;4:224-231.

35. Labelle M, Beaulieu M, Renzi P, et al. Integrating clinical practice guidelines into daily practice: impact of an interactive workshop on drafting of a written action plan for asthma patients. J Contin Educ Health Prof. 2004;24:39-49.

36. Lane DS, Messina CR, Grimson R. An educational approach to improving physician breast cancer screening practices and counseling skills. Patient Educ Couns. 2001;43:287-299.

37. Lockyer JM, Fidler H, Hogan DB, et al. Dual-track CME: accuracy and outcome. Acad Med. 2002;77:S61-S63.

38. Maiman LA, Becker MH, Liptak GS, et al. Improving pediatricians' compliance-enhancing practices: a randomized trial. Am J Dis Child. 1988;142:773-779.

39. Mann KV, Lindsay EA, Putnam RW, et al. Increasing physician involvement in cholesterol-lowering practices: the role of knowledge, attitudes and perceptions. Adv Health Sci Educ Theory Pract. 1997;2:237-253.

40. Maxwell JA, Sandlow LJ, Bashook PG. Effect of a medical care evaluation program on physician knowledge and performance. J Med Educ. 1984;59:33-38.

41. Meredith LS, Jackson-Triche M, Duan N, et al. Quality improvement for depression enhances long-term treatment knowledge for primary care clinicians. J Gen Intern Med. 2000;15:868-877.

42. Premi J, Shannon S, Hartwick K, et al. Practice-based small-group CME. Acad Med. 1994;69:800-802.

43. Premi J, Shannon SI. Randomized controlled trial of a combined video-workbook educational program for CME. Acad Med. 1993;68:S13-S15.

44. Rosenthal MS, Lannon CM, Stuart JM, et al. A randomized trial of practice-based education to improve delivery systems for anticipatory guidance. Arch Pediatr Adolesc Med. 2005;159:456-463.

45. Short LM, Surprenant ZJ, Harris JM Jr. A community-based trial of an online intimate partner violence CME program. Am J Prev Med. 2006;30:181-185.

46. Slotnick HB. Educating physicians through advertising: using the brief summary to teach about pharmaceutical use. J Contin Educ Health Prof. 1993;13:299-314.
 
Stewart M, Marshall JN, Ostbye T, et al. Effectiveness of case-based on-line learning of evidence-based practice guidelines. Fam Med. 2005;37:131-138. [PubMed]
 
Terry PB, Wang VL, Flynn BS, et al. A continuing medical education program in chronic obstructive pulmonary diseases: design and outcome. Am Rev Respir Dis. 1981;123:42-46. [PubMed]
 
White CW, Albanese MA, Brown DD, et al. The effectiveness of continuing medical education in changing the behavior of physicians caring for patients with acute myocardial infarction: a controlled randomized trial. Ann Intern Med. 1985;102:686-692. [PubMed]
 
White M, Michaud G, Pachev G, et al. Randomized trial of problem-based versus didactic seminars for disseminating evidence-based guidelines on asthma management to primary care physicians. J Contin Educ Health Prof. 2004;24:237-243. [PubMed]
 
Guyatt G, Gutterman D, Baumann MH, et al. Grading strength of recommendations and quality of evidence in clinical guidelines: report from an American College of Chest Physicians task force. Chest. 2006;129:174-181. [PubMed]
 
Case S, Swanson D. Constructing written test questions for the basic and clinical sciences.Accessed January 12, 2009 Available at:http://www.nbme.org/publications/item-writing-manual.html.
 
Krathwohl DR, Bloom BS, Masia BB. Taxonomy of educational objectives, the classification of educational goals: handbook II; affective domain. 1973; New York, NY David McKay Co
 
Bordage G. Elaborated knowledge: a key to successful diagnostic thinking. Acad Med. 1994;69:883-885. [PubMed]
 
Bransford JD, Brown AL, Cocking RR. How people learn: brain, mind, experience and school; National Research Council. 2000; Washington DC National Academy Press
 
Chang RW, Bordage G, Connell KJ. The importance of early problem representation during case presentations. Acad Med. 1998;73:S109-S111. [PubMed]
 
Poincaré J. La Science et L'hpothese.Accessed January 12, 2009 Available at:http://www-history.mcs.st-andrews.ac.uk/Quotations/Poincare.html.
 
Norman GR. Editorial: inverting the pyramid. Adv Health Sci Educ. 2005;10:85-88
 
Davis DA, Taylor-Vaisey A. Translating guidelines into practice: a systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines. CMAJ. 1997;157:408-416. [PubMed]
 
Smith WR. Evidence for the effectiveness of techniques to change physician behavior. Chest. 2000;118:8S-17S. [PubMed]
 
Davis DA, Thomson MA, Oxman AD, et al. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700-705. [PubMed]
 
Davis D, O'Brien MA, Freemantle N, et al. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867-874. [PubMed]
 
Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: guide to the evidence. JAMA. 2002;288:1057-1060. [PubMed]
 
Cook DA. The research we still are not doing: an agenda for the study of computer-based learning. Acad Med. 2005;80:541-548. [PubMed]
 
Bordage G. Moving the field forward: going beyond quantitative- qualitative. Acad Med. 2007;82:S126-S128. [PubMed]
 

Figures

Tables

References

Ramsey PG, Carline JD, Inui TS, et al. Predictive validity of certification by the American Board of Internal Medicine. Ann Intern Med. 1989;110:719-726. [PubMed]
 
Norcini JJ, Lipner RS, Kimball HR. Certifying examination performance and patient outcomes following acute myocardial infarction. Med Educ. 2002;36:853-859. [PubMed] [CrossRef]
 
Prystowsky JB, Bordage G, Feinglass JM. Patient outcomes for segmental colon resection according to surgeon's training, certification, and experience. Surgery. 2002;132:663-670. [PubMed]
 
Haas JS, Orav EJ, Goldman L. The relationship between physicians' qualifications and experience and the adequacy of prenatal care and low birthweight. Am J Public Health. 1995;85:1087-1091. [PubMed]
 
Kelly JV, Hellinger FJ. Physician and hospital factors associated with mortality of surgical patients. Med Care. 1986;24:785-800. [PubMed]
 
Rutledge R, Oller DW, Meyer AA, et al. A statewide, population-based time-series analysis of the outcome of ruptured abdominal aortic aneurysm. Ann Surg. 1996;223:492-502. [PubMed]
 
Adamson TE, Baldwin DC Jr, Sheehan TJ, et al. Characteristics of surgeons with high and low malpractice claims rates. West J Med. 1997;166:37-44. [PubMed]
 
Morrison J, Wickersham P. Physicians disciplined by a state medical board. JAMA. 1998;279:1889-1893. [PubMed]
 
Accreditation Council for Continuing Medical Education. Decision-making criteria relevant to the essential areas and elements: 2006 update. Available at: http://www.accme.org/dir_docs/doc_upload/b03aa5cc-b0174395-a41f8d5d89ac31ca_uploaddocument.pdf. Accessed January 12, 2009.
 
Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65:S63-S67. [PubMed]
 
Marinopoulos SS, Baumann MH. Methods and definitions of terms: effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. Chest. 2009;135(3 suppl):17S-28S. [PubMed]
 
Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep). 2007:1-69.
 
Andersen SM, Harthorn BH. Changing the psychiatric knowledge of primary care physicians: the effects of a brief intervention on clinical diagnosis and treatment. Gen Hosp Psychiatry. 1990;12:177-190. [PubMed]
 
Beaulieu M, Choquette D, Rahme E, et al. CURATA: a patient health management program for the treatment of osteoarthritis in Quebec; an integrated approach to improving the appropriate utilization of anti-inflammatory/analgesic medications. Am J Manag Care. 2004;10:569-575. [PubMed]
 
Block L, Banspach SW, Gans K, et al. Impact of public education and continuing medical education on physician attitudes and behavior concerning cholesterol. Am J Prev Med. 1988;4:255-260. [PubMed]
 
Chodosh J, Berry E, Lee M, et al. Effect of a dementia care management intervention on primary care provider knowledge, attitudes, and perceptions of quality of care. J Am Geriatr Soc. 2006;54:311-317. [PubMed]
 
Chung S, Mandl KD, Shannon M, et al. Efficacy of an educational Web site for educating physicians about bioterrorism. Acad Emerg Med. 2004;11:143-148. [PubMed]
 
Cohn BA, Wingard DL, Patterson RC, et al. The National DES Education Program: effectiveness of the California Health Provider Intervention. J Cancer Educ. 2002;17:40-45. [PubMed]
 
Costanza ME, Zapka JG, Harris DR, et al. Impact of a physician intervention program to increase breast cancer screening. Cancer Epidemiol Biomarkers Prev. 1992;1:581-589. [PubMed]
 
Curran VR, Hoekman T, Gulliver W, et al. Web-based continuing medical education: (II). Evaluation study of computer-mediated continuing medical education. J Contin Educ Health Prof. 2000;20:106-119. [PubMed]
 
Des Marchais JE, Jean P, Castonguay LG. Training psychiatrists and family doctors in evaluating interpersonal skills. Med Educ. 1990;24:376-381. [PubMed]
 
Doucet MD, Purdy RA, Kaufman DM, et al. Comparison of problem-based learning and lecture format in continuing medical education on headache diagnosis and management. Med Educ. 1998;32:590-596. [PubMed]
 
Elliott TE, Murray DM, Oken MM, et al. Improving cancer pain management in communities: main results from a randomized controlled trial. J Pain Symptom Manage. 1997;13:191-203. [PubMed]
 
Evans CE, Haynes RB, Birkett NJ, et al. Does a mailed continuing education program improve physician performance? Results of a randomized trial in antihypertensive care. JAMA. 1986;255:501-504. [PubMed]
 
Fordis M, King JE, Ballantyne CM, et al. Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA. 2005;294:1043-1051. [PubMed]
 
Gerrity MS, Cole SA, Dietrich AJ, et al. Improving the recognition and management of depression: is there a role for physician education? J Fam Pract. 1999;48:949-957. [PubMed]
 
Gerstein HC, Reddy SS, Dawson KG, et al. A controlled evaluation of a national continuing medical education programme designed to improve family physicians' implementation of diabetes-specific clinical practice guidelines. Diabet Med. 1999;16:964-969. [PubMed]
 
Gifford DR, Mittman BS, Fink A, et al. Can a specialty society educate its members to think differently about clinical decisions? Results of a randomized trial. J Gen Intern Med. 1996;11:664-672. [PubMed]
 
Greenberg LW, Jewett LS. The impact of two teaching techniques on physicians' knowledge and performance. J Med Educ. 1985;60:390-396. [PubMed]
 
Harris JM Jr, Kutob RM, Surprenant ZJ, et al. Can Internet-based education improve physician confidence in dealing with domestic violence? Fam Med. 2002;34:287-292. [PubMed]
 
Heale J, Davis D, Norman G, et al. A randomized controlled trial assessing the impact of problem-based versus didactic teaching methods in CME. Proc Annu Conf Res Med Educ. 1988;27:72-77
 
Hergenroeder AC, Chorley JN, Laufman L, et al. Two educational interventions to improve pediatricians' knowledge and skills in performing ankle and knee physical examinations. Arch Pediatr Adolesc Med. 2002;156:225-229. [PubMed]
 
Kemper KJ, Gardiner P, Gobble J, et al. Randomized controlled trial comparing four strategies for delivering e-curriculum to health care professionals. BMC Med Educ. 2006;6:2. [PubMed]
 
Kiang KM, Kieke BA, Como-Sabetti K, et al. Clinician knowledge and beliefs after statewide program to promote appropriate antimicrobial drug use. Emerg Infect Dis. 2005;11:904-911. [PubMed]
 
Kutcher SP, Lauria-Horner BA, MacLaren CM, et al. Evaluating the impact of an educational program on practice patterns of Canadian family physicians interested in depression treatment. Prim Care Companion J Clin Psychiatry. 2002;4:224-231. [PubMed]
 
Labelle M, Beaulieu M, Renzi P, et al. Integrating clinical practice guidelines into daily practice: impact of an interactive workshop on drafting of a written action plan for asthma patients. J Contin Educ Health Prof. 2004;24:39-49. [PubMed]
 
Lane DS, Messina CR, Grimson R. An educational approach to improving physician breast cancer screening practices and counseling skills. Patient Educ Couns. 2001;43:287-299. [PubMed]
 
Lockyer JM, Fidler H, Hogan DB, et al. Dual-track CME: accuracy and outcome. Acad Med. 2002;77:S61-S63. [PubMed]
 
Maiman LA, Becker MH, Liptak GS, et al. Improving pediatricians' compliance-enhancing practices: a randomized trial. Am J Dis Child. 1988;142:773-779. [PubMed]
 
Mann KV, Lindsay EA, Putnam RW, et al. Increasing physician involvement in cholesterol-lowering practices: the role of knowledge, attitudes and perceptions. Adv Health Sci Educ Theory Pract. 1997;2:237-253. [PubMed]
 
Maxwell JA, Sandlow LJ, Bashook PG. Effect of a medical care evaluation program on physician knowledge and performance. J Med Educ. 1984;59:33-38. [PubMed]
 
Meredith LS, Jackson-Triche M, Duan N, et al. Quality improvement for depression enhances long-term treatment knowledge for primary care clinicians. J Gen Intern Med. 2000;15:868-877. [PubMed]
 
Premi J, Shannon S, Hartwick K, et al. Practice-based small-group CME. Acad Med. 1994;69:800-802. [PubMed]
 
Premi J, Shannon SI. Randomized controlled trial of a combined video-workbook educational program for CME. Acad Med. 1993;68:S13-S15. [PubMed]
 
Rosenthal MS, Lannon CM, Stuart JM, et al. A randomized trial of practice-based education to improve delivery systems for anticipatory guidance. Arch Pediatr Adolesc Med. 2005;159:456-463. [PubMed]
 
Short LM, Surprenant ZJ, Harris JM Jr. A community-based trial of an online intimate partner violence CME program. Am J Prev Med. 2006;30:181-185. [PubMed]
 
Slotnick HB. Educating physicians through advertising: using the brief summary to teach about pharmaceutical use. J Contin Educ Health Prof. 1993;13:299-314.
 
Stewart M, Marshall JN, Ostbye T, et al. Effectiveness of case-based on-line learning of evidence-based practice guidelines. Fam Med. 2005;37:131-138. [PubMed]
 
Terry PB, Wang VL, Flynn BS, et al. A continuing medical education program in chronic obstructive pulmonary diseases: design and outcome. Am Rev Respir Dis. 1981;123:42-46. [PubMed]
 
White CW, Albanese MA, Brown DD, et al. The effectiveness of continuing medical education in changing the behavior of physicians caring for patients with acute myocardial infarction: a controlled randomized trial. Ann Intern Med. 1985;102:686-692. [PubMed]
 
White M, Michaud G, Pachev G, et al. Randomized trial of problem-based versus didactic seminars for disseminating evidence-based guidelines on asthma management to primary care physicians. J Contin Educ Health Prof. 2004;24:237-243. [PubMed]
 
Guyatt G, Gutterman D, Baumann MH, et al. Grading strength of recommendations and quality of evidence in clinical guidelines: report from an American College of Chest Physicians task force. Chest. 2006;129:174-181. [PubMed]
 
Case S, Swanson D. Constructing written test questions for the basic and clinical sciences. Available at: http://www.nbme.org/publications/item-writing-manual.html. Accessed January 12, 2009.
 
Krathwohl DR, Bloom BS, Masia BB. Taxonomy of educational objectives, the classification of educational goals: handbook II; affective domain. New York, NY: David McKay Co; 1973.
 
Bordage G. Elaborated knowledge: a key to successful diagnostic thinking. Acad Med. 1994;69:883-885. [PubMed]
 
Bransford JD, Brown AL, Cocking RR; National Research Council. How people learn: brain, mind, experience and school. Washington, DC: National Academy Press; 2000.
 
Chang RW, Bordage G, Connell KJ. The importance of early problem representation during case presentations. Acad Med. 1998;73:S109-S111. [PubMed]
 
Poincaré H. La Science et l'Hypothèse. Available at: http://www-history.mcs.st-andrews.ac.uk/Quotations/Poincare.html. Accessed January 12, 2009.
 
Norman GR. Editorial: inverting the pyramid. Adv Health Sci Educ. 2005;10:85-88.
 
Davis DA, Taylor-Vaisey A. Translating guidelines into practice: a systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines. CMAJ. 1997;157:408-416. [PubMed]
 
Smith WR. Evidence for the effectiveness of techniques to change physician behavior. Chest. 2000;118:8S-17S. [PubMed]
 
Davis DA, Thomson MA, Oxman AD, et al. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700-705. [PubMed]
 
Davis D, O'Brien MA, Freemantle N, et al. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867-874. [PubMed]
 
Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: guide to the evidence. JAMA. 2002;288:1057-1060. [PubMed]
 
Cook DA. The research we still are not doing: an agenda for the study of computer-based learning. Acad Med. 2005;80:541-548. [PubMed]
 
Bordage G. Moving the field forward: going beyond quantitative-qualitative. Acad Med. 2007;82:S126-S128. [PubMed]
 