Graduate Medical Education evolves to meet a changing health care system

Michael Wadman, M.D.

We all see residents and fellows caring for patients in our hospitals and clinics, but most of us are unfamiliar with how these training programs ensure that their graduates are prepared to meet the health care needs of Nebraska.

The Accreditation Council for Graduate Medical Education (ACGME) provides the framework for the regulation of graduate medical education programs, ensuring consistent educational quality across all programs while also serving as a mechanism to ensure the safety of the public receiving care at teaching hospitals.

The Next Accreditation System (NAS), the most recent ACGME accreditation model, emphasizes the physician competencies demanded by the public to meet the needs of the rapidly evolving U.S. health care system, utilizing an outcomes-based approach rather than the process-based approach of the past.

The key components of the NAS include annual screening of key performance measures for programs, semiannual resident milestone reporting, a 10-year program self-study and self-study site visit, and an institutional site visit, the Clinical Learning Environment Review (CLER).

Continuous Accreditation: Web-based Reporting of Program Performance Measures

The first component of NAS is the annual reporting of key performance measures for programs using a web-based accreditation data system (WebADS). In the previous accreditation model, site visits occurred every four to five years and were preceded by the submission of a Program Information Form (PIF) that included required program data. This format of episodic site visits focusing on the confirmation of PIF data resulted in what has been described as a ‘biopsy model’ of accreditation, detracting from ongoing quality improvement cycles at the program level.

The NAS requires continuous accreditation utilizing an annual screen of key program performance indicators, including basic program data (faculty roster with certification status, major program changes, citation responses, program characteristics, faculty and resident scholarly activity, and curriculum), aggregate board pass rate, resident clinical experience, and resident and faculty survey results.

Each specialty’s Residency Review Committee (RRC) reviews these program data elements, in addition to milestone data (addressed in the next section), and assigns an accreditation status based on its findings: continued accreditation, continued accreditation with warning, or site visit required.

Following completion of a site visit, the possible accreditation statuses expand to include probationary accreditation (for a maximum of two years) and withdrawal of accreditation, in addition to continued accreditation and continued accreditation with warning. Beyond the accreditation decision itself, the RRC may also recognize and commend programs for exemplary performance or innovations, identify areas for improvement, identify concerning trends, issue citations, extend prior citations, or request a progress report.

In the first year of experience with the NAS, annual accreditation notifications from the ACGME either identify or confirm focus areas for annual performance improvement activities. The ‘areas for improvement’ and ‘concerning trends’ allow program leadership to address problems earlier in the accreditation cycle than in the past. Using this information, in addition to other data, programs may implement timely interventions to prevent citations, rather than recognizing major problems just before or immediately after the four- to five-year site visits of the prior accreditation model.

Outcomes-based Evaluations: Milestones

In the past, the ACGME has largely followed a process-based model in the accreditation of GME programs. This model required programs to report components of the resident education process, such as didactic conference schedules and clinical rotation blocks, but lacked any real data on the performance of their trainees.

In 1999, the ACGME and the American Board of Medical Specialties (ABMS) introduced the six domains of clinical competency for physicians, the ‘core competencies’: patient care, medical knowledge, interpersonal and communication skills, professionalism, practice-based learning and improvement, and systems-based practice.

In subsequent years, the ACGME’s ‘Outcomes Project’ required programs to use the framework of the six core competencies in curricular design and resident evaluations, and the ABMS certifying boards configured their initial certification and maintenance of certification (MOC) examinations to evaluate graduates of these programs on the same six competencies.  Unfortunately, the ultimate goal of an assessment system allowing for program accreditation based on outcomes was not fully realized by the ‘Outcomes Project’. The NAS milestones evaluation and reporting system is the most recent step in the evolution of an outcomes-based accreditation system.

Milestones represent significant observable steps in the development of a trainee, from the first day of residency or fellowship training to graduation and then into independent practice. For each specialty, the ACGME and the corresponding ABMS board formed a working group of specialty board and specialty college representatives, program directors, RRC members, and residents and fellows to develop the milestones.

The Phase I programs – emergency medicine, internal medicine, neurologic surgery, orthopedic surgery, pediatrics, diagnostic radiology, and urology – implemented their milestones in July 2013 and reported milestone assessments for their trainees in December 2013 and June 2014. The next group of programs, Phase II, began milestone assessments in July 2014.

For individual residents and fellows, the milestones are intended to provide for more objective assessments, improved feedback, earlier recognition of deficiencies, and more targeted performance improvement plans.

The attainment of a milestone by a trainee is not determined by any single assessment or single faculty member, but rather by a consensus decision of a group of faculty reviewing data from multiple assessment tools.

The Clinical Competency Committee (CCC) reviews a compilation of data from multiple sources semiannually and makes recommendations for milestone achievement levels to the program director.  The structure and function of the CCC, with multiple reviewers following a standardized process, allow for a more objective and standardized review of all trainees, minimizing bias.

For accreditation purposes, the RRC for each specialty reviews aggregate milestone data for each program; these data, together with other program metrics, comprise the data set used to make annual accreditation decisions. The aggregate milestone reports to the RRC allow continuous monitoring of programs, permitting significant lengthening of the site visit cycle.

Most importantly, the implementation of milestone evaluations may also drive curricular reform, not only on the program level, but also nationally, as aggregate milestone data demonstrate the true outcomes of GME programs.

Early experience with milestone assessments suggests that, for programs, milestone data may facilitate targeted curriculum development to address training needs. For instance, a program whose residents were not meeting a procedural milestone until very late in their three-year program might implement a series of laboratory and simulation experiences to provide that experience earlier in the course of training.

10-Year Self-Study and Self-Study Site Visit

From the program directors’ standpoint, the greatest burden of the past accreditation model was preparing a PIF and undergoing a site visit every four to five years.

One stated goal of the NAS is to decrease the burden of accreditation on programs, and the 10-year self-study site visit is a significant step in this effort.

The self-study visit must verify self-reported data (WebADS, milestones); monitor progress in addressing citations and other issues identified on resident and faculty surveys; track progress in enhancing strengths and addressing areas for improvement; determine additional data needs and identify ways to reduce burden and increase efficiency; and assess program aims and efforts to meet those aims.

Overall, the self-study will be performance-oriented, rather than process-oriented, noting a program’s progress towards its stated goals and then establishing future goals.

Program self-studies and self-study site visits begin for Phase I programs in July 2015, so experience is limited. The first programs scheduled for these visits are now beginning their self-study activities, and this early experience suggests that the process will emphasize strict adherence to an annual performance improvement cycle and follow-through on action plans designed to address any deficiencies.

The self-study is a longitudinal evaluation, rather than the periodic evaluation of the prior model, necessitating an effective annual system of program evaluation. In the NAS, each program is required to complete an Annual Program Evaluation (APE) addressing the quality of educational experiences, resident and fellow performance, graduate performance, and faculty development.

The 10-year series of APEs may serve as the framework for the self-study if structured to identify program strengths and areas for improvement based on a review of external and internal data, resulting in the development and implementation of action plans with defined timelines and follow-up mechanisms.

Clinical Learning Environment Review

Graduate medical education must provide a safe, effective training environment for both learners and patients, where quality health care is delivered and learners acquire the knowledge, skills, and attitudes necessary to provide similarly high-quality care after completing their training. Scholarly work in this area confirms that residents and fellows who train in hospitals receiving high marks for patient safety and quality go on to practice at a similarly high level of quality and safety and, most notably, that this level of performance is sustained for years beyond graduation.

The key element of the NAS is the biennial CLER program, providing timely, formative feedback to the institution under review.  The program provides the sponsoring institution and affiliated sites with feedback addressing six focus areas: patient safety, health care quality and health care disparities, transitions of care, supervision, duty hours and fatigue management and mitigation, and professionalism.

The CLER site visit addresses five key questions: 1) Who and what form the hospital/medical center’s infrastructure designed to address the six focus areas? 2) How integrated are the GME leadership and faculty in the hospital/medical center’s efforts across the six focus areas? 3) How engaged are the residents and fellows? 4) How does the hospital/medical center determine success? 5) What areas has the hospital/medical center identified for improvement?

The CLER site visit is a two- to three-day assessment of the six focus areas at the institution under review.  Prior to the visit, the institution may submit an organization chart and policies and procedures addressing the six focus areas.  Submission of these documents is optional, but allows the visit team to familiarize themselves with the current leadership structure and approach to the six focus areas.

The visit begins with a meeting of hospital and GME leaders, with the strict requirement that the hospital’s Chief Executive Officer (CEO) participate in both the initial and concluding meetings of the visit.  Others typically attending these meetings include the hospital’s Chief Medical Officer (CMO), Chief Nursing Officer (CNO), and Chief Patient Safety and Quality Officer (CPS/CQO), and the Designated Institutional Official (DIO)/Associate Dean for GME.

Following this initial leadership meeting, the site visitors meet with groups of residents and fellows, core faculty, and program directors.  During each meeting, a series of questions is asked using an audience response system to ensure anonymity and encourage full participation.  Between meetings, site visitors are accompanied by senior-level residents on walking rounds through all patient care areas of the hospital, during which residents, fellows, nursing staff, or ancillary personnel may be asked questions addressing the CLER focus areas. At the conclusion of the visit, the hospital and GME leaders again meet with the site visitors, and a preliminary verbal report is presented with opportunities for clarification.

The data from the site visit are then submitted to the CLER Evaluation Committee for analysis.  This information is not used for accreditation decisions, but rather to set expectations for the six focus areas and to provide the sponsoring institution with an outside assessment of the level of GME engagement in institutional initiatives in these areas.