Navigating the Next Accreditation System: A Dashboard for the Milestones


Samir Johna, MD; Brandon Woodward, MD

Perm J 2015 Fall; 19(4):61-63

https://doi.org/10.7812/TPP/15-041

Abstract

Introduction: In July 2014, all residency programs accredited by the Accreditation Council for Graduate Medical Education (ACGME) were enrolled in a new system called the Next Accreditation System. Residency programs may not be clear on how best to comply with these new accreditation requirements. Large amounts of data must be collected, evaluated, and submitted twice a year to the council's Web-based data collection system. One challenge is that the new "end-of-rotation" evaluations must reflect specialty-specific milestones, on which many faculty members are not well versed. Like other residency programs, we tried to address the challenges using our local resources.
Methods: We used our existing electronic goals and objectives for each rotation coupled with appropriate end-of-rotation evaluations reflecting the specialty-specific milestones through a process of editing and mapping.
Results: Data extracted from these evaluations were added to an interactive dashboard that also contained evaluations on additional program-specific modifiers of residents' performance. A resident's final overall performance was visually represented on a plot graph. The novel dashboard included features to save evaluations for future comparisons and to track residents' progress during their entire training. It proved simple to use and was able to reduce the time needed for each resident evaluation to 5 to 10 minutes.
Conclusion: This tool has made it much easier and less challenging for the members of our Clinical Competency Committee to start deliberation about each resident's performance.

Introduction

Beginning in 1999, the Accreditation Council for Graduate Medical Education (ACGME) launched a series of changes, revamping medical education in the US. The accreditation of every academic residency program was shifted from being "process oriented" to "outcome oriented" through implementation of teaching and assessment of six core competencies. In July 2013, public and political pressure to train physicians capable of practicing cost-conscious and patient-centered care led to the development and implementation of the Next Accreditation System (NAS). A year later, all residency programs were enrolled in the NAS.

The Vision

The primary goal of the NAS, the ACGME hopes, is to transform the accreditation system into a less administratively burdensome process.1,2 The council envisions 2 mechanisms to achieve this goal. First, it wants to create a continuous accreditation model via annual data submission to the ACGME. The Residency Review Committees will evaluate trends in key performance measures annually, eliminating the 1- to 5-year review cycles and the Program Information Form. Instead, a self-study will be required every 10 years. Unlike the periodic follow-ups of the old Program Information Form, the self-study will allow programs to present their innovative achievements.

Second, the Residency Review Committees' primary mechanism for confirming program compliance with published educational standards will be the biannual data submission. The information provided reflects the dynamic changes in the residency program.

Two additional data sources are used:

  1. The educational milestones represent the behavioral and clinical expectations that residents must achieve throughout their training. They also facilitate the identification of deficiencies so that proper remediation can be implemented in a timely manner. The newly formed Clinical Competency Committee will be responsible for making decisions regarding the progress of the residents.
  2. The visit for the clinical learning environment review will focus on the learning environment, which is expected to be conducive to teaching and learning. It also examines program achievements in specific areas: patient safety, quality improvement, transitions of care, resident supervision, duty hours and fatigue recognition and mitigation, and professionalism.

What Is the Problem?

Residency programs are still not clear on how best to comply with the new NAS requirements. Large amounts of data must be collected, evaluated, and submitted twice a year to the ACGME Web-based data collection system. Some of the challenges are not new, such as delays in completing faculty evaluations on time. To further complicate matters, the new "end-of-rotation" evaluations must reflect the specialty-specific milestones, on which many faculty members are not well versed.

Although every Residency Review Committee has developed its own specialty-specific milestones, they are not meant to replace the end-of-rotation evaluations. Rather, they are summative descriptions of the cognitive, affective, and psychomotor domains of a resident's performance over each six-month period of training. Furthermore, the provided milestones are neither rotation specific nor specific to the level of training.

Methods

To address some of our challenges in implementing the NAS at our residency program, we conducted a self-study to look at options available to empower our newly formed Clinical Competency Committee. We started with our goals and objectives and our end-of-rotation evaluation for each index rotation.

We had two options available:

  1. Use the specialty-specific milestones to rewrite a new set of goals and objectives, and end-of-rotation evaluations.
  2. Use the current goals and objectives, and keep the same evaluation system after mapping each question in the evaluations to the appropriate milestone that matched the level of complexity of the question.

Given that we have more than 50 different rotations over the course of 5 years of training, the first option would have been a tremendously time-consuming and labor-intensive task. There was also a concern with the lack of proper faculty training on the new evaluation system. The second option seemed more practical because the required adjustments would be much easier to achieve.

We mapped every question in each end-of-rotation evaluation to reflect the appropriate level milestone, as shown in the Sidebar: Existing Evaluation Questions Mapped to the Milestones. This was feasible because most commercial evaluation software programs support such features.
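To illustrate the mapping logic, the sketch below maps evaluation questions to milestone codes and averages scores per milestone. The question identifiers, milestone codes, and scores are hypothetical; our actual mapping was configured inside commercial evaluation software rather than written as code.

```python
# Hypothetical mapping of end-of-rotation evaluation questions to
# specialty-specific milestone codes (identifiers are illustrative only).
QUESTION_TO_MILESTONE = {
    "Q1_operative_technique": "PC3",    # patient care milestone
    "Q2_preop_planning": "PC1",
    "Q3_communicates_with_team": "ICS1",
    "Q4_evidence_based_practice": "PBLI1",
}

def scores_by_milestone(evaluation: dict[str, float]) -> dict[str, float]:
    """Average the question scores that map to each milestone."""
    totals: dict[str, list[float]] = {}
    for question, score in evaluation.items():
        milestone = QUESTION_TO_MILESTONE.get(question)
        if milestone is not None:
            totals.setdefault(milestone, []).append(score)
    return {m: sum(s) / len(s) for m, s in totals.items()}

# Example: one completed evaluation on the software's 0-to-9 scale.
evaluation = {
    "Q1_operative_technique": 7.0,
    "Q2_preop_planning": 6.0,
    "Q3_communicates_with_team": 8.0,
}
print(scores_by_milestone(evaluation))  # {'PC3': 7.0, 'PC1': 6.0, 'ICS1': 8.0}
```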

Once a resident's evaluation was completed, we could generate reports that reflected the resident's performance on each milestone. However, these programs do not yet offer the 0-to-4 Likert scale required by the NAS, so scores from our software's 0-to-9 scale had to be converted to the 0-to-4 scale.
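The exact conversion rule is not published in this article; a simple linear rescaling from the 0-to-9 scale to the 0-to-4 scale would look like the following sketch (an assumption for illustration, not necessarily the mapping our software used):

```python
def rescale_0_9_to_0_4(score: float) -> float:
    """Linearly rescale a 0-to-9 Likert score onto the 0-to-4 NAS scale."""
    if not 0 <= score <= 9:
        raise ValueError("score must be between 0 and 9")
    return round(score * 4 / 9, 2)

print(rescale_0_9_to_0_4(7.0))  # 3.11
```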

Results

To address all the earlier-mentioned issues, we created a dashboard (a user interface on a computer display) using a spreadsheet program (Microsoft Excel 2010; Microsoft Corporation, Redmond, WA). The dashboard provided a user interface for entering the averages derived from the evaluations linked to each milestone in our electronic evaluation system.

A "modifiers" category was created for entering evaluation criteria on the basis of program-specific categories not included in our electronic evaluation system. Each category was divided into 4 levels with gradient bonuses or penalties scaled by postgraduate year (PGY). The assumption underlying the scaling was that fifth-year residents should be held to a higher standard and thus subject to more dramatic penalties and bonuses. A goal score of 1.5 was set for PGY-1, 2.0 for PGY-2, 2.5 for PGY-3, 3.0 for PGY-4, and 3.5 for PGY-5. The scale was such that if the resident achieved minimal goals as set by the Clinical Competency Committee, then s/he would meet or exceed the goal as signified by the gray line on the radar plot. The possibility existed for a resident to reach approximately 1 level above his/her training in particular categories. A radar plot was chosen because of the ease of graphical display of multiple data points simultaneously.

Additional features included the ability to save entered data for each resident, including semiannual evaluations, for side-by-side comparison of each resident's progress. The summary page provided a 1-page analysis of a resident's progress by combining a radar plot display of milestone achievements with program-specific modifiers and raw data in a table format (Figure 1). On the left side of the summary page we placed a snapshot of a resident's performance on the milestones. In the middle of the page we gave a snapshot of the performance on the modifiers. On the right of the page, a radar plot was generated to summarize the overall performance in a visual representation (Figure 2). The software was programmed to convert the Likert scale of 0 to 9 to the Likert scale of 0 to 4. This change allowed us to factor in modifiers that are important for residents' progress, such as logging hours and cases on time; performance on the American Board of Surgery in-training examination; performance on the objective structured clinical examination; scholarly activities; and evaluations by peers, students, and nurses.
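A minimal sketch of the radar plot idea is shown below in Python with matplotlib. Our actual dashboard was built in Excel; the category names and scores here are hypothetical, and the colors follow the convention described in the next paragraph (red target line, blue resident line):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical competency categories and scores on the 0-to-4 scale.
categories = ["Patient Care", "Medical Knowledge", "Professionalism",
              "Communication", "Systems-Based Practice", "PBLI"]
resident = [2.1, 1.8, 2.4, 2.0, 1.7, 1.9]  # resident's current scores
goal = [2.0] * len(categories)             # PGY-2 target of 2.0

# Compute one angle per category, then close the polygon by
# repeating the first point at the end of each series.
angles = np.linspace(0, 2 * np.pi, len(categories), endpoint=False).tolist()
angles += angles[:1]
resident += resident[:1]
goal += goal[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, goal, color="red", label="Target (PGY-2)")
ax.plot(angles, resident, color="blue", label="Resident")
ax.fill(angles, resident, color="blue", alpha=0.1)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories)
ax.set_ylim(0, 4)
ax.legend(loc="lower right")
plt.show()
```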

The target performance for each resident was programmed to be specific to the level of training. The target level was colored red, and the resident's performance was colored blue for easier visual identification. The visual representation offered by the radar plot was of paramount importance and represented a good starting point for members of the Clinical Competency Committee during deliberation about the progress of the resident. Furthermore, residents were provided with their own radar plots to help them envision and monitor their performance over time.

The initial evaluation of our dashboard was promising. It is simple to use and practical, and it reduced the time needed for each resident evaluation to 5 to 10 minutes, compared with the 45 to 60 minutes reported by the residency program directors who entered milestone evaluations before us. So far, we have completed 1 year of evaluations (2 cycles), which supports our hypothesis regarding the utility of this new tool.


Conclusion

Our residency program was able to create a tool to store and evaluate residents' progress through both graphical and numeric tracking on a 1-page document (dashboard), in compliance with the milestones set forth by the ACGME, while also incorporating program-specific modifiers.

This platform can easily be shared with other programs to help them catch up during this critical transition period.

Disclosure Statement

The author(s) have no conflicts of interest to disclose.

Acknowledgment

Kathleen Louden, ELS, provided editorial assistance.

References
1. Goroll AH, Sirio C, Duffy FD, et al; Residency Review Committee for Internal Medicine. A new model for accreditation of residency programs in internal medicine. Ann Intern Med 2004 Jun 1;140(11):902-9. DOI: https://doi.org/10.7326/0003-4819-140-11-200406010-00012.
2. Nasca TJ, Philibert I, Brigham TP, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med 2012 Mar 15;366(11):1051-6. DOI: https://doi.org/10.1056/NEJMsr1200117.
