Navigating the Next Accreditation System: A Dashboard for the Milestones
Introduction: In July 2014, all residency programs accredited by the Accreditation Council for Graduate Medical Education (ACGME) were enrolled in a new system called the Next Accreditation System. Residency programs may not be clear on how best to comply with these new accreditation requirements. Large amounts of data must be collected, evaluated, and submitted twice a year to the council's Web-based data collection system. One challenge is that the new "end-of-rotation" evaluations must reflect specialty-specific milestones, on which many faculty members are not well versed. Like other residency programs, we tried to address the challenges using our local resources.
Beginning in 1999, the Accreditation Council for Graduate Medical Education (ACGME) launched a series of changes that revamped medical education in the US. The accreditation of every academic residency program shifted from being "process oriented" to "outcome oriented" through implementation of teaching and assessment of six core competencies. In July 2013, public and political pressure to train physicians capable of practicing cost-conscious and patient-centered care led to the development and implementation of the Next Accreditation System (NAS). A year later, all residency programs were enrolled in the NAS.
The ACGME's primary goal for the NAS is to transform accreditation into a less administratively burdensome process.1,2 The council envisions 2 mechanisms to achieve this goal. First, it wants to create a continuous accreditation model via annual data submission to the ACGME. The Residency Review Committees will evaluate trends in key performance measures on an annual basis, thus eliminating the 1- to 5-year review cycles and the Program Information Form. Instead, programs will complete a self-study every 10 years. Unlike the old Program Information Form's periodic follow-ups, the self-study will allow programs to present their innovative achievements.
Second, the Residency Review Committees' primary mechanism to confirm program compliance with published educational standards will be the biannual data submission. The submitted information reflects the dynamic changes in the residency program.
Two additional data sources are used:
What is the Problem?
Residency programs are still not clear on how best to comply with the new NAS requirements. Large amounts of data must be collected, evaluated, and submitted twice a year to the ACGME Web-based data collection system. Some of the challenges are not new, such as delays in completing faculty evaluations on time. To further complicate matters, the new "end-of-rotation" evaluations must reflect the specialty-specific milestones, with which many faculty members are not well versed.
Although every Residency Review Committee has developed its own specialty-specific milestones, they are not meant to replace the end-of-rotation evaluations. Rather, they are summative descriptions of the cognitive, affective, and psychomotor domains of a resident's performance, assessed every six months of training. Furthermore, the provided milestones are neither rotation specific nor specific to the level of training.
To address some of our challenges in implementing the NAS at our residency program, we conducted a self-study to look at options available to empower our newly formed Clinical Competency Committee. We started with our goals and objectives and our end-of-rotation evaluation for each index rotation.
We had two options available: develop new milestone-based end-of-rotation evaluations for every rotation, or map the questions in our existing evaluations to the appropriate milestones.
Given that we have more than 50 different rotations over the course of 5 years of training, the first option would have been a tremendously time-consuming and labor-intensive task. There was also a concern with the lack of proper faculty training on the new evaluation system. The second option seemed more practical because the required adjustments would be much easier to achieve.
We mapped every question in each end-of-rotation evaluation to the appropriate milestone level, as shown in the Sidebar: Existing Evaluation Questions Mapped to the Milestones. This was feasible because most commercial evaluation software programs support such features.
Once the resident's evaluation was completed, we could generate reports that reflected the resident's performance on each milestone. However, these programs do not yet offer the 0-to-4 Likert scale required by the NAS, so scores from our evaluation system's 0-to-9 scale had to be converted to the 0-to-4 scale.
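The article does not prescribe a particular conversion formula; a simple linear rescaling is one straightforward approach, sketched below in Python (the function name and the rounding to two decimals are illustrative assumptions, not part of the dashboard's specification):

```python
def convert_likert(score_0_to_9: float) -> float:
    """Linearly rescale a 0-to-9 Likert score onto the NAS 0-to-4 scale.

    Rounding to two decimals is an assumption for readability.
    """
    if not 0 <= score_0_to_9 <= 9:
        raise ValueError("score must be between 0 and 9")
    return round(score_0_to_9 * 4 / 9, 2)

# Example: a rotation-evaluation average of 6.75 maps to 3.0 on the NAS scale.
print(convert_likert(6.75))  # 3.0
```

A linear mapping preserves the ordering and relative spacing of the original scores; a program could instead choose anchored cut points if its committee prefers criterion-based conversion.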
To address all the earlier-mentioned issues, we created a dashboard (a user interface on a computer display) using a spreadsheet program (Microsoft Excel 2010; Microsoft Corp, Redmond, WA). The dashboard provided an interface for entering averages derived from evaluations linked to each milestone in our electronic evaluation system.
A "modifiers" category was created for entering evaluation criteria on the basis of program-specific categories not included in our electronic evaluation system. Each category was divided into 4 levels with gradient bonuses or penalties scaled by postgraduate year (PGY). The assumption underlying the scaling was that fifth-year residents should be held to a higher standard and thus subject to more dramatic penalties and bonuses. A goal score of 1.5 was set for PGY-1, 2.0 for PGY-2, 2.5 for PGY-3, 3.0 for PGY-4, and 3.5 for PGY-5. The scale was such that if the resident achieved minimal goals as set by the Clinical Competency Committee, then s/he would meet or exceed the goal as signified by the gray line on the radar plot. The possibility existed for a resident to reach approximately 1 level above his/her training in particular categories. A radar plot was chosen because of the ease of graphical display of multiple data points simultaneously.
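The goal-score progression and the PGY-scaled modifiers described above can be illustrated with a short sketch. The goal scores below come from the text; the modifier bonus/penalty magnitudes and the linear PGY scaling are hypothetical, chosen only to show how senior residents would see larger swings:

```python
def goal_score(pgy: int) -> float:
    """Goal milestone score by postgraduate year, as set by the program:
    PGY-1 -> 1.5, PGY-2 -> 2.0, ... PGY-5 -> 3.5 (i.e., 1.0 + 0.5 * PGY)."""
    if pgy not in range(1, 6):
        raise ValueError("PGY must be 1-5")
    return 1.0 + 0.5 * pgy

# Hypothetical modifier adjustments: each of the 4 levels carries a base
# bonus or penalty, scaled up with PGY so that fifth-year residents face
# more dramatic swings. The actual magnitudes were program-specific.
BASE_ADJUSTMENT = {1: -0.50, 2: -0.25, 3: +0.25, 4: +0.50}

def modifier_adjustment(level: int, pgy: int) -> float:
    """Bonus (positive) or penalty (negative) for a modifier level,
    scaled linearly by PGY (illustrative scaling, not the published one)."""
    return BASE_ADJUSTMENT[level] * (pgy / 5)

print(goal_score(3))              # 2.5 (the PGY-3 goal from the text)
print(modifier_adjustment(4, 5))  # 0.5 for a PGY-5 at the top level
print(modifier_adjustment(4, 1))  # 0.1 for a PGY-1 at the top level
```

Under this scaling, a strong modifier performance could plausibly lift a resident about one goal-score step (0.5) above his or her training level, consistent with the text's observation that residents could reach approximately 1 level above their training in particular categories.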
Additional features included the ability to save entered data for each resident, including semiannual evaluations, for side-by-side comparison of each resident's progress. The summary page provided a 1-page analysis of a resident's progress by combining a radar plot display of milestone achievements with program-specific modifiers and raw data in a table format (Figure 1). On the left side of the summary page we placed a snapshot of a resident's performance on the milestones. In the middle of the page we gave a snapshot of the performance on the modifiers. On the right of the page a radar plot was generated to summarize the overall performance in a visual representation (Figure 2). The software was programmed to convert the Likert scale of 0 to 9 to a Likert scale of 0 to 4. This allowed us to factor in modifiers important to residents' progress, such as logging hours and cases on time; performance on the American Board of Surgery In-Training Examination; performance on the objective structured clinical examination; scholarly activities; and evaluations by peers, students, and nurses.
The target performance for each resident was programmed to be specific to the level of training. The target level was colored red, and the resident's performance was colored blue for easier visual identification. The visual representation offered by the radar plot was of paramount importance and represented a good starting point for members of the Clinical Competency Committee during deliberation about the progress of the resident. Furthermore, residents were provided with their own radar plots to help them envision and monitor their performance over time.
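The dashboard drew its radar plot with Excel's built-in charting, but the underlying geometry — one spoke per category, a polygon for the target and one for the resident — can be sketched in a few lines of Python. The category count and scores below are hypothetical:

```python
import math

def radar_polygon(scores):
    """Return (x, y) vertices of a closed radar-plot polygon.

    One spoke per score, evenly spaced, starting at 12 o'clock and
    proceeding clockwise; the first vertex is repeated to close the shape.
    """
    n = len(scores)
    points = []
    for i, r in enumerate(scores):
        theta = math.pi / 2 - 2 * math.pi * i / n  # clockwise from the top
        points.append((r * math.cos(theta), r * math.sin(theta)))
    points.append(points[0])  # close the polygon
    return points

# Hypothetical PGY-3 resident plotted against the 2.5 goal on six spokes;
# in the dashboard the target polygon was red and the resident's blue.
resident = radar_polygon([2.6, 2.8, 2.4, 3.0, 3.2, 2.5])
target = radar_polygon([2.5] * 6)
print(target[0])  # first spoke points straight up: (~0, 2.5)
```

Because each score becomes a radius on its own spoke, any spoke where the blue polygon falls inside the red one is immediately visible, which is what made the plot a useful starting point for Clinical Competency Committee deliberations.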
The initial evaluation of our dashboard was promising. It is simple to use, practical, and able to reduce the time needed for each resident evaluation to 5 to 10 minutes from the 45 to 60 minutes reported by residency program directors who entered milestone evaluations before us. So far, we have completed one year of evaluations (2 cycles), which supports our hypothesis regarding the utility of this new tool.
Our residency program was able to create a tool to store and evaluate residents' progress through both graphical and numeric tracking on a one-page document (dashboard) in compliance with the milestones set forth by the ACGME while also incorporating program-specific modifiers.
This platform can be easily shared with other programs to help them navigate this critical transition period.
The author(s) have no conflicts of interest to disclose.
Kathleen Louden, ELS, provided editorial assistance.
1. Goroll AH, Sirio C, Duffy FD, et al; Residency Review Committee for Internal Medicine. A new model for accreditation of residency programs in internal medicine. Ann Intern Med 2004 Jun 1;140(11):902-9. DOI: http://dx.doi.org/10.7326/0003-4819-140-11-200406010-00012.