Exploring the Reality of Using Patient Experience Data to Provide Resident Feedback: A Qualitative Study of Attending Physician Perspectives

Steffanie Campbell, MD; Heather Honoré Goltz, PhD, LMSW, MEd; Sarah Njue, MPH; Bich Ngoc Dang, MD

Perm J 2016 Summer;20(3):15-154
https://doi.org/10.7812/TPP/15-154
E-pub: 07/05/2016

ABSTRACT

Introduction: Little is known about the attitudes of faculty and residents toward the use of patient experience data as a tool for providing resident feedback. The purpose of this study was to explore the attitudes of teaching faculty surrounding patient experience data and how those attitudes may influence the feedback given to trainees.

Introduction

Interpersonal and communication skills constitute one of the Accreditation Council for Graduate Medical Education’s (ACGME’s) 6 domains of clinical competencies for graduate medical education in internal medicine. The ACGME supports the use of patient experience data as an outcomes-based tool for providing resident feedback on interpersonal and communication skills.1 The American Board of Internal Medicine is exploring ways to integrate this outcomes-based approach into their physician certification activities.2

Patient experience data can serve as an effective tool for providing residents with feedback. In a study by Cope and colleagues,3 residents in an internal medicine training program were randomized to receive a 30-minute structured feedback session in which they received mean scores on an experience survey filled out by new patients. Residents in the intervention arm had a significant increase in mean scores on a subsequent survey of new patients compared with residents who did not receive feedback.
Patient experience data paired with actionable feedback (ie, feedback that can change residents’ practice behavior) can be highly effective when provided by trained individuals.4 Although data suggest actionable feedback has a positive impact on residents’ practice behaviors, many graduate medical education programs have difficulty translating this knowledge into real-world practice.5-12 Studies evaluating feedback-based interventions typically devote immense resources to the development and training of personnel who deliver actionable feedback, an approach that is generally neither feasible nor designed for implementation in general practice.13-20 As such, patient experience survey data are rarely used effectively outside the research context to deliver resident feedback, owing to lack of either training or time.6,11,12,21 In addition, little is known about the attitudes of faculty and residents toward the use of patient experience data as a tool for providing resident feedback.22

The purpose of this study was to explore how attending physicians in a real-world academic setting incorporate patient experience survey data into feedback practices, to explore the attitudes and beliefs surrounding the use of patient experience data as a feedback tool, and to identify potential areas for improvement. Specifically, we were interested in exploring attendings’ attitudes around giving feedback and in understanding the process by which attendings provide learners with actionable feedback.

Methods

Participants

The study population was based on a nonrandomized convenience sample of attending physicians who precept residents in internal medicine at two continuity clinics in Houston, TX (clinics A and B). Eligibility criteria included 1) faculty with an appointment in the Department of Internal Medicine and 2) faculty with a role as a preceptor in the internal medicine resident continuity clinic. This study was approved by the institutional review board for our institution.

Data Collection
Research Team and Reflexivity

The research team’s professional backgrounds and research interests informed development of the interview guide, interpretation of codes, and understanding of emergent themes within the context of medical education and patient care. Our multidisciplinary team consisted of two physicians, a social work researcher, and a research coordinator. SC, Associate Program Director of the Internal Medicine Residency Program, ensures quality education and training for residents. BND is an Assistant Professor of Medicine in the Section of Infectious Diseases. Her research examines the use of patient experience metrics as a modifiable focus for improving retention in care and adherence to medicines. HHG, Assistant Professor in Social Work, is experienced in qualitative research methods; she is interested in patients’ access to and quality of care. SN is a master’s-trained public health professional with a background in health promotion and behavioral science.

Data Analysis

We did not use an a priori code list. Four researchers (SC, HHG, SN, and BND) independently reviewed the transcripts and coded the data, looking for examples of facilitators and barriers to actionable feedback. The full research team then came together to compare codes and iteratively revise and refine codes until 100% consensus was reached. This occurred during several weekly team meetings. In the later stages of analysis, the team examined recurrent themes across interviews and clinic sites.

Results

Characteristics of Participants

The participation rate among eligible attending physicians was 75% (9/12). Nonparticipating physicians reported demanding clinical duties and the lack of time as reasons for opting out. Baseline characteristics are outlined in Table 1. Given the small sample size, limited demographic characteristics are reported to preserve confidentiality. Five were female, and 4 reported their race/ethnicity as Asian.
Five participants precept residents at clinic A and 4 at clinic B.

Description of Clinic, Patient Panel Assignment, and Patient Experience Survey
Barriers to Actionable Feedback

Specific patterns of feedback varied by clinic site; however, some core themes did emerge from the data. The research team identified six themes corresponding to potential barriers in using patient experience survey data to provide actionable feedback to residents: 1) perceived inability of residents to learn or incorporate feedback, 2) punitive nature of feedback, 3) lack of training in the use of patient experience data to give feedback, 4) lack of timeliness in providing feedback, 5) unclear benefit of patient experience data as a tool to inform and frame actionable feedback, and 6) lack of individualized feedback.

Perceived Inability of Residents to Learn or Incorporate Feedback

On occasion, attending physicians seemed resigned to the belief that it is difficult to change residents’ practice behavior. They cited difficulties in teaching adult learners and difficulties in teaching “soft skills” (eg, personal attributes). Three attending physicians specifically reported difficulty in teaching “professionalism.”

“It’s hard to change behavior for adults … . Just because they’re trainees, we should not forget the fact that they are adults and they’re supposed to be professionals, you know, so there’s only so much I can do …”

“But you can’t change personalities and habits of people [who are] old. You can do your best, but professionalism is a very difficult thing to teach, and it’s professionalism in not just how you look or how you show up, but it’s also the amount of effort you put forth in what your actual duties are, you know, and how much you can relate and communicate to the patient. So it’s hard to teach soft skills. You can do your best with a resident that you might have for three years, but some personalities don’t change.”

—Attending physicians at clinics A and B

Punitive Nature of Feedback

Punitive feedback refers to any negative approach to providing feedback.
Three of four participants at clinic B reported that they approached underperforming residents in a nonpunitive way to address patient concerns or improve their clinical competence. These attending physicians engaged the resident in coming up with task-specific and actionable solutions.

“I talked to the resident about what she thought had happened, … and then we kind of brainstormed kind of what we thought had gone wrong between that. And she asked me was there anything I thought uh could be done better in the- the situation, um and then we kind of wrote back to the patient what had happened, which I think was just a miscommunication thing.”

“So I learned that we have to be sensitive but at the same time we have to get to the point because if you’re too sensitive you’re being too nice. And if they don’t get the message then you’re not getting to the feedback.”

—Attending physicians at clinic B

In contrast, attendings at clinic A reported using punitive feedback in response to the residents’ actions, such as removing patients from the residents’ panel.

“So that is why they [the patients] don’t want to write bad things, but they will come and talk to us in person; especially because many of them, if you’ve been seeing them for several years, they do understand that these people are in training, which is okay to some extent.
And some say, ‘No, I don’t want to see a resident; I want to see the attending.’ Then we just change the patient back to us [attendings].”

“But some of my patients I have actually removed from his panel and either brought them back to me or put them with another person that I know is better at listening and communication.”

“But it’s important for the resident that he at least gets a feel that we are watching them and patients do have their opinions.”

—Attending physicians at clinic A

Lack of Training in the Use of Patient Experience Data to Give Feedback

Participants were asked if they received specific training in using patient experience survey data to provide actionable feedback. Two of the five participants at clinic A reported taking a three-hour institutional workshop two years prior. Per their report, the workshop explained how to provide feedback to residents, but not specifically how to incorporate patient experience data into feedback practices.

“There’s a course at [my institution] about how to evaluate residents and other groups … . The one that we specifically had taken was how to complete evaluations.”

—Attending physician at clinic A

However, three of the four participants at clinic B reported no formal training in these areas.

“No, I mean, no formal training to start off with, except I mean- I mean, we had, you know, teaching as residents and a lot of teaching built into our primary care residency.”

—Attending physician at clinic B

Lack of Timeliness in Providing Feedback

Lack of timeliness refers to delays in providing feedback to the residents. Branch and Paranjape24 suggest residents should receive feedback at least every two to three months. In our analysis, two attending physicians at clinic B reported completing evaluations twice a year; they reported time constraints as a barrier to timely feedback.
“It’s time consuming because I have 28 residents.”

“We just do it electronically so we don’t actually have that feedback like oral feedback session because they come at different times.”

—Attending physicians at clinic B

“I’d rather not deal with it than deal with that because then you’re sending more work for me … . Every year, every year now, I have one that is wasting my time.”

—Attending physician at clinic A

Unclear Benefit of Patient Experience Data as a Tool to Inform and Frame Actionable Feedback

Benefit refers to the degree to which the attending physicians consider patient experience survey data a beneficial tool for providing actionable feedback. Only one of five participants at clinic A reported that the surveys were a suitable tool for providing feedback to the residents. Of the nine participants overall, eight questioned the value of patient experience survey data in providing resident feedback. These attendings reported that the surveys were not beneficial; they felt that the information obtained from the surveys was insufficient to address patient issues or give effective feedback to the residents.

“I don’t see that they are a big help to the resident, nor to me, unless the patient very specifically writes something, you know, out of the ordinary that the resident did, whether it be egregious or something positive. Short of that, I don’t see that they are very helpful evaluations to me or the resident, in their current state.”

“Yeah, because I’m not getting that much useful information except uh ‘wonderful doctor,’ ‘the best,’ … but no really constructive feedback … . We [are] doing it [patient experience surveys] just to meet this [Accreditation Council for Graduate Medical Education] requirement but yet they’re not learning … . There’s no feedback and they’re not learning what they should do to improve themselves. There’s no purpose of doing the evaluation … .
So it’s a little more difficult and there’s no details on those and so it’s a little harder to give feedback … . I don’t see any comments … at all so it’s hard to give the feedback.”

“I don’t think the residents care too much. They get evaluated so many ways and so many times a year.”

—Attending physicians at clinics A and B

Most attendings did not like the survey format. They preferred open-ended questions through which patients could provide specific examples and task-specific feedback.

Lack of Resident-Centered Feedback

Resident-centered feedback is feedback that engages the resident in discussion and allows for shared goal setting.4 One participant in clinic B reported delivering feedback by having a face-to-face conversation. In contrast, the other three participants in clinic B reported providing feedback electronically; they cited lack of time and the high number of assigned residents as barriers to resident-centered feedback.

“Unfortunately we don’t sit down … We don’t sit down with any one of them except the ones who actually um have difficulty. Then we meet, we talk to that person personally, but other than that we just do it electronically so we don’t actually have that feedback-like oral feedback session.”

—Attending physician at clinic B

“I mean, ideally, yes, it would be lovely to have them come, sit, go through everything, see how you’re doing, whatever, but there’s so many of them.”

—Attending physician at clinic B

These attendings acknowledged that use of an electronic medium alone can create a barrier to resident-centered feedback because it does not provide an opportunity for the resident to reflect, comment, or engage in the solution-making process.

Discussion

This study provides insight into how attending physicians use patient-reported experience measures to provide feedback for residents in an internal medicine training program.
We identified six core themes influencing the use of patient experience data in providing resident feedback: 1) perceived inability of residents to learn or to incorporate feedback, 2) punitive nature of feedback, 3) lack of training in the delivery of actionable feedback, 4) lack of timeliness in the delivery of feedback, 5) unclear benefit of patient experience survey data as a tool for providing resident feedback, and 6) lack of individualized feedback.

In 2001, the Institute of Medicine codified patient-centeredness as one of six health care quality aims.25 Patient experience is a critical facet of patient-centeredness. Moreover, studies have linked better patient experiences to favorable health behaviors and outcomes.26-39 In alignment with this aim, the Institute of Medicine advocates the use of patient experience data as a patient-centered tool for promoting quality care. Concrete patient experience data can define key points of intervention for improving the care experience. These data argue for greater training on the use of patient experience survey data to effect practice change and ultimately to improve health behaviors and outcomes. Physicians in training are an ideal population in which to intervene because they are at an early stage in their careers and may be more malleable.8,40-43 Thus, actionable feedback may have a greater effect on their practice behaviors.
An important strength of our study is that it is one of the first to explore how patient experience data are incorporated into the resident feedback process. We identified six core themes that residency programs can use in assessing and modifying their own resident feedback processes. On the basis of our findings, we believe that patient experience data can be successfully used to augment existing evaluation processes.

Limitations

The findings of our study should be interpreted with the following limitations in mind. Although our sample size is small, our participation rate of 75% is acceptable for exploratory analyses. In a qualitative study using 60 interviews, core themes were present as early as 6 interviews, and data saturation was reached at 12 interviews.51 Although we collected data at 2 very different institutions, these institutions are affiliated with the same academic center; thus, our findings may not be generalizable. We were also compelled to combine data from individual interviews and focus groups. In a focus group, there is the concern that 1 or 2 individuals can dominate the conversation; however, there is also the opportunity for individuals to motivate each other to express their thoughts. It has been noted that integration of these 2 methods may provide data enrichment.52

Conclusion

Graduate medical education programs may want to conduct their own internal assessments of the resident feedback process. Such assessments should review how patient experience data are incorporated into the resident feedback process and how, if at all, their faculty are trained to provide such feedback. We believe there is value in adhering to the ACGME guidelines in both spirit and content so that residents emerge from training with greater competency in interpreting and using patient experience data to improve their interpersonal and communication behaviors.
Disclosure Statement

This work was supported in part by the facilities and resources of the Center for Innovations in Quality, Effectiveness and Safety at the Michael E DeBakey VA Medical Center (#CIN 13-413), and the facilities and resources of Harris Health System. The views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs. The author(s) have no other conflicts of interest to disclose.

Acknowledgements

We thank Aanand D Naik, MD, and Sylvia J Hysong, PhD, for their critical review of an earlier draft of this manuscript.

How to Cite this Article

Campbell S, Goltz HH, Njue S, Dang BN. Exploring the reality of using patient experience data to provide resident feedback: A qualitative study of attending physician perspectives. Perm J 2016 Summer;20(3):15-154. DOI: https://doi.org/10.7812/TPP/15-154.

References

1. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med 2012 Mar 15;366(11):1051-6. DOI: https://doi.org/10.1056/NEJMsr1200117.