Exploring the Reality of Using Patient Experience Data to Provide Resident Feedback: A Qualitative Study of Attending Physician Perspectives


Steffanie Campbell, MD; Heather Honoré Goltz, PhD, LMSW, MEd; Sarah Njue, MPH; Bich Ngoc Dang, MD

Perm J 2016 Summer;20(3):15-154

E-pub: 07/05/2016


Introduction: Little is known about the attitudes of faculty and residents toward the use of patient experience data as a tool for providing resident feedback. The purpose of this study was to explore the attitudes of teaching faculty surrounding patient experience data and how those attitudes may influence the feedback given to trainees.
Methods: From July 2013 to August 2013, we conducted in-depth, face-to-face, semistructured interviews with 9 attending physicians who precept residents in internal medicine at 2 continuity clinics (75% of eligible attendings). Interviews were coded using conventional content analysis.
Results: Content analysis identified six potential barriers in using patient experience survey data to provide feedback to residents: 1) perceived inability of residents to learn or to incorporate feedback, 2) punitive nature of feedback, 3) lack of training in the delivery of actionable feedback, 4) lack of timeliness in the delivery of feedback, 5) unclear benefit of patient experience survey data as a tool for providing resident feedback, and 6) lack of individualized feedback.
Conclusion: Programs may want to conduct an internal review on how patient experience data is incorporated into the resident feedback process and how, if at all, their faculty are trained to provide such feedback.


Introduction

Interpersonal and communication skills constitute one of the Accreditation Council for Graduate Medical Education’s (ACGME’s) 6 domains of clinical competencies for graduate medical education in internal medicine. The ACGME supports the use of patient experience data as an outcomes-based tool for providing resident feedback on interpersonal and communication skills.1 The American Board of Internal Medicine is exploring ways to integrate this outcomes-based approach into their physician certification activities.2 Patient experience data can serve as an effective tool for providing residents with feedback. In a study by Cope and colleagues,3 residents in an internal medicine training program were randomized to receive a 30-minute structured feedback session in which they received mean scores on an experience survey filled out by new patients. Residents in the intervention arm had a significant increase in mean scores on a subsequent survey of new patients compared with residents who did not receive feedback. Patient experience data paired with actionable feedback (ie, feedback that can change residents’ practice behavior) can be highly effective when provided by trained individuals.4

Although data suggest actionable feedback has a positive impact on residents’ practice behaviors, many graduate medical education programs have difficulty translating this knowledge into real-world practice.5-12 Studies evaluating feedback-based interventions typically devote immense resources to developing and training personnel to deliver actionable feedback, in ways that are neither feasible nor designed for implementation in general practice.13-20 As such, patient experience survey data are rarely used effectively outside the research context to deliver resident feedback, owing to lack of either training or time.6,11,12,21 In addition, little is known about the attitudes of faculty and residents toward the use of patient experience data as a tool for providing resident feedback.22 The purpose of this study was to explore how attending physicians in a real-world academic setting incorporate patient experience survey data into feedback practices, to explore the attitudes and beliefs surrounding the use of patient experience data as a feedback tool, and to identify potential areas for improvement. Specifically, we were interested in exploring attendings’ attitudes around giving feedback and understanding the process by which attendings provide learners with actionable feedback.



Methods

The study population was based on a nonrandomized convenience sample of attending physicians who precept residents in internal medicine at two continuity clinics in Houston, TX (clinics A and B). Eligibility criteria included 1) faculty with an appointment in the Department of Internal Medicine and 2) faculty with a role as a preceptor in the internal medicine resident continuity clinic. This study was approved by the institutional review board for our institution.

Data Collection

Participants were recruited by e-mail; attending physicians received an e-mail from the Associate Program Director of the Internal Medicine Residency Program (SC) inviting them to take part in the study and informing them of its purpose. Twelve attending physicians met eligibility criteria and were recruited; 9 participated. Between July and August 2013, SC conducted in-depth, face-to-face, semistructured interviews with attending physicians: 4 individual interviews at clinic B and a focus group interview involving 5 participants at clinic A. The focus group format was used at clinic A at the request of the clinic director, given the time constraints of the attending staff; staff members were willing to complete a group interview during their lunch hour but were unable to dedicate an hour individually for interviews. Participants provided verbal but not written informed consent to protect their identities. No compensation was provided for participation. The individual interviews lasted 30 to 60 minutes, and the focus group lasted 60 minutes. Interviews were audiotaped using an encrypted recorder and transcribed verbatim by professional transcriptionists. The interviews were conducted using an open-ended interview guide developed by the multidisciplinary team; the guide consisted of open-ended questions to identify the process of feedback, the attitudes and beliefs surrounding feedback, and the training given to attending physicians to use patient experience survey data in providing resident feedback (see Sidebar: Major Topics and Key Interview Questions in Study of Resident Feedback). Interviews took place in conference rooms at the participants’ respective clinic sites.

Research Team and Reflexivity

The research team’s professional backgrounds and research interests informed development of the interview guide, interpretation of codes, and understanding of emergent themes within the context of medical education and patient care. Our multidisciplinary team consisted of two physicians, a social work researcher, and a research coordinator. SC, Associate Program Director of the Internal Medicine Residency Program, ensures quality education and training for residents. BND is an Assistant Professor of Medicine in the Section of Infectious Diseases. Her research examines the use of patient experience metrics as a modifiable focus for improving retention in care and adherence to medicines. HHG, Assistant Professor in Social Work, is experienced in qualitative research methods; she is interested in patients’ access to and quality of care. SN is a master’s-trained public health professional with a background in health promotion and behavioral science.

Data Analysis

We did not use an a priori code list. Four researchers (SC, HHG, SN, and BND) independently reviewed the transcripts and coded the data, looking for examples of facilitators and barriers to actionable feedback. The full research team then came together to compare codes and iteratively revise and refine codes until 100% consensus was reached. This occurred during several weekly team meetings. In the later stages of analysis, the team examined recurrent themes across interviews and clinic sites.


Results

Characteristics of Participants

The participation rate among eligible attending physicians was 75% (9/12). Nonparticipating physicians reported demanding clinical duties and the lack of time as reasons for opting out. Baseline characteristics are outlined in Table 1. Given the small sample size, limited demographic characteristics are reported to preserve confidentiality. Five were female and 4 reported their race/ethnicity as Asian. Five participants precept residents at clinic A and 4 at clinic B.

Description of Clinic, Patient Panel Assignment, and Patient Experience Survey

Each clinic has a unique structure for assignment of a primary care physician. Clinic A assigns patients to a staff physician as their primary care physician. Residents are assigned to a specific attending, who then designates the resident as an associate physician for 75 to 90 of his or her patients. Attempts are made to schedule follow-up appointments during the time the resident is present to create continuity of care between residents and their patients. At clinic B, patients are assigned a resident physician as their primary care physician. All follow-up appointments are made on the half-day the resident physicians are available, to ensure continuity of care. Resident physicians then discuss care plans with the staff physician available during that half-day. A 9-item survey adapted from resident evaluation tools, including those originating from Saint Mary’s Hospital and Maine Medical Center, is used to measure patients’ experience with residents during a specific encounter (see Sidebar: Patient Experience Survey Questions).23 These items reflect interpersonal and communication skills deemed important on the basis of the program’s educational objectives and the extant literature. Responses are kept anonymous and filed under the resident’s name.

Barriers to Actionable Feedback

Specific patterns of feedback varied by clinic site; however, some core themes did emerge from the data. The research team identified six themes corresponding to potential barriers in using patient experience survey data to provide actionable feedback to residents: 1) perceived inability of residents to learn or incorporate feedback, 2) punitive nature of feedback, 3) lack of training in the use of patient-experience data to give feedback, 4) lack of timeliness in providing feedback, 5) unclear benefit of patient experience data as a tool to inform and frame actionable feedback, and 6) lack of individualized feedback.

Perceived Inability of Residents to Learn or Incorporate Feedback

On occasion, attending physicians seemed resigned to the belief that it is difficult to change residents’ practice behavior. They cited difficulties in teaching adult learners and difficulties in teaching “soft skills” (eg, personal attributes). Three attending physicians specifically reported difficulty in teaching “professionalism.”

“It’s hard to change behavior for adults … . Just because they’re trainees, we should not forget the fact that they are adults and they’re supposed to be professionals, you know, so there’s only so much I can do …”

“But you can’t change personalities and habits of people [who are] old.  You can do your best, but professionalism is a very difficult thing to teach, and it’s professionalism in not just how you look or how you show up, but it’s also the amount of effort you put forth in what your actual duties are, you know, and how much you can relate and communicate to the patient. So it’s hard to teach soft skills. You can do your best with a resident that you might have for three years, but some personalities don’t change.”

—Attending physicians at clinics A and B

Punitive Nature of Feedback

Punitive feedback refers to any negative approach to providing feedback. Three of four participants at clinic B reported that they approached underperforming residents in a nonpunitive way to address patient concerns or improve their clinical competence. These attending physicians engaged the resident in coming up with task-specific and actionable solutions.

“I talked to the resident about what she thought had happened, … and then we kind of brainstormed kind of what we thought had gone wrong between that. And she asked me was there anything I thought uh could be done better in the- the situation, um and then we kind of wrote back to the patient what had happened, which I think was just a miscommunication thing.”

“So I learned that we have to be sensitive but at the same time we have to get to the point because if you’re too sensitive you’re being too nice.  And if they don’t get the message then you’re not getting to the feedback.”

—Attending physicians at clinic B

In contrast, attendings at clinic A reported using punitive feedback in response to the residents’ actions, such as removing patients from the residents’ panel.

“So that is why they [the patients] don’t want to write bad things, but they will come and talk to us in person; especially because many of them, if you’ve been seeing them for several years, they do understand that these people are in training, which is okay to some extent. And some say, ‘No, I don’t want to see a resident; I want to see the attending.’ Then we just change the patient back to us [attendings].”

“But some of my patients I have actually removed from his panel and either brought them back to me or put them with another person that I know is better at listening and communication.”

“But it’s important for the resident that he at least gets a feel that we are watching them and patients do have their opinions.”

—Attending physicians at clinic A

Lack of Training in the Use of Patient Experience Data to Give Feedback

Participants were asked if they received specific training in using patient experience survey data to provide actionable feedback. Two of the five participants at clinic A reported taking a three-hour institutional workshop two years prior. Per their report, the workshop explained how to provide feedback to residents, but not specifically how to incorporate patient experience data into feedback practices.

“There’s a course at [my institution] about how to evaluate residents and other groups … . The one that we specifically had taken was how to complete evaluations.”

—Attending physician at clinic A

However, three of the four participants at clinic B reported no formal training in these areas.

“No, I mean, no formal training to start off with, except I mean- I mean, we had, you know, teaching as residents and a lot of teaching built into our primary care residency.”

—Attending physician at clinic B

Lack of Timeliness in Providing Feedback

Lack of timeliness refers to delays in providing feedback to the residents. Branch and Paranjape24 suggest residents should receive feedback at least every two to three months. In our analysis, two attending physicians at clinic B reported completing evaluations twice a year; they cited time constraints as a barrier to timely feedback.

“It’s time consuming because I have 28 residents.”

“We just do it electronically so we don’t actually have that feedback like oral feedback session because they come at different times.”

—Attending physicians at clinic B

“I’d rather not deal with it than deal with that because then you’re sending more work for me … . Every year, every year now, I have one that is wasting my time.”

—Attending physician at clinic A

Unclear Benefit of Patient Experience Data as a Tool to Inform and Frame Actionable Feedback

Benefit refers to the degree to which the attending physicians consider patient experience survey data a beneficial tool for providing actionable feedback. Only one of the nine participants reported that the surveys were a suitable tool for providing feedback to the residents; the remaining eight questioned the value of patient experience survey data in providing resident feedback. These attendings reported that the surveys were not beneficial; they felt that the information obtained from the surveys was insufficient to address patient issues or to give effective feedback to the residents.

“I don’t see that they are a big help to the resident, nor to me, unless the patient very specifically writes something, you know, out of the ordinary that the resident did, whether it be egregious or something positive. Short of that, I don’t see that they are very helpful evaluations to me or the resident, in their current state.”

“Yeah, because I’m not getting that much useful information except uh “wonderful doctor,”  “the best,”… but no really constructive feedback … . We [are] doing it [patient experience surveys] just to meet this [Accreditation Council for Graduate Medical Education] requirement but yet they’re not learning … . There’s no feedback and they’re not learning what they should do to improve themselves. There’s no purpose of doing the evaluation … . So it’s a little more difficult and there’s no details on those and so it’s a little harder to give feedback … . I don’t see any comments … at all so it’s hard to give the feedback.”

“I don’t think the residents care too much. They get evaluated so many ways and so many times a year.”

—Attending physicians at clinics A and B

Most attendings did not like the survey format. They preferred open-ended questions where patients could provide specific examples and task-specific feedback.

Lack of Resident-Centered Feedback

Resident-centered feedback is feedback that engages the resident in discussion and allows for shared goal setting.4 One participant at clinic B reported delivering feedback by having a face-to-face conversation. In contrast, the other three participants at clinic B reported providing feedback electronically; they cited lack of time and the high number of assigned residents as barriers to resident-centered feedback.

“Unfortunately we don’t sit down … We don’t sit down with any one of them except the ones who actually um have difficulty. Then we meet, we talk to that person personally, but other than that we just do it electronically so we don’t actually have that feedback-like oral feedback session.”

—Attending physician at clinic B

“I mean, ideally, yes, it would be lovely to have them come, sit, go through everything, see how you’re doing, whatever, but there’s so many of them.”

—Attending physician at clinic B

These attendings acknowledged that use of an electronic medium alone can create a barrier to resident-centered feedback because it does not provide an opportunity for the resident to reflect, comment, or engage in the solution-making process.



Discussion

This study provides insight into how attending physicians use patient-reported experience measures to provide feedback for residents in an internal medicine training program. We identified six core themes influencing the use of patient experience data in providing resident feedback: 1) perceived inability of residents to learn or to incorporate feedback, 2) punitive nature of feedback, 3) lack of training in the delivery of actionable feedback, 4) lack of timeliness in the delivery of feedback, 5) unclear benefit of patient experience survey data as a tool for providing resident feedback, and 6) lack of individualized feedback.

In 2001, the Institute of Medicine codified patient-centeredness as one of six health care quality aims.25 Patient experience is a critical facet of patient-centeredness, and studies have linked better patient experiences to favorable health behaviors and outcomes.26-39 In alignment with this aim, the Institute of Medicine advocates the use of patient experience data as a patient-centered tool for promoting quality care. Concrete patient experience data can define key points of intervention for improving the care experience. These data argue for greater training on the use of patient experience survey data to effect practice change and ultimately to improve health behaviors and outcomes. Physicians in training are an ideal population in which to intervene because they are at an early stage in their careers and may be more malleable.8,40-43 Thus, actionable feedback may have a greater effect on their practice behaviors.

Implementation science dictates that a tool or system must be accepted by its stakeholders to be successful.44 Low acceptability of patient experience data was noted in our study and previously at other institutions.45 Thus, methods for increasing attending buy-in on the merits of patient experience measures as tools to inform actionable feedback need to be explored. Increased buy-in could be achieved by involving attending physicians in the implementation process; for example, participants in our study suggested including open-ended questions and comment areas to elicit more detailed patient experience data. Previous studies suggest that medical education programs can benefit from more intense support and from training on how to interpret patient experience survey data and deliver actionable feedback.46,47

One potential method for incorporating patient experience data into actionable feedback for the resident is the use of a framework grounded in feedback-intervention theory. Feedback should be individualized, and recommendations should be solutions oriented (ie, task-specific and actionable). The most effective form of feedback de-emphasizes hierarchy and embraces a supportive dialogue between the attending and resident. Beyond identifying competency gaps, it requires the attending to understand the resident as a learner (ie, understand the resident’s motivations and goal orientation). The attending can then leverage this knowledge to engage and motivate the resident to reflect on his/her performance, and to set goals and develop an action plan to achieve those goals. To close the feedback loop, the attending should follow up to determine whether the resident has made progress in achieving those goals.48,49 The Sidebar: Steps to Using Patient Experience Data to Provide Residents with Actionable Feedback details these steps.
These steps use an individualized, resident-centered, and nonpunitive approach to providing feedback. A feedback sheet (Table 2) provides the resident’s average score on each item of a patient experience survey and compares those scores with those of a peer group. Items on which a resident scores below a certain cut-off point (eg, in the lowest quartile) can identify critical areas for improvement. A plan to improve the identified areas, in the context of specific goals, should then be formulated. For example, if “listening” is identified as an area of weakness, specific goals may be 1) using more eye contact during the visit and 2) making reflective statements to summarize what the patient has said.50 By using this or other identified tools, one can create a robust and effective feedback process.
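The quartile cut-off described above can be made concrete with a minimal sketch (hypothetical item names and scores; not part of the study’s actual survey or tooling) that flags survey items on which a resident’s mean score falls at or below the lowest quartile of the peer group:

```python
def flag_improvement_areas(resident_scores, peer_scores, quartile=0.25):
    """Flag survey items on which the resident's mean score falls at or
    below the chosen quartile of the peer distribution for that item."""
    flagged = []
    for item, score in resident_scores.items():
        peers = sorted(peer_scores[item])
        # Score at the lowest-quartile position among peers (a simple
        # rank-based cut-off; real tools may interpolate differently).
        cutoff = peers[max(0, int(len(peers) * quartile) - 1)]
        if score <= cutoff:
            flagged.append(item)
    return flagged

# Hypothetical mean scores (1-5 scale) on two survey items
resident = {"listening": 3.1, "explains clearly": 4.6}
peers = {
    "listening": [4.2, 4.5, 3.9, 4.8, 4.1, 4.6, 4.0, 4.7],
    "explains clearly": [4.1, 4.3, 4.0, 4.5, 3.8, 4.4, 4.2, 4.6],
}
print(flag_improvement_areas(resident, peers))  # ['listening']
```

Here “listening” would be flagged, prompting the goal-setting conversation described above rather than a punitive response.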

An important strength of our study is that it is among the first to explore how patient experience data is incorporated into the resident feedback process. We identified six core themes that residency programs can use in assessing and modifying their own resident feedback processes. On the basis of our findings, we believe that patient experience data can be successfully used to augment existing evaluation processes.



Limitations

The findings in our study should be interpreted with the following limitations in mind. Although our sample size is small, our participation rate of 75% is acceptable for exploratory analyses; in a qualitative study using 60 interviews, core themes were present as early as 6 interviews and data saturation was reached at 12 interviews.51 Although we collected data at 2 very different institutions, these institutions are affiliated with the same academic center, so our findings may not be generalizable. We also combined data from individual interviews and a focus group. In a focus group, there is the concern that 1 or 2 individuals can dominate the conversation; however, there is also the opportunity for individuals to motivate each other to express their thoughts, and it has been noted that integrating these 2 methods may provide data enrichment.52


Conclusion

Graduate medical education programs may want to conduct their own internal assessment of the resident feedback process. Such assessments should review how patient experience data is incorporated into the resident feedback process and how, if at all, their faculty are trained to provide such feedback. We believe there is value in adhering to the ACGME guidelines in both spirit and content so that residents emerge from training with greater competency in interpreting and using patient experience data to improve their interpersonal and communication behaviors.

Disclosure Statement

This work was supported in part by the facilities and resources of the Center for Innovations in Quality, Effectiveness and Safety at the Michael E DeBakey VA Medical Center (#CIN 13-413), and the facilities and resources of Harris Health System. The views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs.

The author(s) have no other conflicts of interest to disclose.


Acknowledgments

We thank Aanand D Naik, MD, and Sylvia J Hysong, PhD, for their critical review of an earlier draft of this manuscript.

Mary Corrado, ELS, provided editorial assistance.

How to Cite this Article

Campbell S, Goltz HH, Njue S, Dang BN. Exploring the reality of using patient experience data to provide resident feedback: A qualitative study of attending physician perspectives. Perm J 2016 Summer;20(3):15-154. DOI: https://doi.org/10.7812/TPP/15-154.

References

1.    Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med 2012 Mar 15;366(11):1051-6. DOI: https://doi.org/10.1056/NEJMsr1200117.
    2.    Earning Maintenance of Certification points [Internet]. Philadelphia, PA: American Board of Internal Medicine; 2004-2016 [cited 2016 Mar 4]. Available from: www.abim.org/maintenance-of-certification/earning-points.aspx.
    3.    Cope DW, Linn LS, Leake BD, Barrett PA. Modification of residents’ behavior by preceptor feedback of patient satisfaction. J Gen Intern Med 1986 Nov-Dec;1(6):394-8. DOI: https://doi.org/10.1007/BF02596425.
    4.    Hysong SJ, Best RG, Pugh JA. Audit and feedback and clinical practice guideline adherence: making feedback actionable. Implement Sci 2006 Apr 28;1:9. DOI: https://doi.org/10.1186/1748-5908-1-9.
    5.    Al-Mously N, Nabil NM, Al-Babtain SA, Fouad Abbas MA. Undergraduate medical students’ perceptions on the quality of feedback received during clinical rotations. Med Teach 2014 Apr;36 Suppl 1:S17-23. DOI: https://doi.org/10.3109/0142159X.2014.886009.
    6.    Bing-You RG, Trowbridge RL. Why medical educators may be failing at feedback. JAMA 2009 Sep 23;302(12):1330-1. DOI: https://doi.org/10.1001/jama.2009.1393.
    7.    DaRosa DA, Skeff K, Friedland JA, et al. Barriers to effective teaching. Acad Med 2011 Apr;86(4):453-9. DOI: https://doi.org/10.1097/ACM.0b013e31820defbe.
    8.    Ende J. Feedback in clinical medical education. JAMA 1983 Aug 12;250(6):777-81. DOI: https://doi.org/10.1001/jama.1983.03340060055026.
    9.    Kogan JR, Conforti LN, Bernabeo EC, Durning SJ, Hauer KE, Holmboe ES. Faculty staff perceptions of feedback to residents after direct observation of clinical skills. Med Educ 2012 Feb;46(2):201-15. DOI: https://doi.org/10.1111/j.1365-2923.2011.04137.x.
    10.    Perron NJ, Sommer J, Hudelson P, et al. Clinical supervisors’ perceived needs for teaching communication skills in clinical practice. Med Teach 2009 Jul;31(7):316-22. DOI: https://doi.org/10.1080/01421590802650134.
    11.    Stewart EA, Marzio DH, Guggenheim DE, Gotto J, Veloski JJ, Kane GC. Resident scores on a patient satisfaction survey: evidence for maintenance of communication skills throughout residency. J Grad Med Educ 2011 Dec;3(4):487-9. DOI: https://doi.org/10.4300/JGME-D-11-00047.1.
    12.    Tamblyn R, Benaroya S, Snell L, McLeod P, Schnarch B, Abrahamowicz M. The feasibility and value of using patient satisfaction ratings to evaluate internal medicine residents. J Gen Intern Med 1994 Mar;9(3):146-52. DOI: https://doi.org/10.1007/BF02600030.
    13.    Hewson MG, Little ML. Giving feedback in medical education: verification of recommended techniques. J Gen Intern Med 1998 Feb;13(2):111-6. DOI: https://doi.org/10.1046/j.1525-1497.1998.00027.x.
    14.    Junod Perron N, Nendaz M, Louis-Simonet M, et al. Effectiveness of a training program in supervisors’ ability to provide feedback on residents’ communication skills. Adv Health Sci Educ Theory Pract 2013 Dec;18(5):901-15. DOI: https://doi.org/
    15.    McLean M, Cilliers F, Van Wyk JM. Faculty development: yesterday, today and tomorrow. Med Teach 2008;30(6):555-84. DOI: https://doi.org/10.1080/01421590802109834.
    16.    Puri A, Graves D, Lowenstein A, Hsu L. New faculty’s perception of faculty development initiatives at small teaching institutions. International Scholarly Research Notices [Internet] 2012 [cited 2015 Oct 29];2012:[about 13 p]. Available from: www.hindawi.com/journals/isrn/2012/726270/. DOI: https://doi.org/
    17.    Richmond M, Canavan C, Holtman MC, Katsufrakis PJ. Feasibility of implementing a standardized multisource feedback program in the graduate medical education environment. J Grad Med Educ 2011 Dec;3(4):511-6. DOI: https://doi.org/10.4300/JGME-D-10-00088.1.
    18.    Searle NS, Thibault GE, Greenberg SB. Faculty development for medical educators: current barriers and future directions. Acad Med 2011 Apr;86(4):405-6. DOI: https://doi.org/10.1097/ACM.0b013e31820dc1b3.
    19.    Steinert Y, Mann K, Centeno A, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach 2006 Sep;28(6):497-526. DOI: https://doi.org/10.1080/01421590600902976.
    20.    Steinert Y, McLeod PJ, Boillat M, Meterissian S, Elizov M, Macdonald ME. Faculty development: a ‘field of dreams’? Med Educ 2009 Jan;43(1):42-9. DOI: https://doi.org/10.1111/j.1365-2923.2008.03246.x.
    21.    Hutul OA, Carpenter RO, Tarpley JL, Lomis KD. Missed opportunities: a descriptive assessment of teaching and attitudes regarding communication skills in a surgical residency. Curr Surg 2006 Nov-Dec;63(6):401-9. DOI: https://doi.org/10.1016/j.cursur.2006.06.016.
    22.    Wood J, Collins J, Burnside ES, et al. Patient, faculty, and self-assessment of radiology resident performance: a 360-degree method of measuring professionalism and interpersonal/communication skills. Acad Radiol 2004 Aug;11(8):931-9. DOI: https://doi.org/10.1016/j.acra.2004.04.016.
    23.    Residency evaluation tools [Internet]. Alexandria, VA: Alliance for Academic Internal Medicine; c2015 [cited 2015 Sep 24]. Available from: http://connect.im.org/p/cm/ld/fid=701.
    24.    Branch WT Jr, Paranjape A. Feedback and reflection: teaching methods for clinical settings. Acad Med 2002 Dec;77(12 Pt 1):1185-8. DOI: https://doi.org/10.1097/00001888-200212000-00005.
    25.    Corrigan JM, Donaldson MS, Kohn LT, Maguire SK, Pike KC. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academies Press; 2001.
    26.    Barbosa CD, Balp MM, Kulich K, Germain N, Rofail D. A literature review to explore the link between treatment satisfaction and adherence, compliance, and persistence. Patient Prefer Adherence 2012;6:39-48. DOI: https://doi.org/10.2147/PPA.S24752.
    27.    Bartlett EE, Grayson M, Barker R, Levine DM, Golden A, Libber S. The effects of physician communications skills on patient satisfaction; recall, and adherence. J Chronic Dis 1984;37(9-10):755-64. DOI: https://doi.org/10.1016/0021-9681(84)90044-4.
    28.    Carroll JG, Monroe J. Teaching medical interviewing: a critique of educational research and practice. J Med Educ 1979 Jun;54(6):498-500. DOI: https://doi.org/10.1097/00001888-197906000-00009.
    29.    Dang BN, Westbrook RA, Black WC, Rodriguez-Barradas MC, Giordano TP. Examining the link between patient satisfaction and adherence to HIV care: a structural equation model. PLoS One 2013;8(1):e54729. DOI: https://doi.org/10.1371/journal.pone.0054729.
    30.    Dang BN, Westbrook RA, Hartman CM, Giordano TP. Retaining HIV patients in care: the role of initial patient care experiences. AIDS Behav 2016 Feb 24. Epub ahead of print.
    31.    Doyle C, Lennox L, Bell D. A systematic review of evidence on the links between patient experience and clinical safety and effectiveness. BMJ Open 2013 Jan 3;3(1):e001570. DOI: https://doi.org/10.1136/bmjopen-2012-001570.
    32.    Greenfield S, Kaplan SH, Ware JE Jr, Yano EM, Frank HJ. Patients’ participation in medical care: effects on blood sugar control and quality of life in diabetes. J Gen Intern Med 1988 Sep-Oct;3(5):448-57. DOI: https://doi.org/10.1007/BF02595921.
    33.    Hojat M, Louis DZ, Markham FW, Wender R, Rabinowitz C, Gonnella JS. Physicians’ empathy and clinical outcomes for diabetic patients. Acad Med 2011 Mar;86(3):359-64. DOI: https://doi.org/10.1097/ACM.0b013e3182086fe1.
    34.    Jha AK, Orav EJ, Zheng J, Epstein AM. Patients’ perception of hospital care in the United States. N Engl J Med 2008 Oct 30;359(18):1921-31. DOI: https://doi.org/10.1056/NEJMsa0804116.
    35.    Ratanawongsa N, Karter AJ, Parker MM, et al. Communication and medication refill adherence: the Diabetes Study of Northern California. JAMA Intern Med 2013 Feb 11;173(3):210-8. DOI: https://doi.org/10.1001/jamainternmed.2013.1216.
    36.    Roberts KJ. Physician-patient relationships, patient satisfaction, and antiretroviral medication adherence among HIV-infected adults attending a public health clinic. AIDS Patient Care STDS 2002 Jan;16(1):43-50. DOI: https://doi.org/10.1089/108729102753429398.
    37.    Schneider J, Kaplan SH, Greenfield S, Li W, Wilson IB. Better physician-patient relationships are associated with higher reported adherence to antiretroviral therapy in patients with HIV infection. J Gen Intern Med 2004 Nov;19(11):1096-103. DOI: https://doi.org/10.1111/j.1525-1497.2004.30418.x.
    38.    Sequist TD, Schneider EC, Anastario M, et al. Quality monitoring of physicians: linking patients’ experiences of care to clinical quality and outcomes. J Gen Intern Med 2008 Nov;23(11):1784-90. DOI: https://doi.org/10.1007/s11606-008-0760-4.
    39.    Zolnierek KB, Dimatteo MR. Physician communication and patient adherence to treatment: a meta-analysis. Med Care 2009 Aug;47(8):826-34.
    40.    Chatman JA. Improving interactional organizational research: a model of person-organization fit. Acad Manage Rev 1989 Jul;14(3):333-49. DOI: https://doi.org/10.2307/258171.
    41.    Dornan T. Workplace learning. Perspect Med Educ 2012 Mar;1(1):15-23.
    42.    Knowles MS. The adult learner: a neglected species. 3rd ed. Houston, TX: Gulf Publishing Co; 1989.
    43.    Norman GR. The adult learner: a mythical species. Acad Med 1999 Aug;74(8):886-9. DOI: https://doi.org/10.1097/00001888-199908000-00011.
    44.    Peters DH, Adam T, Alonge O, Agyepong IA, Tran N. Implementation research: what it is and how to do it. BMJ 2013 Nov 20;347:f6753. DOI: https://doi.org/10.1136/bmj.f6753.
    45.    Burford B, Illing J, Kergon C, Morrow G, Livingston M. User perceptions of multi-source feedback tools for junior doctors. Med Educ 2010 Feb;44(2):165-76. DOI: https://doi.org/10.1111/j.1365-2923.2009.03565.x.
    46.    Greenberg LW, Goldberg RM, Jewett LS. Teaching in the clinical setting: factors influencing residents’ perceptions, confidence and behaviour. Med Educ 1984 Sep;18(5):360-5. DOI: https://doi.org/10.1111/j.1365-2923.1984.tb01283.x.
    47.    Wilkerson L, Armstrong E, Lesky L. Faculty development for ambulatory teaching. J Gen Intern Med 1990 Jan-Feb;5(1 Suppl):S44-53. DOI: https://doi.org/10.1007/BF02600437.
    48.    Hysong SJ, Teal CR, Khan MJ, Haidet P. Improving quality of care through improved audit and feedback. Implement Sci 2012 May 18;7:45. DOI: https://doi.org/10.1186/1748-5908-7-45.
    49.    Luhanga U. Qualitative study of attendings’ and residents’ perspectives on feedback in pediatrics clinical settings [Internet: Doctoral dissertation]. Kingston, Ontario, Canada: Queen’s University; 2015 [cited 2015 Sep 24]. Available from: https://qspace.library.queensu.ca/bitstream/1974/13867/1/Luhanga_Ulemu_201512_PhD.pdf.
    50.    Carlisle A, Jacobson KL, Di Francesco L, Parker RM. Practical strategies to improve communication with patients. P T 2011 Sep;36(9):576-89.
    51.    Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods 2006 Feb;18(1):59-82. DOI: https://doi.org/10.1177/1525822X05279903.
    52.    Lambert SD, Loiselle CG. Combining individual interviews and focus groups to enhance data richness. J Adv Nurs 2008 Apr;62(2):228-37. DOI: https://doi.org/10.1111/j.1365-2648.2007.04559.x.