Using Real-world Data for Decision Support: Recommendations from a Primary Care Provider Survey


Patricia A Areán, PhD1,2; Emily C Friedman, MID, CPE2; Abhishek Pratap, PhD3; Ryan Allred, BA1; Jaden Duffy, BA1; Sara Gille, MPH4; Shelley Reetz, BS4; Erin Keast, MPH4; Gregory Clarke, PhD4

Perm J 2021;25:20.213
E-pub: 03/01/2021

Introduction: The use of data from wearable sensors, smartphones, and apps holds promise as a clinical decision-making tool in health and mental health in primary care medicine. The aim of this study was to determine provider perspectives about the utility of these data for building digitally based decision-making tools.

Methods: This mixed quantitative and qualitative cross-sectional survey of a convenience sample of primary care clinicians at Kaiser Permanente Northwest was conducted between April and July 2019 online via the University of Washington Institute of Translational Health Sciences' Research Electronic Data Capture (REDCap) platform. Study outcomes were 1) attitudes toward digital data, 2) willingness to use digital data to support clinical decision making, and 3) concerns and recommendations about implementing a digital tool for clinical decision making.

Results: This sample of 131 clinicians was largely white (n = 98), female (n = 91), and physicians (n = 86). Although respondents (75.7%, n = 87) had a positive attitude toward using digital tools in their practice, 76 respondents (67.3%) voiced concerns about a possible lack of clinical utility, difficulty integrating the tools with clinical workflows, and the potential burden placed on patients. Participants indicated that the accuracy of the data in detecting the need for treatment adjustments would need to be high and that the tool should be clinically tested.

Conclusions: Primary care providers find value in collecting real-world patient data to assist in clinical decision making, provided such information does not interfere with provider workflow or impose undue burden on patients. In addition, digital tools will need to demonstrate high accuracy, be able to integrate into current clinical workflows, and maintain the privacy and security of patients’ data.


Primary care medicine is the first stop for the assessment and treatment of mental health problems. As a result, the integration of mental health services into primary care medicine is now the subject of substantial dissemination and implementation supported through state, regional, and national efforts, including the implementation of specific billing codes for Medicare insurance recipients.1,2 A critical feature of good integrated care is continuous outcome measurement to monitor treatment response, along with access to expert opinion when patients fail to respond as anticipated. However, monitoring treatment outcomes and medication at scale is challenging. Depression outcome measurement relies on patient-reported outcomes, such as the Patient Health Questionnaire 9,3 which is based on retrospective self-report of symptoms and is collected only sporadically. Indeed, there is a marked decrease in the number of follow-up depression assessments in primary care medicine among people who screen positive for depression and receive treatment for it.4 Self-reports also are not informative about when treatment should be augmented or switched, or whether a patient needs to be seen immediately for emergency reasons. Although patients find these measures somewhat informative, they also find that the questions asked do not assess important measures of improvement, such as activity, social connectedness, and work productivity.5 Decision support and access to expert opinion on the delivery of depression care are limited, which substantially impacts the quality of care.6 This problem is recognized by many large health care systems that want to support the use of decision support tools.7,8

To mitigate this problem, recent efforts have turned to the use of Clinical Decision Support Systems (CDSS): data analytic tools embedded in electronic health records that compile patient information, then synthesize and visualize that information to support clinicians in making treatment decisions. Preliminary evidence suggests these tools can be effective in supporting integrated mental health programs.9,10 A growing interest in the informatics field is the addition of data streams from ubiquitous sensing technologies and smartphone applications (patient-completed apps) as a means of informing and improving clinical decision making, with the intention of embedding these data into CDSS.11 The scale and frequency of smartphone use create the potential to capture changes in mood, activity, and function in real time, with minimal burden to the patient. A growing number of electronic health record apps that serve as both patient- and provider-facing tools are being deployed for health-care reasons,12 but their application to mental health care is in a nascent stage. An additional challenge to the integration of these data into CDSS tools is that clinician uptake is low, resulting from the perceived burden and poor clinical utility of the information provided.13-15 According to the Agency for Healthcare Research and Quality's 2010 report on clinical decision support,16 poor uptake results from CDSS developers' lack of familiarity with clinician workflows and how clinicians make decisions, and from limited clinician involvement during CDSS design and development. For CDSS that integrate real-world data from ubiquitous technology to be usable and effective for providers delivering depression treatment in busy primary care clinics, these tools must be designed to account for clinician workflows and for meaningfulness in decision making.

The purpose of this study was to determine what primary care providers feel would help them with clinical decision making; whether real-world data from digital tools are seen as helpful in decision making, and how the data should be shared and integrated into their practice; and whether they have any concerns regarding collecting and accessing such information. Collection of such data is meant to guide the development and deployment of CDSS in the future.


Methods

Study Design

This is a mixed-methods quantitative and qualitative cross-sectional survey of a convenience sample consisting of clinicians working in primary care medicine. Clinician participants were recruited from Kaiser Permanente Northwest, an integrated delivery system, to participate in an online survey regarding attitudes toward the use of digital information for clinical decision making. Recruitment took place between April and July 2019. Every eligible provider for the selected clinic locations was sent a preparticipation incentive (a quality chocolate bar) and a study recruitment letter with instructions for how to complete the online study survey (see eAppendix 1a in the Supplement for the recruitment letter). A week later, we sent an initial follow-up email to each of the providers who had not yet completed the survey (see eAppendix 2a in the Supplement for the follow-up email). In total, we sent 373 letters to providers, 358 initial follow-up emails, and 142 subsequent follow-up emails. This study was reviewed and approved by the institutional review boards at the University of Washington and Kaiser Permanente Northwest.

Survey Procedures

The survey was administered via Research Electronic Data Capture, a secure, online database and survey platform hosted at the University of Washington Institute of Translational Health Sciences. A link to the Research Electronic Data Capture survey was distributed to Kaiser Permanente Northwest providers via letter and email by study staff at the Kaiser Permanente Center for Health Research. Upon clicking the link, participants were asked to review the study consent form and decide whether they would like to continue to the full survey. The survey questions were designed to elicit the opinions of primary care providers on using digital tools to help inform clinical decision making (see eAppendix 3a in the Supplement for the full survey, including the consent form). It included questions that asked for both multiple-choice responses (quantitative) and free-text responses (qualitative). The survey took approximately 10 minutes to complete.

Survey Development and Content

The survey items were codeveloped by the research team, and initial tests with pilot survey respondents were conducted to ascertain survey clarity, completeness, and burden. Modifications were made to the initial survey to account for survey burden and content. The aim was to create a face-valid survey that could be completed within a 10- to 15-minute timeframe.

Although the final survey was a mixed-methods survey that included some forced-choice answers to questions with an opportunity to provide further comments in an open-text field, the majority of the survey relied on the use of open-ended questions. These decisions were made to elicit the greatest range of unanticipated input without limiting or influencing the participants with predefined answers, which is particularly important when trying to understand attitudes and perceptions. Participants were asked to complete demographic questions that included gender, age, years in practice, ethnicity, and highest degree obtained. The survey probed 3 areas regarding mental health apps: general attitudes toward digital tools for clinical practice, current use of such tools, and acceptability of a future tool to track patient progress and inform clinical care. Each forced-choice question was followed by an open-ended query about the reason for their response and any concerns they had regarding the use of these tools for clinical purposes.

Data Analysis

SPSS statistical software (version 27; IBM Corp, 2020) was used to tabulate demographic information as well as the answers to the multiple-choice questions, such as behavior and attitudinal ratings. Most of the survey contained open-ended questions. To analyze the respondents' qualitative responses, the answers were imported into Miro,17 a visual collaboration platform. Using an affinity diagram method,18 the data were organized into meaningful categories based on problems, common themes, and patterns that emerged or evolved naturally from the research questions.
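As a minimal illustration of the tabulation step (the study itself used SPSS; the response values below are hypothetical, not study data), the convention of reporting percentages that exclude missing responses, used in the tables that follow, can be reproduced as:

```python
# Illustrative sketch only: hypothetical forced-choice responses, not study data.
# Missing responses (None) are dropped before percentages are computed,
# matching the "percentages exclude missing responses" table footnote.
from collections import Counter

responses = ["Positive", "Positive", "Slightly positive", None, "Neutral",
             "Positive", "Negative", None, "Slightly positive", "Positive"]

answered = [r for r in responses if r is not None]  # drop missing
counts = Counter(answered)

for choice, n in counts.most_common():
    print(f"{choice:18s} n={n}  {n / len(answered):.1%}")
```

With this toy input, "Positive" is reported as 4 of 8 answered responses (50.0%), not 4 of 10 distributed surveys.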



Results

Of the 131 clinicians who agreed to complete the survey, 70% (n = 91) were female and a majority (74.8%, n = 98) were white. A total of 65.6% (n = 86) were physicians (Doctor of Medicine) and 5.3% (n = 7) were nurse practitioners; the remainder were health-care providers from other disciplines (doctors of philosophy, licensed clinical social workers, physician assistants, medical assistants, and psychologists with a master of science degree). All providers worked in primary care medicine. The mean age was 46 years and the mean time in practice was 13.8 years. See Table 1 for participant demographics.

Table 1. Demographic characteristics of study participants

Demographic n %a
Gender
 Man 39 30.0
 Woman 91 70.0
Race or ethnicityb
 Native American or Alaska Native 0 0.0
 Asian 29 22.1
 Black or African American 1 0.8
 Hispanic/Latino 5 3.8
 Native Hawaiian or Pacific Islander 0 0.0
 Other 1 0.8
 White 98 74.8
Degree(s) heldb
 MD 86 65.6
 PhD 3 2.3
 LCSW 11 8.4
 PA 6 4.6
 MA 5 3.8
 MS 1 0.8
 NP 7 5.3
 Other (includes RN, PsyD, LPC, LMFT, DO) 13 9.9
  Mean (SD) Range
Age (y) 46 (9.0) 27–65
Years in practice 13.8 (8.6) 1–37

a. Percentages exclude missing responses.

b. Racial categories and degrees held are not mutually exclusive.

N = 131.

DO = Doctor of Osteopathy; LCSW = licensed clinical social workers; LMFT = licensed marriage and family therapist; LPC = licensed professional counselor; MA = master of arts; MD = doctor of medicine; MS = master of science; NP = nurse practitioner; PA = physician assistant; PsyD = doctor of psychology; PhD = doctor of philosophy; RN = registered nurse; SD = standard deviation.

Attitudes Toward and Use of Digital Tools for Clinical Decision Making

A majority of respondents reported positive attitudes toward the use of apps or sensors for gathering patients' real-world data in clinical practice, with 49.6% (n = 57) indicating positive attitudes and 26.1% (n = 30) indicating slightly positive attitudes. Forty-eight providers (42.5%) reported they currently use apps to help with decision making; 67 respondents (58.3%) reported they encouraged patients to use mental health apps to augment treatment. Respondents indicated that the apps they commonly used were tools for clinical support and risk assessment. Although respondents found these existing tools helpful in enhancing and expediting clinical decision making, they also found them time-consuming to use and incompatible with the technology systems they currently use (eg, the integrated electronic health record system). Apps that respondents tended to recommend to patients were mood trackers, meditation apps, and Kaiser Permanente's health tools. Respondents indicated that these tools were useful for patient support and education, but were concerned that the technology felt too robotic and impersonal and that the information on these patient-facing apps may not be accurate or evidence based. See Table 2 for the main quantitative survey results.

Table 2. Main quantitative survey results

Survey questions n %
In general, would you describe your attitude toward using digital tools, such as an app or a monitor, in your practice as
 Negative 3 2.6
 Slightly negative 5 4.3
 Neutral 20 17.4
 Slightly positive 30 26.1
 Positive 57 49.6
  Yes (n) Yes (%)
Do you currently use any apps to help with your clinical decision making? 48 42.5
Do you encourage your patients to use any particular apps that are available now? 67 58.3
Would you encourage patients to use an app that tracked their progress and helped you stay informed about their treatment? 96 84.2
Would you have any concerns about such an app? 76 67.3
If an app existed that could collect information on a moment-by-moment basis about your patients’ state of mind, would you think this was useful? 47 42.7
If an app could tell you accurately that one of your patients will need a change in treatment, would you want that information? 101 90.2
  n %
If you could receive information from a patient-used app that could augment your clinical decision making, how/where would you want that information delivered?
 Clinical notes 80 61.1
 Epic staff message 41 31.3
 Outlook email message 4 3.1
 Other 9 6.9

N = 131.

Acceptability of an Outcome Tracking App for Clinician Decision Making

Ninety-six respondents (84.2%) reported they would encourage patients to use a future app that could monitor treatment outcomes and could be shared with the provider for clinical decision making. Qualitative data indicated that, in addition to the information they already collect at each visit, such as symptom rating scales and identification of stressors, useful information from digital tools would be any data related to wellness, such as physical and social activity, sleep, symptom tracking, nutrition, heart rate, blood pressure, and any recent external life stressors. As one provider said, “[I would like to have information about] self-care behaviors as they truly are (social activity, exercise, sleep habits, alcohol intake, diet), but in a quick, usable format.”

Respondents were less likely to use a digital tool that tracked moment-to-moment information about patient state of mind (42.7%, n = 47), but were more likely to use a tool that could accurately inform them if any of their patients needed a change in treatment (90.2%, n = 101). Eighty respondents (61.1%) reported they would prefer to have this information integrated with their electronic clinical notes; 41 respondents (31.3%) preferred the information be provided in their electronic health record messaging platform. One provider explained, “. . . as long as the patient is comfortable, and it doesn’t add too much to my workload. In Primary Care we have so many streams of information and the inbox management with electronic messages and results has become overwhelming. More data isn’t necessarily better.” See Table 2 for the main quantitative survey results.


Concerns

Qualitative analysis surfaced 3 areas of major concern about information from patient-facing apps and wearable sensors. First, respondents reported that their patients may be overwhelmed by too much information and may have trouble interpreting the data from digital health tools. Examples of such comments include the following: “Sometimes too much information is more anxiety provoking” and “Information must be clear & easily presented so patient doesn't need my interpretation to benefit.” Second, data concerns were also prominently reported, particularly with regard to data security, privacy, and accuracy. One respondent commented, “Accuracy of data, plus amount of data would have to be summarized well. I don't want to be scrolling thru days of info.” Finally, respondents were concerned about the impact such information would have on their workflow and time, and about potential responsibility and liability if providers were not able to respond right away to a high-risk situation (eg, suicidal ideation). As one provider said, “As long as the patient is comfortable and it doesn't add too much to my workload. We have so many streams of information and inbox management has become overwhelming. . . . What would happen if there were concerns such as [suicidal ideation] and I am not available to respond to them. What if nobody responds to them?” See eTable 1a in the Supplement for additional participant quotes that illustrate these areas of concern.


Recommendations

Respondents offered recommendations in 2 major areas: the evidence providers need to have before using a clinical decision support tool and how data collected from a clinical decision support tool should be integrated into their practice. The 3 most prominent themes that respondents felt would need to be addressed before the future use of real-world data in a clinical decision support tool were 1) the accuracy of the data in detecting the need for treatment adjustments, 2) the requirement that the tool be clinically tested and vetted, and 3) the need for the tool to be evidence based (see eTable 2a in the Supplement for participant quotes that illustrate these recommendations). For the information to be readily useful to providers, respondents indicated that the information in clinical notes or messaging services and alerts should be based on an assessment of patients' needs and current status.


Discussion

Real-time acquisition of information from smart devices has the potential to transform how quickly clinicians can intervene in treatment and may enhance clinical outcomes. According to the results of our survey, many providers currently use clinical decision support apps to inform practice, and encourage their patients to use self-guided mental health tools to support and augment their care. We also found considerable interest in, and perceived utility of, real-world data for clinical decision making, but before these tools can be adopted into practice, a number of concerns should be addressed, and the recommendations of providers put into action.

First and foremost, the accuracy of high-frequency, real-world data to inform clinical decisions would need to be demonstrated before providers would act on recommendations from a digital health app or a CDSS using such data. This concern has been voiced in other studies of patient opinion regarding such tools19,20 and by critics in the digital mental health field.21,22 Indeed, many existing decision support tools that rely on data from sensors and apps have not been properly vetted for accuracy.23 Only sustained support for research to develop and test the accuracy of these tools will mitigate provider and patient concerns around accuracy, and, importantly, this research must follow open science policies so that consumers can trust the results,23 rather than the findings being held as proprietary information.

A second and related concern was the potential liability providers could face if they were not able to provide timely care. Providers were less interested in moment-to-moment changes in mood and activity; instead, they preferred an overall general well-being score beyond what self-report measures would provide. They were particularly concerned with having access to alerts about adverse events, such as suicide risk. Providers indicated that the current system of care in primary care medicine was not nimble enough to respond to emergencies such as suicide. This particular feature, the potential for these tools to detect proximal risk of adverse events, presents two important challenges in creating alert tools. The first, as was indicated in this survey, is the ability of health-care systems, such as primary care, to respond appropriately to these emergencies. For these tools to be useful, systems will need to provide resources that allow providers to respond in a timely manner. The second challenge is accuracy, reflected to some degree in the first concern. If a tool is too sensitive to emergencies, such that there are too many false-positive alerts, providers and health-care systems will expend limited resources unnecessarily. If the tool is insensitive, with too many false negatives, providers and health-care systems will be less likely to use this feature because of the dangers associated with false-negative reports. Given the inability of any past effort24 to identify the proximal risk of suicide, this may be a feature that should be omitted from the design of any CDSS or mental health app until better detection and follow-up systems are available. In sum, providers were more interested in information that could characterize overall response to treatment than in detailed functioning or emergency alerts.
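The false-positive problem providers anticipated follows directly from base rates: even an alert with high sensitivity and specificity produces mostly false alarms when the event it detects is rare. The sketch below illustrates this with hypothetical numbers (the sensitivity, specificity, and prevalence values are assumptions for illustration, not figures from this study):

```python
# Illustrative only: hypothetical numbers, not data from this study.
# Shows why a high-sensitivity alert can still yield mostly false positives
# when the flagged event (e.g., an acute crisis) has a low base rate.

def alert_ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value of an alert, via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Assumed values: 90% sensitive, 95% specific, 1% of patients in crisis.
ppv = alert_ppv(sensitivity=0.90, specificity=0.95, prevalence=0.01)
print(f"PPV = {ppv:.1%}")  # about 15%: most alerts would be false positives
```

Under these assumed values, roughly 85% of alerts would be false positives, which is precisely the resource drain respondents worried about.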

Third, and as important a concern as the first, is the issue of secure transfer of data from apps to CDSS in health records. Other studies have found that patients are very concerned about who would have access to such data and whether highly personal information could be shared accidentally with organizations that would use the data for other purposes.25 Data security and sharing problems have been elucidated in other studies,26,27 indicating that these concerns are well founded. Better methods to protect sensitive data are needed before tools of this nature are implemented.

Finally, providers do not want any tool to place additional burdens on the patient in terms of information overload or interference with daily activities, nor do providers want a tool that interferes with their workflow. These concerns have also been identified as potential reasons for why mental health apps have such poor uptake by patients, because of their poor design and limited accounting for patient burden.28

This study has some limitations that require mention. First, this is a convenience sample survey of providers in a group health model and, as such, the opinions are limited to people in these practices. In addition, these providers are potentially more interested in the use of technology than typical primary care providers. Kaiser Permanente has adopted a number of digital tools for health care, and thus these respondents may have a more positive attitude toward such tools and devices. Second, although we were able to acquire additional qualitative opinions, we were not able to have respondents interact with any tools, which would have provided more nuanced insight into such tools. Future research will need to employ user-centered design methods to develop any end product integrating real-world data in CDSS.


Conclusion

The major takeaway from this provider survey on the use of data from personal digital technology to inform clinical decision making is that primary care providers who manage mental health problems are open to the use of such data, provided certain caveats are addressed. In particular, ease of use, data security, data accuracy, and well-synthesized data are all important characteristics that need to be considered when developing these tools. We recommend that more research be conducted to validate the signal from these tools, and that providers and patients be included in designing the information to be collected and deciding how it is compiled and used in care.

Supplemental Material

aSupplemental Material is available at:

Disclosure Statement

The author(s) have no conflicts of interest to disclose.


Acknowledgment

Kathleen Louden, ELS, of Louden Health Communications performed a primary copy edit.

Author Affiliations

1Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA

2ALACRITY Center, University of Washington, Seattle, WA

3Sage Bionetworks, Seattle, WA

4Kaiser Permanente Center for Health Research, Portland, OR

Corresponding Author

Patricia A Areán, PhD (

Author Contributions

Emily Friedman, MID, CPE; Abhishek Pratap, PhD; and Patricia Areán, PhD, developed the survey and survey methods for this study. This team also analyzed the data. Gregory Clarke, PhD; Sara Gille, MPH; Shelley Reetz, BS; and Erin Keast, MPH, were responsible for participant recruitment. All parties contributed to writing the manuscript.


Funding

This project was supported by award nos. UL1 TR002319, KL2 TR002317, and TL1 TR002318 from the National Center for Advancing Translational Sciences/National Institutes of Health. Funding was also provided by a grant from the Kaiser Permanente Center for Health Research to Dr Clarke.


References

1. World Health Organization Regional Office for Europe. Integrated care models: An overview working document. Copenhagen, Denmark. 2016. August 2020.

2. Centers for Medicare & Medicaid Services. Behavioral Health Integration Services booklet 2018. August 2020.

3. Siu AL, Bibbins-Domingo K, Grossman DC, et al Screening for depression in adults. J Am Med Assoc 2016 Jan;315(4):380–7. DOI:

4. Schaeffer AM, Jolles D. Not missing the opportunity: Improving depression screening and follow-up in a multicultural community. Joint Comm J Qual Patient Saf 2019 Jan;45(1):31–9. DOI:, PMID:30139563.

5. Malpass A, Shaw A, Kessler D, Sharp D. Concordance between PHQ-9 scores and patients’ experiences of depression: A mixed methods study. Br J Gen Pract 2010 Jun;60(575):e231. DOI:, PMID:20529486.

6. Mancini AD, Moser LL, Whitley R, et al Assertive community treatment: Facilitators and barriers to implementation in routine mental health settings. Psychiatr Serv 2009 Feb;60(2):189–95. DOI:, PMID:19176412.

7. Ranmuthugala G, Plumb JJ, Cunningham FC, Georgiou A, Westbrook JI, Braithwaite J. How and why are communities of practice established in the healthcare sector? A systematic review of the literature. BMC Health Serv Res 2011 Oct;11, 273. DOI:, PMID:21999305.

8. Ranmuthugala G, Cunningham FC, Plumb JJ, et al A realist evaluation of the role of communities of practice in changing healthcare practice. Implement Sci 2011 May;6(1):49. DOI:, PMID:21600057.

9. Brown GS, Simon A, Cameron J, Minami T. A Collaborative Outcome Resource Network (ACORN): Tools for increasing the value of psychotherapy. Psychotherapy 2015 Dec;52(4):412–21. DOI:, PMID:26641371.

10. Lutz W, de Jong K, Rubel J. Patient-focused and feedback research in psychotherapy: Where are we and where do we want to go? Psychother Res 2015 Sep;25(6):625–32. DOI:, PMID:26376225.

11. Ng A, Kornfield R, Schueller SM, Zalta AK, Brennan M, Reddy M. Provider perspectives on integrating sensor-captured patient-generated data in mental health care. In: Proceedings of the ACM on Human-Computer Interaction. Association for Computing Machinery: 2019 Nov; Glasgow, Scotland. Article number 115. p 25.

12. Pusic M, Ansermino JM. Clinical decision support systems. British Columbia Med J 2004;46(5):236–9.

13. Lurio J, Morrison FP, Pichardo M, et al Using electronic health record alerts to provide public health situational awareness to clinicians. J Am Med Inf Assoc 2010 Mar–Apr;17(2):217–9. DOI:, PMID:20190067.

14. Singh H, Spitzmueller C, Petersen NJ, et al Primary care practitioners’ views on test result management in EHR-enabled health systems: A national survey. J Am Med Inf Assoc 2013 Jul;20(4):727–35. DOI:

15. Singh H, Spitzmueller C, Petersen NJ, Sawhney MK, Sittig DF. Information overload and missed test results in electronic health record-based settings. JAMA Intern Med 2013;173(8):702–4. DOI:

16. Eichner J, Das M. Challenges and barriers to clinical decision support (CDS) design and implementation experienced in the Agency for Healthcare Research and Quality CDS demonstrations. 2010. August 2020

17. Miro Software v1.0 (1995). Miro. San Francisco, CA; 2019.

18. Harboe G, Huang EM. Real-world affinity diagramming practices: Bridging the paper-digital gap. In: Conference on human factors in computing systems: Proceedings. Association for Computing Machinery: 2015; p 95–104.

19. Pifer R. Patient use of digital health tools lags behind hype, poll finds. Healthcare Dive. 2019. August 2020.

20. Renn BN, Hoeft TJ, Lee HS, Bauer AM, Areán PA. Preference for in-person psychotherapy versus digital psychotherapy options for depression: Survey of adults in the U.S. NPJ Digit Med 2019 Dec;2(1). DOI:

21. Hatch A, Hoffman JE, Ross R, Docherty JP. Expert consensus survey on digital health tools for patients with serious mental illness: Optimizing for user characteristics and user support. JMIR Ment Health 2018 Jun;5(2):e46. DOI:

22. Minen MT, Stieglitz EJ, Sciortino R, Torous J. Privacy issues in smartphone applications: An analysis of headache/migraine applications. Headache 2018 Jul;58(7):1014–27. DOI:, PMID:29974470.

23. Torous J, Andersson G, Bertagnoli A, et al Towards a consensus around standards for smartphone apps and digital mental health. World Psychiatry 2019 Feb;18:97–8. DOI:

24. Franklin JC, Ribeiro JD, Fox KR, et al Risk factors for suicidal thoughts and behaviors: A meta-analysis of 50 years of research. Psychol Bull 2017 Feb;143(2):187–232. DOI:, PMID:27841450.

25. Pratap A, Allred R, Duffy J, et al Contemporary views of research participant willingness to participate and share digital data in biomedical research. JAMA Netw Open 2019 Nov;2(11):e1915717. DOI:, PMID:31747031.

26. Huckvale K, Torous J, Larsen ME. Assessment of the data sharing and privacy practices of smartphone apps for depression and smoking cessation. JAMA Netw Open 2019 Apr 5;2(4):e192542. DOI:, PMID:31002321.

27. Rosenfeld L, Torous J, Vahia IV. Data security and privacy in apps for dementia: An analysis of existing privacy policies. Am J Geriatr Psychiatry 2017 Aug;25(8):873–7. DOI:, PMID:28645535.

28. Ledel Solem IK, Varsi C, Eide H, et al A user-centered approach to an evidence-based electronic health pain management intervention for people with chronic pain: Design and development of EPIO. J Med Internet Res. 2020 Jan;22(1):e15889. DOI:, PMID:31961331.

Keywords: decision support, digital health, mental health, primary care, real-world data, return of information

