ABSTRACT

Objective. Computerized decision-support systems (CDSSs) can offer clinical guidance, as well as promote doctor–patient collaboration and patient self-care. As such, they have great potential for improving chronic pain management, particularly in the primary care setting, where physicians often lack sufficient pain-specific clinical expertise and communication skills. The objective of this study was to examine the use of CDSSs in chronic pain management, and to review the evidence for their feasibility and effectiveness.

Design. A review of the available literature using search terms associated with computerized decision support and chronic pain management. Major databases searched included MEDLINE, CINAHL, PsycINFO, HealthSTAR, EMBASE, the Cochrane Library, Computer and Information Systems Abstracts, and Electronics and Communications Abstracts. Descriptive and evaluative studies were included.

Results. Nine studies describing eight CDSSs met study inclusion criteria. With two exceptions, the CDSSs were specific to one or more pain-related conditions. All were designed to assist clinicians in the medical management of pain. Aside from pain status, input specifications differed markedly. Evaluative studies were exclusively feasibility studies and varied widely in design and level of description. All were nonexperimental, and most were methodologically weak. Two primary care studies were reported. Patient and clinician acceptability ratings of CDSSs ranged from moderate to high. Due to insufficient data, definitive conclusions concerning the impact of CDSSs on provider performance and patient outcomes were not possible.

Conclusion. Research on CDSSs in chronic pain management is limited. The effects of CDSSs on provider and patient outcomes remain understudied, and their potential to improve doctor–patient collaboration and self-care largely untested.

Introduction

An estimated 9% of adults in America suffer from chronic pain and its sequelae, and over half of these individuals seek treatment for this condition from a primary care physician (PCP) [1]. In recognition of this fact, recent American Pain Society guidelines advocate that PCPs should “… participate in the process of screening, diagnosis, and long-term follow-up treatment of patients who suffer from chronic pain [j]ust as PCPs diagnose and maintain patients with other chronic diseases …”[2]. To accomplish this goal, however, PCPs must be equipped not only with the necessary clinical tools and expertise, but also with the communication and related interpersonal skills to build and sustain a strong alliance with their patients. Collaborative management that strengthens and supports self-care is recognized as the most appropriate and cost-effective way to treat chronic pain and is, moreover, the approach most often preferred by both patients and health care providers [2–6].

In reality, however, deficits in physicians' training and knowledge regarding pain management, coupled with time constraints during the primary care visit, frequently prevent PCPs from meeting these dual clinical and psychosocial expectations [5,7–15]. PCPs commonly focus more on technical aspects of care when treating chronic pain patients, and less on promoting patient self-management behaviors [7]. In addition, doctor–patient communication regarding pain is frequently inadequate [16–19]. Consequently, physicians' perceptions of patient pain are often incongruent with patients' self-ratings, treatment goals are frequently developed without patient input, and patient adherence to treatment plans is less than optimal [20–25].

Faced with declining reimbursement rates from both public and private payers, physicians need an alternative that enables them to reconcile the twin imperatives of providing high-quality pain care and maximizing efficiency in clinical practice. Computerized decision-support systems (CDSSs) offer one potential solution to this dilemma. CDSSs are information systems designed to enhance the quality of clinical decision making and to minimize deviations in clinical performance from accepted professional standards [26]. Individual patient data are entered into the system, whereupon predetermined algorithms, guided by a resident logic library of expert-based clinical data, generate patient-specific recommendations [26]. Data can be collected via a personal computer or a handheld device, such as a Palm Pilot. Typically, output is available in real time for use during the medical visit.

The process involved in using a CDSS based on personal computer technology is illustrated as follows. Upon arrival at a medical appointment, a patient is directed to a computer workstation equipped with a laptop, stylus, and printer. Following instructions on the computer screen, the patient completes an electronic questionnaire containing the following types of items: current pain status (e.g., intensity, location, duration), past and present pain treatment regimens, degree of adherence to prescribed pain medication, side effects of current medication, risk factors for opioid abuse, impact of pain on respondents' quality of life (QOL), psychological status, pain-related coping and self-management strategies, type and degree of social support, personal goals for daily living, and (for initial visit only) demographic information. The data collected are automatically stored electronically and then run through a series of algorithms. In the final step, a report is generated (electronically, in hard-copy format, or both) that contains a synopsis of the patient's pain status, recommendations as to possible improvements in the current pain treatment regimen, key issues to discuss with the patient (e.g., problems with medication adherence or side effects, signs of depression), and a list of patient-targeted suggestions designed to encourage pain self-management (e.g., recommendations to pursue a specific exercise regimen).
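The workflow described above (patient-entered questionnaire data run through predetermined rules to produce clinician- and patient-directed output) can be sketched in miniature. The following toy Python function is purely illustrative: the field names, thresholds, and suggestions are invented for this sketch and do not come from any system reviewed here.

```python
# Illustrative sketch only: a toy rule-based pass over questionnaire data,
# loosely modeled on the generic CDSS workflow described above. All field
# names and thresholds are hypothetical, not taken from any reviewed system.

def generate_report(responses):
    """Apply simple if-then rules to a patient's questionnaire responses
    and return separate clinician- and patient-directed suggestions."""
    clinician_notes, patient_tips = [], []

    # Rule 1: severe self-reported pain flags a regimen review for the clinician
    if responses.get("pain_intensity", 0) >= 7:
        clinician_notes.append("Severe pain reported; reassess analgesic regimen.")

    # Rule 2: nonadherence becomes a key issue to discuss during the visit
    if not responses.get("medication_adherent", True):
        clinician_notes.append("Discuss barriers to medication adherence.")

    # Rule 3: a positive depression screen prompts a referral suggestion
    if responses.get("depression_screen_positive", False):
        clinician_notes.append("Positive depression screen; consider referral.")

    # Rule 4: low activity generates a patient-targeted self-management tip
    if responses.get("exercise_days_per_week", 0) < 2:
        patient_tips.append("Consider a gentle, regular exercise program.")

    return {"clinician": clinician_notes, "patient": patient_tips}

# Example run with invented responses
report = generate_report({
    "pain_intensity": 8,
    "medication_adherent": False,
    "exercise_days_per_week": 0,
})
```

A production CDSS would, of course, derive its rules from a validated, expert-based clinical knowledge base rather than hard-coded thresholds; the sketch shows only the general input-rules-output shape of such a system.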

Both the type and application of CDSSs have been expanding rapidly in recent years. To date, CDSSs have been used to diagnose a variety of medical conditions, to enhance the provision of preventive care (e.g., cancer screening, vaccination), to facilitate disease management, and to assist with drug prescribing [27–32]. Evidence suggests that CDSSs can improve health care practitioner performance and patient outcomes, particularly in the areas of vaccinations, tetanus immunizations, breast cancer screening, colorectal cancer screening, cardiovascular risk reduction, and smoking cessation [26,33–36].

CDSSs can be designed not only to provide clinical guidance, but also to capture and integrate patients' perspectives on their illness and to promote positive patient health-related behaviors. Sometimes referred to as “expert systems,” this variant of a CDSS is particularly promising for use in the clinical management of chronic pain for two reasons: (i) pain is a highly subjective experience, hence patient-reported data are essential to obtain, and (ii) successful outcomes of pain management depend at least as much, if not more, on appropriate self-care (e.g., self-monitoring of pain, adherence to prescribed medications, regular exercise, weight control) as on the quality of the clinical diagnosis or recommended therapeutic regimen [5]. Based on the data entered, the expert system produces tailored recommendations, often derived from behavioral change models, that are specific to the needs of the individual patient [37]. Recommendations are directed at either the clinician or the patient, or separate output can be generated for each party. To date, expert systems have been used widely in the arena of health behavior modification, such as smoking cessation and weight reduction [38–43].

The purpose of this study was to systematically review the research evidence for CDSSs to address the following questions: (i) To what extent have CDSSs been utilized in the context of chronic pain management? (ii) What are the characteristics of these systems? and (iii) To what degree have they been evaluated and in what types of clinical settings?

Methods

Data Sources

We conducted an automated literature search using the Ovid search engine. With the assistance of a research librarian, we searched the following databases: MEDLINE (1966 to April 2006), CINAHL (1982 to April 2006), PsycINFO (1967 to April 2006), HealthSTAR (1981 to April 2006), EMBASE, the Cochrane Library, Computer and Information Systems Abstracts, Electronics and Communications Abstracts, ProQuest Digital Dissertations, Computer Retrieval of Information on Scientific Projects (CRISP), LISA, ERIC, and Dissertation Abstracts. Key search terms included computer-generated decision support systems and expert systems. Additional terms included: chronic pain, primary care, tailored reports, personalized computer-based information, disease management for chronic pain, patient goals, pain diagnosis and management, decision support systems, neural networks, and fuzzy logic. We also conducted a manual search to supplement the automated search. The manual search was not limited by time period and included articles that had been referenced in other articles.

Study Selection Criteria

Studies were eligible for inclusion if they described the development and/or application of a CDSS or expert system in the context of chronic pain management. We defined a CDSS as any electronic system designed to assist in clinical decision making regarding chronic pain management, and in which patient-specific assessments and recommendations were generated for use by a clinician and/or patient [44]. Consistent with this definition, we excluded any CDSSs that examined pain as only one component of an overall assessment of QOL, as well as any that addressed acute pain management only. We also excluded studies written in languages other than English.

Results

The cross-database search yielded 70 published articles and three federally funded research studies describing ongoing investigations. No additional studies were identified via the manual search process. Full-text articles were retrieved for all titles the authors considered potentially relevant. Nine studies describing eight discrete CDSSs met our inclusion criteria. Because the final set of studies was too heterogeneous to permit a quantitative meta-analysis, we conducted a descriptive review only.

As shown in Table 1, all eight CDSSs were designed to assist in the diagnosis and/or management of chronic pain. Two of these systems, the Pain Management Advisor (PMA) and the Diagnostic Headache Diary, were also designed to offer education to the health care provider. One, the PMA, had an interactive capability that permitted users to query the system for explanations, therapeutic rationale, and therapy guidelines. Two other systems, the SymptomReport and the PAINReportIt, featured adjunctive software programs (SymptomConsult, PAINConsultN) that were expressly designed as interactive educational tools for the patient.

Table 1

Key features of clinical decision-support systems (CDSSs) developed for chronic pain management

RHINOS [45]
Purpose: To assist physicians in the diagnosis of chronic headaches or facial pain
Description: Based on expert knowledge of headache and facial pain specialists. Uses 4 sets of conditional, probability-based rules: 1) exclusive rules (i.e., if patient has disease D, s/he must have symptoms S1, S2, … Sn); 2) inclusive rules; 3) associate rules; and 4) disease image rules
Stage of Development: Prototype only; no testing
Hardware & Software Requirements: System developed using the Prolog-KABA programming language. Runs on personal computers with CPU 8086, memory 382 Kbytes
Data Input(s): Physician input based on patient interview: patient demographics; onset of headache; history since onset; pain characteristics; neurological signs associated with pain; sleep status; personal and familial history of headache; jolt headache (yes/no); sclerosis of retinal artery
Output(s): Initial output after first set of screening questions: list of diseases not rejected. Output after additional question(s): list of possible diseases; diagnostic conclusions; explanation of the disease, required examinations, and suggested therapy. Hospital chart can be printed from the input data in the form of a natural language representation
Target Recipient(s) of Output: Physicians

IVAN [46]
Purpose: To provide recommendations for controlling pain and providing symptom relief in cancer, osteo-, and rheumatoid arthritis
Description: Case-based reasoning strategy to record and retrieve information stored in an internal knowledge base
Stage of Development: Prototype only; no testing
Hardware & Software Requirements: IVAN software written in LPA Prolog programming language for Windows (“WinProlog”); runs on PC Windows 95/NT
Data Input(s): Pain symptom checklist; current pain diagnosis; current pain treatment regimen
Output(s): Computer screen display: diagnostic confirmation; description of symptoms that may appear later; treatments proven successful in similar or related cases; possible alternative causes of the pain
Target Recipient(s) of Output: Physicians and patients

The Diagnostic Headache Diary [47]
Purpose: To educate and provide diagnostic support to primary care providers in order to improve management of headaches
Description: Rule-based expert system using Boolean logic. A set of diagnostic rules is used to determine a diagnosis based upon the data entered in a patient diary. Diary data are transformed into a diagnosis following the International Headache Society's classification
Stage of Development: Prototype developed; system-generated diagnoses were validated against physician expert-generated diagnoses
Hardware & Software Requirements: Stand-alone Windows 95 program written in the Delphi programming language
Data Input(s): Patient data; headache diary entries; medications used to alleviate headache
Output(s): Diagnosis of headache type
Target Recipient(s) of Output: PCPs

Pain Management Advisor (PMA) (NovaIntelligence Inc., San Diego, CA) [48]
Purpose: To enhance primary care providers' (PCPs) management of chronic pain
Description: Relies on rule-based algorithms derived from expert knowledge of pain specialists; user is asked a series of questions to refine the diagnosis and determine appropriate therapy; interactive capability (e.g., for explanations, therapeutic rationales, therapy guidelines)
Stage of Development: Working version developed; some field testing conducted
Hardware & Software Requirements: Pentium-based PCs; Windows 95; PMA written in Microsoft Visual Basic, v. 5.0, run as an expert application in XpertRule; algorithms stored in Microsoft Access database; Microsoft Help Utility used for explanations and queries
Data Input(s): Patient demographics; diagnosis; pain characteristics; laboratory tests and imaging studies; current medications; prior therapies; concurrent disease conditions; allergies; psychological status
Output(s): A prioritized list of recommendations: 1) medical management (pharmacological and nonpharmacological management, physical and psychosocial modalities); 2) invasive procedures; 3) referrals
Target Recipient(s) of Output: PCPs

SymptomReport and SymptomConsult [49]
Purpose: To assist clinicians in assessing cancer-related chronic pain and fatigue, and to clarify patients' misbeliefs about pain assessment and management
Description: Not described
Stage of Development: Working version developed; field testing conducted
Hardware & Software Requirements: Microsoft Windows 95/98 personal computers with touch screen (Pen-Tab)
Data Input(s): Patient self-reported input: patient demographics; 1970 version of the McGill Pain Questionnaire (MPQ); Barriers Questionnaire (BQ); Schwartz Cancer Fatigue Scale (SCFS-6)
Output(s): 1) Hard copies of expert system report given to patient and to clinician; 2) patient receives educational materials on how to report pain, use pain medications safely, and manage fatigue. Materials are customized to the patient's needs and presented in an interactive, multimedia format. Patients have the option to read or listen to information on the computer, print any or all of the materials, or both
Target Recipient(s) of Output: Oncology nurses; other clinicians

Decision-support computer program for cancer pain management [50]
Purpose: To improve oncology nurses' decision making related to cancer pain management among culturally diverse female oncology patients
Description: Survey data on multicultural cancer pain characteristics were analyzed using fuzzy inference logic to develop 4 modules: 1) a generic knowledge base; 2) a culture-specific knowledge base; 3) decision making; and 4) self-adaptation. The decision-making module consists of 2 sets of fuzzy inference logic developed via a genetic algorithm
Stage of Development: Not described
Hardware & Software Requirements: Hardware not described; adaptive fuzzy logic software used to develop and run the knowledge base generation, decision-making, and self-adaptation modules
Data Input(s): Nurse-entered data based on patient interview: patient demographics; pain characteristics
Output(s): Computer screen display of analgesic treatment recommendations based on the World Health Organization (WHO) analgesic ladder
Target Recipient(s) of Output: Oncology nurses

PAINReportIt and PAINConsultN [51,52]
Purpose: To assist clinicians in assessing chronic pain and to educate patients regarding pain monitoring and management
Description: Not described
Stage of Development: Working version developed; field testing conducted
Hardware & Software Requirements: Microsoft Windows 95/98 personal computers with touch screen (Pen-Tab); data stored in Access 97 database
Data Input(s): Patient self-reported input: patient demographics; McGill Pain Questionnaire; pain status; patient goals for and expectations about pain; type and effectiveness of previous pain treatments
Output(s): Hard copies of expert system report given to patient and to clinician; on-screen viewing of report is also possible
Target Recipient(s) of Output: Oncology nurses and other clinicians

Touch-screen Computer Assessment of Chronic Low Back Pain [53]
Purpose: To collect pain symptom status and other health information from patients with low back pain
Description: Not described
Stage of Development: Working version developed; limited field testing
Hardware & Software Requirements: Web-based system using a Dell Inspiron 1100 laptop with Microsoft XP operating system; 14″ touch screen (Magic Touch, Keytec)
Data Input(s): Patient demographics; Oswestry Low Back Pain Disability Index (Version 2); Beck Depression Inventory; MOS Short Form-36 (MOS SF-36)
Output(s): Not described
Target Recipient(s) of Output: Physicians

All but two of the CDSSs were designed to target a specific type of chronic pain or pain-related condition: headache (2), low back pain (1), and cancer-related pain (3). Input specifications also varied widely, both in the type and amount of data required and in the party responsible for data entry (i.e., physician, other health care provider, or patient). All of the CDSSs required detailed input regarding pain symptomatology. Half of the systems reviewed also elicited data on pain medications currently used, and three requested QOL information. Other data, such as psychological status, history of prior therapies (both pharmaceutical and nondrug therapies), patient goals for pain care, and barriers to pain management, were specified as required inputs in only two of the CDSSs reviewed. Three of the CDSSs used standard, psychometrically validated instruments (e.g., McGill Pain Questionnaire, Medical Outcomes Study Short Form 36, Oswestry Low Back Pain Disability Index) to collect input data.

The majority of the CDSSs (5 of 8) produced output that was intended for clinician use (e.g., physician or nurse) only. Three targeted both the clinician and the patient. Output varied considerably in terms of content, format, and delivery (e.g., electronic, paper, or both). Several CDSSs scored and summated patient responses on standard pain and QOL-related assessment measures. Based on the published descriptions, at least five of the CDSSs were designed to generate output in real time at the patient's medical visit.

In terms of systems architecture, all CDSSs reviewed were stand-alone, personal computer-based systems. None interfaced with existing electronic medical records systems, pharmacy, appointment scheduling, or laboratory results reporting. Three either were Web-based or had the capacity to use a Web-based platform.

Table 2 summarizes the types of studies conducted to date to evaluate chronic pain CDSSs. Of the eight CDSSs identified, five had published evaluation results. With one exception, all were exclusively feasibility studies. Studies were conducted in both inpatient and outpatient settings. Among the outpatient studies, two had been conducted with PCPs; the remainder involved specialists in tertiary care settings. Study designs varied: two were cross-sectional, two involved immediate pre- and post-assessments of CDSS use, one was longitudinal (12-month follow-up), and one was a focus group. Study sample sizes ranged from 4 to 213, with the majority having 50 or fewer subjects.

Table 2

Evaluation of computerized decision-support systems (CDSSs) in chronic pain management

Nielsen et al. [47]
Name of CDSS: Diagnostic Headache Diary
Sample: PCPs
Design: Not described
Outcomes Assessed: % agreement of computer-generated vs expert physician diagnoses
Results: 100% agreement of computer-generated vs expert physician diagnoses

Knab et al. [48]
Name of CDSS: Pain Management Advisor
Sample: N = 50 PCPs; N = 50 chronic pain patients
Design: Longitudinal, with 12-month PCP follow-up
Outcomes Assessed: Ease of use; medical appropriateness of recommendations; % of physicians adopting Pain Management Advisor treatment recommendations; % of patients referred to pain specialty clinic
Results: Physician adoption of system-generated recommendations: 85% of cases. Average physician time spent per case: 4.9 minutes (SD ± 3.4). Ease of use as rated by physicians: 4.2 (± 2.8 cm) on scale of 1–10. 25% of patients referred to pain specialty clinic; average time to pain specialty referral: 3.7 months (SD ± 0.6). 70% of nonreferred patients still receiving computer-recommended treatment 1 year later

Wilkie et al. [49]
Name of CDSS: SymptomReport
Sample: N = 41 outpatients with cancer
Design: Cross-sectional; telephone interviews
Outcomes Assessed: 13-item patient acceptability scale assessing ease of use of SymptomReport; input completion time; qualitative assessment of degree of communication with health care providers regarding pain and other symptoms
Results: Mean time to complete SymptomReport: 38.2 minutes (SD ± 20.2). Mean time to complete SymptomConsult: 20.9 minutes (SD ± 18.6). 71% of participants rated SymptomReport as easy, enjoyable, and informative. 68% reported that the amount and content of their pain-related communication with their doctor had not changed much. Qualitative patient comments: 1) helped them talk more explicitly about pain; 2) gave them greater awareness of pain symptoms; 3) increased understanding of and enhanced compliance regarding symptom management. Patient comments on SymptomConsult: 1) contained too much information; 2) not targeted to individual needs; 3) provided no new information

Wilkie et al. [52]
Name of CDSS: PAINReportIt
Sample: N = 213, of whom 106 were cancer inpatients, 10 were metastatic cancer outpatients, and 97 were individuals experiencing acute or chronic pain recruited from non-health care settings
Design: Descriptive, cross-sectional study in 3 settings: 1) tertiary care; 2) radiation oncology clinic; 3) mobile clinic
Outcomes Assessed: 13-item scale measuring acceptability of PAINReportIt (i.e., time to complete, ease of use, understandability of directions, ergonomic elements of system, and completeness of responses)
Results: 86% of respondents rated PAINReportIt as beneficial for pain communication. 100% patient completion rate. Mean time to patient completion: 15.8 minutes (SD ± 6.7). Mean patient acceptability score: 11.7 (SD ± 1.6); scores ranged from 6 to 13, and 80% of patients rated acceptability above the minimum criterion of 10. User comments: 1) some mechanical difficulties; 2) reactivity to use of system (e.g., vomiting); 3) preference to relay information directly to provider

Huang et al. [51]
Name of CDSS: PAINReportIt and PAINConsultN
Sample: Pilot study 1: N = 9 patients with bone metastasis-related pain. Pilot study 2: N = 15 patients with cancer and bone metastasis. Physician focus group: N = 4 radiation oncologists
Design: 1) Pilot test 1: feasibility study using a test–retest within-subject design; 2) Pilot test 2: feasibility study using an 11-day test–retest within-subject design; 3) physician focus group
Outcomes Assessed: For both pilot studies: acceptability; completeness; time to complete; validity. Physician focus group: receptivity to PAINReportIt and PAINConsultN
Results: Pilot study 1: mean time to complete PAINReportIt at pretest: 12 minutes (SD ± 4); time to complete at post-test: 1–7 minutes per patient; mean acceptability score: 11.2 (SD ± 1.8); 100% completion rate. Pilot study 2: mean time to complete PAINReportIt at pretest: 17 minutes (SD ± 6); time to complete at post-test: 14 minutes (SD ± 8); mean acceptability score: 12.2 (SD ± 1.3); 100% completion rate; PAINConsultN recommended a median of 4 drugs, and physicians prescribed a median of 3 drugs post-use; patient pain intensity averaged 4 at baseline and 2.7 at post-test (not significant). Focus group: physicians saw value in PAINReportIt in that it 1) increased efficiency during the clinic visit; 2) supplemented pain service consultation; and 3) provided outcome data. PAINConsultN was viewed as a clinical adjunct, but its formatting needed improvement

Koestler et al. [53]
Name of CDSS: Touch-screen Computer Assessment of Chronic Low Back Pain
Sample: N = 30 low back pain patients
Design: Cross-sectional design in tertiary care clinic
Outcomes Assessed: Patient ratings of ergonomic design, degree of technical difficulties, acceptability, ease of use, and data security
Results: Mean time to complete the 67-item touch screen: 27.4 minutes (SD ± 13.8)
Study: Nielsen et al. [47]
CDSS: Diagnostic Headache Diary
Sample: PCPs
Design: Not described
Outcomes assessed: % agreement of computer-generated vs expert physician diagnoses
Results: 100% agreement of computer-generated vs expert physician diagnoses

Study: Knab et al. [48]
CDSS: Pain Management Advisor
Sample: N = 50 PCPs; N = 50 chronic pain patients
Design: Longitudinal, with 12-month PCP follow-up
Outcomes assessed: Ease of use; medical appropriateness of recommendations; % physicians adopting Pain Management Advisor treatment recommendations; % patients referred to pain specialty clinic
Results: Physician adoption of system-generated recommendations in 85% of cases; average physician time spent per case: 4.9 minutes (SD ± 3.4); ease of use as rated by physicians: 4.2 (± 2.8 cm) on a scale of 1–10; 25% of patients referred to pain specialty clinic; average time to pain specialty referral: 3.7 months (SD ± 0.6); 70% of nonreferred patients still receiving computer-recommended treatment 1 year later

Study: Wilkie et al. [49]
CDSS: SymptomReport
Sample: N = 41 outpatients with cancer
Design: Cross-sectional; telephone interviews
Outcomes assessed: 13-item patient acceptability scale assessing ease of use of SymptomReport; input completion time; qualitative assessment of degree of communication with health care providers regarding pain and other symptoms
Results: Mean time to complete SymptomReport: 38.2 minutes (SD ± 20.2); mean time to complete SymptomConsult: 20.9 minutes (SD ± 18.6); 71% of participants rated SymptomReport as easy, enjoyable, and informative; 68% reported that the amount and content of their pain-related communication with their doctor had not changed much; qualitative patient comments: 1) helped them talk more explicitly about pain; 2) gave them greater awareness of pain symptoms; 3) increased understanding of and enhanced compliance regarding symptom management; patient comments regarding SymptomConsult: 1) contained too much information; 2) not targeted to individual needs; 3) provided no new information

Study: Wilkie et al. [52]
CDSS: PAINReportIt
Sample: N = 213, of whom 106 were cancer inpatients, 10 were metastatic cancer outpatients, and 97 were individuals experiencing acute or chronic pain recruited from nonhealth care settings
Design: Descriptive, cross-sectional study in 3 settings: 1) tertiary care; 2) radiation oncology clinic; 3) mobile clinic
Outcomes assessed: 13-item scale measuring acceptability of PAINReportIt (i.e., time to complete, ease of use, understandability of directions, ergonomic elements of system, and completeness of responses)
Results: 86% of respondents rated PAINReportIt as beneficial for pain communication; 100% patient completion rate; mean time to patient completion: 15.8 minutes (SD ± 6.7); mean patient acceptability score: 11.7 (SD ± 1.6), range 6–13; 80% of patients rated acceptability above the minimum criterion of 10; user comments: 1) some mechanical difficulties; 2) reactivity to use of system (e.g., vomiting); 3) preference to relay information directly to provider

Study: Huang et al. [51]
CDSS: PAINReportIt and PAINConsultN
Sample: Pilot study 1: N = 9 patients with bone metastasis-related pain; pilot study 2: N = 15 patients with cancer and bone metastasis; physician focus group: N = 4 radiation oncologists
Design: 1) Pilot test 1: feasibility study using a test–retest within-subject design; 2) pilot test 2: feasibility study using an 11-day test–retest within-subject design; 3) physician focus group
Outcomes assessed: For both pilot studies: acceptability, completeness, time to complete, validity; physician focus group: receptivity to PAINReportIt and PAINConsultN
Results: Pilot study 1: mean time to complete PAINReportIt at pretest: 12 minutes (SD ± 4); time to complete at posttest: 1–7 minutes per patient; mean acceptability score: 11.2 (SD ± 1.8); 100% completion rate. Pilot study 2: mean time to complete PAINReportIt at pretest: 17 minutes (SD ± 6); at posttest: 14 minutes (SD ± 8); mean acceptability score: 12.2 (SD ± 1.3); 100% completion rate; PAINConsultN recommended a median of 4 drugs, and physicians prescribed a median of 3 drugs postuse; patient pain intensity averaged 4 at baseline and 2.7 at posttest (not significant). Focus group: physicians saw value in PAINReportIt: 1) increased efficiency during clinic visit; 2) supplemented pain service consultation; 3) provided outcome data; PAINConsultN was viewed as a clinical adjunct, but formatting needed improvement

Study: Koestler et al. [53]
CDSS: Touch-screen Computer Assessment of Chronic Low Back Pain
Sample: N = 30 low back pain patients
Design: Cross-sectional design in tertiary care clinic
Outcomes assessed: Patient ratings of ergonomic design; degree of technical difficulties; acceptability; ease of use; data security
Results: Mean time to complete the 67-item touch screen: 27.4 minutes (SD ± 13.8)

PCP, primary care physician.

Patient acceptability of the CDSS was the single most commonly assessed variable. Evaluations of both the SymptomReport and the PAINReportIt employed a common tool to assess patient acceptability. Across four pilot tests involving a total of 254 subjects, results consistently showed high acceptability of these two CDSSs and 100% data-input completion rates [49,51,52]. Mean completion times ranged from 14 to 38 minutes.

Two studies addressed the issue of medical accuracy of the system-generated recommendation. Both studies examined this issue by comparing system-generated diagnoses and/or treatment recommendations with those generated by physician experts based on a select sample of patient cases. Results showed moderate to high agreement between the system- and expert-generated recommendations [47,48].

Clinician perceptions concerning the ease of use and value of a CDSS for chronic pain management were examined in two studies. Overall, physicians found the systems to be moderately easy to use and of some clinical worth [48,52]. Knab and colleagues [48] reported that clinicians spent an average of approximately 5 minutes (± 3.4 minutes) per case on the Pain Management Advisor (PMA) to obtain output. In addition, 85% of physicians adopted the PMA-generated recommendations, and 25% of the study patients seen were referred to a pain specialist, with an average time to referral of 3.7 months (± 0.6 months) [48].

Analyses concerning the impact of CDSSs on patient outcomes were limited. Huang and colleagues [51] assessed changes in pain intensity pre- and post-CDSS use in a sample of radiation oncology clinic patients. Although there was a downward trend in pain levels over time, results were not statistically significant, possibly due to the small size of the sample (N = 15). Wilkie and colleagues [49,52] reported qualitative data regarding the impact of the SymptomReport and the PAINReportIt on patient behavior. Results were contradictory. Of the 41 outpatients who used the SymptomReport, approximately 68% stated that it had not affected their pain-related communication. Some, however, felt that they talked more precisely and explicitly about their pain as a result of using it. Other comments associated with use of the SymptomReport included an increased awareness of pain symptoms and greater compliance with pain symptom management. In contrast, 86% of PAINReportIt users cited it as beneficial for patient–doctor pain-related communication, reporting that it "freed them to describe their pain."

Discussion

Over the past two decades, there has been a gradual but steady growth in research on the use of CDSSs for chronic pain management. To date, the number of such systems is small but expanding. Advances have also been evident in terms of both the quantity and quality of evaluation studies conducted. While the earliest versions were presented in the literature as prototypes only [45,46], CDSSs developed since 2001 have all undergone some form of field testing. The majority of these studies, however, have been nonexperimental in design, and focused exclusively on process measures, such as patient or clinician ratings of system acceptability and usability. Other salient process measures, such as the degree to which the clinician and/or patient actually reviewed and utilized system output, or had confidence in its accuracy, have not been consistently assessed. Poor usability and practitioner nonacceptance of computer recommendations can serve as significant barriers to system adoption in routine clinical practice [36,54,55].

User preferences regarding the presentation of computer output, including content, formatting (e.g., color, graphics), and length, have not been solicited in most instances either. Similarly, there are few published data concerning technical difficulties (e.g., type and number of system crashes or touch-screen calibration problems) encountered by CDSS users. Both issues have important ramifications for future system refinements [52]. Additionally, there is a paucity of information on contextual circumstances (e.g., presence of a local “champion” of the system) or the processes used to integrate the CDSS into the existing clinical workflow, key considerations for successful system implementation. Not least, testing has been confined almost exclusively to either inpatient or tertiary care settings, with only two studies conducted in the primary care context to date.

The effects of these systems on patient outcomes remain understudied. Two studies reported qualitative data concerning CDSS impact on patients' perceived pain-related communication with their physician; however, sample sizes were small and results inconsistent [49,52]. One study reported system impact on patient pain intensity level over time, but lacked adequate statistical power to detect clinically important differences [51]. Other major patient outcomes, such as health care utilization, health care costs, pain relief, pain medication usage, communication with health care providers about pain, functional status, and quality of life (QOL), have not been examined. One study reported evidence that CDSS use may invoke patient reactivity (e.g., vomiting, intensified pain symptoms); potential adverse effects on patients should be measured in future investigations [52]. Similarly, there is a need for more extensive and consistent examination of system impact on clinician pain management performance.

While we sought to be as comprehensive as possible in our literature search, our review was restricted to English-language studies. In addition, it is possible that there are other CDSSs under development that we failed to identify. The limited size of the available literature, as well as the methods used in these primary studies, prevented us from conducting a meta-analysis of research findings and from reaching more definitive conclusions about the impact of these systems on physician performance and patient pain, functioning, and other aspects of QOL. Notably, in all the studies we examined, the study investigators and the CDSS developers were one and the same, a fact that may have biased findings in a positive direction [36]. Lastly, we did not conduct a separate evaluation of the clinical appropriateness of either the CDSS algorithms or treatment recommendations, nor of the underlying logic employed to generate such algorithms.

The potential for these computer-based systems to improve the quality of chronic pain management in the primary care context is substantial. To manage chronic pain effectively, PCPs first need to conduct a comprehensive patient assessment [56]. Information on the patient's pain experience, history of and preferences for pain treatment, psychological status, approach to self-management, and personal goals/priorities are key variables to collect during assessment, as they are critical for making an accurate diagnosis and for developing an appropriate treatment plan to which the patient will adhere [56]. An expert system-type CDSS provides a way to elicit and integrate such patient-specific information in a manner that is convenient and timely for both physicians and patients. Moreover, the ensuing system-generated recommendations are individualized to the needs and circumstances of the specific pain patient per best clinical practices [56].
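The expert-system pattern described above — structured patient-specific input mapped through explicit clinical rules to individualized output — can be sketched in a few lines of code. The data fields, rules, and thresholds below are hypothetical illustrations of the general architecture only, not the logic of any CDSS in this review:

```python
# Minimal sketch of a rule-based (expert-system-type) CDSS.
# All fields and if-then rules are invented placeholders for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class PainAssessment:
    pain_intensity: int            # 0-10 patient self-report
    opioid_naive: bool             # no current/prior opioid use
    prefers_nondrug_options: bool  # stated treatment preference
    self_management_goal: str      # patient's own goal, in their words

def recommend(assessment: PainAssessment) -> List[str]:
    """Apply explicit if-then rules to one patient's assessment."""
    advice: List[str] = []
    if assessment.pain_intensity >= 7:
        advice.append("flag for urgent clinician review")
    if assessment.prefers_nondrug_options:
        advice.append("discuss exercise/CBT self-management options")
    if not assessment.opioid_naive:
        advice.append("review current opioid regimen before new prescriptions")
    # Individualization: the output echoes this patient's stated goal.
    advice.append(f"document patient goal: {assessment.self_management_goal}")
    return advice

patient = PainAssessment(pain_intensity=8, opioid_naive=True,
                         prefers_nondrug_options=True,
                         self_management_goal="walk 30 minutes daily")
print(recommend(patient))
```

The point of the pattern is that the recommendation logic is explicit and inspectable, and the output is tailored to the individual assessment rather than generic, which is what distinguishes this class of system from static patient-education material.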

CDSSs developed for chronic pain management have yet, however, to fulfill this promise. As our review indicates, systems developed thus far have been predominantly biomedical in focus, designed exclusively to assist physicians and other health care providers in the medical management of pain symptoms (including invasive procedures and referrals). Only a few of these systems have reached a sufficiently advanced stage of development to warrant more rigorous testing in large-scale, randomized controlled trials [26,49,52]. Such trials are imperative for understanding system effects on provider performance and patient outcomes.

Significantly, none of the systems reviewed were integrated with existing electronic records systems, nor did they include reminder or documentation functionalities, features that have been shown to increase the likelihood of physician adoption [57,58]. This lack of integration may reflect the fact that widespread adoption of electronic records systems by health care institutions is a relatively recent occurrence. This trend, coupled with pressure from major accrediting agencies to document the provision of pain screening and treatment, along with the recent publication of primary care pain management guidelines, may spur additional, more rigorous research on the use of CDSSs for chronic pain management in primary care [56,59,60]. Demonstrating the clinical value of these systems is a critical step in convincing health care organizations and clinicians that the benefits of investing in a CDSS for pain management outweigh the potential risks. In particular, physicians need to be assured that this type of system can enhance rather than erode their decision-making abilities, and that time spent learning to use a CDSS yields measurable improvement in patient health and well-being.

References

1. American Pain Society. Chronic pain in America: Roadblock to relief, 1999. APS News and Announcements. Available at: http://www.ampainsoc.org/whatsnew/toc_road.htm#toc.
2. Lande SD. The problem of pain in managed care. In: Lande SD, Kulich RJ, eds. Managed Care and Pain. Glenview, IL: American Pain Society; 2000:19.
3. Turner JA, LeResche L, Von Korff M, Ehrlich L. Primary care back pain patient characteristics, visit content and short term outcomes. Spine 1998;23:463–9.
4. Von Korff M, Gruman J, Schaefer J, Curry SJ, Wagner EH. Collaborative management of chronic illness. Ann Intern Med 1997;127(12):1097–102.
5. Von Korff M. Pain management in primary care: An individualized stepped-care approach. In: Gatchel RJ, Turk DL, eds. Psychosocial Factors in Pain: Critical Perspectives. New York: The Guilford Press; 1999.
6. Von Korff M, Katon W, Bush T, et al. Treatment costs, cost offset, and cost-effectiveness of collaborative management of depression. Psychosom Med 1998;60(2):143–9.
7. Bertakis KD, Azari R, Callahan EJ. Patient pain: Its influence on primary care physician–patient interaction. Fam Med 2003;35(2):119–23.
8. Green CR, Wheeler JRC, Laporte F, Marchant B, Guerrero E. How well is chronic pain managed? Who does it well? Pain Med 2002;3(1):56–62.
9. Von Roenn JH, Cleeland CS, Gonin R, Hatfield AK, Pandya KJ. Physician attitudes and practice in cancer pain management: A survey from the Eastern Cooperative Oncology Group. Ann Intern Med 1993;119:121–6.
10. Cleeland CS, Cleeland LM, Dar R, Rinehardt LC. Factors influencing physician management of cancer pain. Cancer 1986;58:796–800.
11. Fife BL, Irick N, Painter JD. A comparative study of the attitudes of physicians and nurses towards the management of cancer pain. J Pain Symptom Manage 1993;8:132–9.
12. Wilson JF, Brockoop GW, Kryst S, Steger H, Witt WO. Medical students' attitudes towards pain before and after a brief course on pain. Pain 1992;50:251–6.
13. Weinstein SM, Laux LF, Thornby JI, et al. Physicians' attitudes towards pain and the use of opioid analgesics: Results of a survey from the Texas Cancer Pain Initiative. South Med J 2000;93(5):479–87.
14. Weinstein SM, Laux LF, Thornby JI, et al. Medical students' attitudes towards pain and the use of opioid analgesics: Implications for changing medical school curriculum. South Med J 2000;93(5):472–8.
15. Whedon M, Ferrell BR. Professional and ethical considerations in the use of high-tech pain management. Oncol Nurs Forum 1991;18:1135–43.
16. Ward S, Goldberg N, Miller-McCauley V, et al. Patient-related barriers to management of cancer pain. Pain 1993;52:319–24.
17. Pargeon KL, Hailey BJ. Barriers to effective cancer pain management: A review of the literature. J Pain Symptom Manage 1999;18:358–68.
18. Breitbart W, Passik S, McDonald MV, et al. Patient-related barriers to pain management in ambulatory AIDS patients. Pain 1993;52:319–24.
19. Gunnarsdottir S, Donovan HS, Serlin RC, Voge C, Ward S. Patient-related barriers to pain management: The Barriers Questionnaire II (BQ-II). Pain 2002;99:385–96.
20. Arora NK. Interacting with cancer patients: The significance of physicians' communication behavior. Soc Sci Med 2003;57:791–806.
21. McCaffery M, Thorpe DM. Differences in perception of pain and the development of adversarial relationships among health care providers. In: Hill CS, Fields W, eds. Advances in Pain Research and Therapy: Drug Treatment of Cancer Pain in a Drug-Oriented Society, Vol. 11. New York: Raven Press; 1989.
22. Ong LML, De Haes JCJM, Hoos AM, Lammes FB. Doctor–patient communication: A review of the literature. Soc Sci Med 1995;40:903–18.
23. Jones WL, Rimer BK, Levy MH, Kinman JL. Cancer patients' knowledge, beliefs, and behavior regarding pain control regimens: Implications for education programs. Patient Educ Couns 1984;5(4):159–64.
24. Lukoschek P, Fazzari M, Marantz P. Patient and physician factors predict patients' comprehension of health information. Patient Educ Couns 2003;50:201–10.
25. Donovan JL, Blake DR. Patient non-compliance: Deviance or reasoned decision-making? Soc Sci Med 1992;34:377–94.
26. Garg AX, Adhikari NKJ, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: A systematic review. JAMA 2005;293(10):1223–38.
27. Nilasena DS, Lincoln MJ. A computer-generated reminder system improves physician compliance with diabetes preventive care guidelines. Proc Ann Symp Comput Appl Med Care 1995:640–5.
28. Chambers CV, Balaban DJ, Charlson BL, Grassberger DM. The effect of microcomputer-generated reminders on influenza vaccination rates in a university-based family practice center. J Am Board Fam Pract 1991;4:19–26.
29. Flanagan JR, Doebbeling BN, Dawson J, Beekmann S. Randomized study of online vaccine reminders in adult primary care. Proc AMIA Symp 1999:755–9.
30. Burack RC, Gimotty PA. Promoting screening mammography in inner-city settings: The sustained effectiveness of computerized reminders in a randomized controlled trial. Med Care 1997;35:921–31.
31. Rossi RA, Every NR. A computerized intervention to decrease the use of calcium channel blockers in hypertension. J Gen Intern Med 1997;12:672–8.
32. Montgomery AA, Fahey T, Peters TJ, MacIntosh C, Sharp DJ. Evaluation of computer based clinical decision support system and risk chart for management of hypertension in primary care: Randomized controlled trial. BMJ 2000;320:686–90.
33. Shea S, DuMouchel W, Bahamonde L. A meta-analysis of 16 randomized controlled trials to evaluate computer-based reminder systems for preventive care in the ambulatory setting. J Am Med Inform Assoc 1996;3:399–409.
34. Unrod M, Smith MY, DePue J, Spring B, Winkel G. Randomized controlled trial of a computer-based, tailored intervention to increase smoking cessation counseling by primary care physicians. J Gen Intern Med 2007;22:478–84.
35. Balas EA, Austin SM, Mitchell JA, et al. The clinical value of computerized information services: A review of 98 randomized clinical trials. Arch Fam Med 1996;5:271–8.
36. Hunt DL, Haynes RB, Hanna SE, Smith K. Effects of computer-based clinical decision support systems on physician performance and patient outcomes: A systematic review. JAMA 1998;280(15):1339–46.
37. Revere D, Dunbar PJ. Review of computer-generated outpatient health behavior interventions: Clinical encounters "in absentia." J Am Med Inform Assoc 2001;8(1):62–79.
38. Prochaska JO, Velicer WF, Redding C, et al. Stage-based expert systems to guide a population of primary care patients to quit smoking, eat healthier, prevent skin cancer, and receive regular mammograms. Prev Med 2005;41(2):406–16.
39. Prochaska JO, Velicer WF, Fava JL, Rossi JS, Tsoh JY. Evaluating a population-based recruitment approach and a stage-based expert system intervention for smoking cessation. Addict Behav 2001;26(4):583–602.
40. Strecher VJ, Kreuter M, Den Boer DJ, et al. The effects of computer-tailored smoking cessation messages in family practice settings. J Fam Pract 1994;39(3):262–70.
41. Strecher VJ, Shiffman S, West R. Randomized controlled trial of a web-based computer-tailored smoking cessation program as a supplement to nicotine patch therapy. Addiction 2005;100(5):682–8.
42. Dijkstra A, De Vries H, Roijackers J. Long-term effectiveness of computer-generated tailored feedback in smoking cessation. Health Educ Res 1998;13(2):207–14.
43. Dijkstra A, De Vries H, Roijackers J, Van Breukelen G. Tailored interventions to communicate stage-matched information to smokers in different motivational stages. J Consult Clin Psychol 1998;66(3):549–57.
44. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: A systematic review of trials to identify features critical to success. BMJ 2005;330:765–73.
45. Matsumura Y. RHINOS: A consultation system for diagnosis of headache and facial pain. Comput Methods Programs Biomed 1986;23:65–71.
46. Thomas J. IVAN: An expert system for pain control and symptom relief in advanced cancer. PC AI 1999;13(4):28–30.
47. Nielsen KD, Rasmussen C, Russell MB. The Diagnostic Headache Diary: A headache expert system. In: Paiva T, Penzel T, eds. European Neurological Network. Amsterdam: IOS Press; 2000.
48. Knab JH, Wallace MS, Wagner RL, Tsoukatos J, Weinger MB. The use of a computer-based decision support system facilitates primary care physicians' management of chronic pain. Anesth Analg 2001;93:712–20.
49. Wilkie DJ, Huang H, Berry DL, et al. Cancer symptom control: Feasibility of a tailored, interactive computerized program for patients. Fam Community Health 2001;24(3):48–62.
50. Im E, Chee W. Decision support computer program for cancer pain management. Comput Inform Nurs 2003;21(1):12–21.
51. Huang H, Wilkie DJ, Zong S, et al. Developing a computerized data collection and decision support system for cancer pain management. Comput Inform Nurs 2003;21(4):206–17.
52. Wilkie DJ, Judge MK, Berry DL, Dell J, Zong S, Gilespie R. Usability of a computerized PAINReportIt in the general public with pain and people with cancer pain. J Pain Symptom Manage 2003;25(3):213–24.
53. Koestler ME, Libby E, Schofferman J, Redmon T. Web-based screen computer assessment of chronic low back pain: A pilot study. Comput Inform Nurs 2005;23(5):275–84.
54. Wyatt JC, Spiegelhalter DJ. Evaluating medical expert systems: What to test and how? Med Inform (Lond) 1990;15(3):205–17.
55. Reisman Y. Computer-based clinical decision-aids: A review of methods and assessment of systems. Med Inform (Lond) 1996;21:179–97.
56. Gruener D, Lande SD, eds. Pain Control in the Primary Care Setting. Glenview, IL: American Pain Society; 2006.
57. Shiffman RN, Liaw Y, Brandt CA, Corb GJ. Computer-based guideline implementation systems: A systematic review of functionality and effectiveness. J Am Med Inform Assoc 1999;6:104–14.
58. Muller ML, Ganslandt T, Eich HP, Lang K, Ohmann C, Prokosch HU. Towards integration of clinical decision support in commercial hospital information systems using distributed, reusable software and knowledge components. Int J Med Inform 2001;63:369–77.
59. Phillips D, for the Joint Commission on Accreditation of Healthcare Organizations. JCAHO pain management standards are unveiled. JAMA 2000;284:428–9.
60. Frankenstein RS. Letters to the editor reply. JAMA 2000;284:2317–8.