Abstract

This article describes the results from a large, cross-sectional survey of social workers, psychologists, and licensed marriage and family therapists (LMFTs) in Texas (N = 865) regarding their orientation toward and implementation of evidence-based practice (EBP). All social workers were recruited by e-mail using the state NASW Listserv (analysis was limited to master's level social workers), whereas 500 psychologists and LMFTs were randomly selected from the state licensing lists for a postal mail survey. The Evidence-Based Practice Process Assessment Scale–Short Version was used, along with 10 background/demographic questions. Psychologists with doctoral degrees reported, on average, stronger orientations toward the EBP process than did social workers with master's degrees, but the effect sizes for these differences were typically weak to moderate. Social workers and LMFTs were, for most comparisons, similar in their orientations toward the EBP process. More recent social work graduates had more favorable views of the EBP process than less recent graduates. The results suggest that although the EBP process is not yet widely implemented in its entirety, there are grounds for optimism about master's level social workers’ engagement in that process and for their increased utilization of research.

Social workers and other helping professionals are now being urged in the professional literature and by third-party payers to engage in evidence-based practice (EBP). Indeed, we live in an era when the term EBP is virtually ubiquitous in our literature, pervading recently published journal issues, titles of books on practice, and professional newsletters. It is not surprising that this EBP tidal wave has irked quite a few scholars and practitioners, who have criticized EBP on many grounds, including (among others) epistemological arguments, concerns about its deleterious impact on practice, and skepticism about pragmatic obstacles (Adams, LeCroy, & Matto, 2009; Bolen & Hall, 2007; Graybeal, 2007; Manuel, Mullen, Fang, Bellamy, & Bledsoe, 2009; Mullen, 2004; Mullen, Bledsoe, & Bellamy, 2008; Nelson, Steele, & Mize, 2006; Rubin & Babbie, 2010; Rubin & Parrish, 2007).

Advocates of EBP have responded to its critics, arguing that most of them misconstrue the EBP process and proposing ways to alleviate the feasibility barriers to EBP (Gibbs & Gambrill, 2002; Lilienfeld, 2007; Manuel et al., 2009; Mullen, 2004; Parrish & Rubin, 2011a; Rubin & Babbie, 2010; Thyer, 2004). Some have expressed optimism about the prospects for practitioner acceptance of and engagement in EBP, citing widespread Internet access to practice resources and research (Crisp, 2004; O'Neill, 2003; Shlonsky & Gibbs, 2006), the proliferation of systematic reviews and meta-analyses on effective interventions (de Smidt & Gorey, 1997; Gorey & Thyer, 1998; Kirk & Reid, 2002; Weissman et al., 2006), and the inclusion in the NASW Code of Ethics (NASW, 2000) of a statement that social workers should base their practice on empirically based knowledge (see section 4.01c).

Nevertheless, there remain ample grounds for skepticism about the extent of social work practitioner acceptance of and engagement in the EBP process, based not only on the criticisms mentioned earlier, but also on evidence emanating from older studies indicating that social work practitioners rarely used research evidence to guide their practice (Kirk & Reid, 2002; Mullen & Bacon, 2004; Pignotti & Thyer, 2009; Sanderson, 2002) and instead tended to rely on supervisors or colleagues or their own experiences with treatment (Bilsker & Goldner, 2004; Mullen & Bacon, 2004; Nelson et al., 2006). Recognizing that the EBP process can be successfully implemented in social work only if practitioners believe it is both important and feasible, the current study assesses practitioners’ views of and engagement in the EBP process.

It is important to distinguish the singular term, the EBP “process,” from the plural term, “evidence-based practices” (or, more correctly, empirically supported treatments). The latter have been defined as “any practice that has been established as effective through scientific research according to a clear set of explicit criteria” (Mullen & Streiner, 2004, p. 113). In contrast, the EBP process has been defined by Sackett, Rosenberg, Gray, Haynes, and Richardson (1996) as the “conscientious, explicit and judicious use of current best evidence in making decisions about the care of individuals [clients]” (p. 71) and “the integration of best research evidence with clinical expertise and [client] values” (Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000, p. 1). This process involves five steps: (1) formulating an answerable practice question; (2) searching for the best research evidence; (3) critically appraising the research evidence; (4) selecting the best intervention after integrating the research evidence with client characteristics, preferences, and values; and (5) evaluating practice decisions (Mullen, 2006; Shlonsky & Gibbs, 2006; Thyer, 2004).

Although practitioners’ attitudes regarding the EBP process have not yet been assessed, Mullen and Bacon (2004) compared the implementation of practice guidelines and empirically supported interventions among a sample of psychiatrists, psychologists, and social workers in a large agency that employed 500 staff engaged in the provision of clinical services. At the time of that survey, social workers were identified as being less knowledgeable about (though open to) practice guidelines, less likely to use research findings or methods in their practice, and less likely to report reading the research or other professional literature when compared with the other disciplines (Mullen & Bacon, 2004). In contrast, social workers were reported as relying more heavily on consultation with colleagues or peers than the other professions (Mullen & Bacon, 2004). This is the only study that has compared social work with other allied professions regarding the use of research and empirically supported interventions in practice, and its comparisons with social workers were limited to disciplines that had received more graduate training (for example, psychiatrists, psychologists).

The current study reports on a large, statewide sample of master's level social workers and a random sample of licensed marriage and family therapists (LMFTs) and psychologists in Texas regarding their views of and engagement in the EBP process. Its primary aims were to answer the following three questions: (1) How do social workers compare with LMFTs and psychologists with regard to their orientations toward the EBP process? (2) How often do social workers, LMFTs, and psychologists engage in various steps of the EBP process? and (3) Is there a difference between recent MSW graduates and those who have been practicing in the field longer with regard to orientation toward the EBP process? Orientation toward the EBP process is measured by assessing self-efficacy, attitudes, perceived feasibility, and self-reported behaviors related to the EBP process. More specifically, we propose, on the basis of prior psychometric research (Parrish & Rubin, 2011b; Rubin & Parrish, 2010, 2011), that the sum of these four constructs gives an overall sense of practitioners' orientation, or overall position, regarding the EBP process. The rationale for comparing social workers to the other two professions is that it provides perspective for interpreting social workers' orientations. For example, if the proportions of social workers, LMFTs, and psychologists who engage in the EBP process are about the same, then even if those proportions are small, at least we could say that social workers do not differ from the other professions regarding engagement in the EBP process. The rationale for comparing social workers who earned their MSW degrees more recently with those who have been in the field longer is based on the notion that more recent graduates are more likely to have had exposure to EBP in their MSW courses than those who were educated in the pre-EBP era. Also, prior research has shown that practitioners with less experience tend to be more open to using empirically supported interventions (Aarons, 2004).

Method

Sample

The study sample was drawn from all NASW members in Texas with a current e-mail address on file with the state NASW office and from the LMFTs and psychologists on Texas licensing lists. The link to the online survey was embedded in an e-mail sent to all NASW members with current e-mail addresses. There were approximately 3,297 master's level NASW members who were not retired or unemployed at the time of the study. The exact sampling frame is difficult to estimate because of potentially expired e-mail addresses, spam/junk mail filtering, inaccurate e-mail addresses, full e-mail boxes, and members who do not routinely use or check their e-mail. Because of NASW privacy requirements, the e-mails came directly from the NASW office; consequently, a Listserv e-mail function was used, which precluded tracking of the delivery problems just mentioned. Some potential respondents, however, contacted the NASW executive director or the researcher with questions or concerns regarding the survey. Some shared that they experienced technical difficulties accessing the survey (n = 3) or did not feel knowledgeable enough about the topic to respond (n = 4). One had moved to another state, one shared that he or she did not have time to participate, and a handful thought the survey did not apply to their practice because they did not work in a clinical setting (n = 6). Assuming a very conservative sampling frame of 3,297 (with retired or unemployed members removed, but with no deduction for the aforementioned nonresponse reasons or technology difficulties), the 688 master's level NASW members who completed the online questionnaire represented a 21 percent response rate.

Given that the e-mail addresses of LMFTs and psychologists were not readily available in Texas and postal mail surveys are more expensive than online surveys, only 500 LMFTs and psychologists were randomly selected from state licensing lists to participate. There were more psychologists (N = 3,709) than LMFTs (N = 2,765) on the state licensing lists, so the subgroups were sampled proportionate to size using systematic random sampling—with 57 percent of the sampling frame originating from the psychologist list (n = 285) and 43 percent coming from the LMFT list (n = 215). During the first two postal mailings, 13 psychologists and 30 LMFTs indicated their preference not to participate in the study by returning a prepaid, anonymous postcard. Completed survey questionnaires were received from 125 psychologists and LMFTs in the first mailing, and an additional 40 from the second mailing. The remainder of psychologist and LMFT responses (n = 15) were submitted online (they were provided with the link to the online survey in the cover letter). Neither the subscale scores nor the overall scale score differed between respondents to the first postal mailing and the second postal mailing, which may help to offset potential concerns about nonresponse bias.

A final (third) postal mailing included only a cover letter and prepaid postcard to assess reasons for nonresponse among psychologists and LMFTs who had not responded to the first two mailings. In addition to assessing reasons for nonresponse, this final mailing served as a last reminder to send in the questionnaire, while saving the costs associated with mailing the entire questionnaire packet again. Of the remaining 283 potential participants, 17 psychologist envelopes and five LMFT envelopes were returned due to bad addresses, leaving 261 nonrespondents (and a smaller sampling frame). Of these 261, 50 percent (n = 131) returned the postcard with a reason for nonresponse. As shown in Table 1, only seven LMFTs and four psychologists reported that their decision not to respond was due to a skeptical or negative view of EBP. This somewhat allays concerns that nonrespondents might have had more negative views of the EBP process than did respondents. In contrast, the most frequent reason for nonresponse was lack of time. Given the adjusted sampling frame of 478 (500 less the 22 envelopes returned as undeliverable in the third mailing), the final response rate was 40 percent for the psychologists (108/268) and 33 percent for the LMFTs (69/210).
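For readers who want to verify the response-rate arithmetic, the figures reported above can be reproduced in a few lines of Python. This is a sketch added for illustration only; the helper function is ours and was not part of the original analysis.

```python
def response_rate(completed, sampling_frame):
    """Return the response rate as a whole percentage."""
    return round(completed / sampling_frame * 100)

# Social workers: 688 completed questionnaires against a conservative
# frame of 3,297 master's level NASW members
sw_rate = response_rate(688, 3297)       # 21 percent

# Psychologists: frame of 285 minus 17 undeliverable envelopes = 268
psy_rate = response_rate(108, 285 - 17)  # 40 percent

# LMFTs: frame of 215 minus 5 undeliverable envelopes = 210
lmft_rate = response_rate(69, 215 - 5)   # 33 percent

print(sw_rate, psy_rate, lmft_rate)
```

The psychologist and LMFT denominators subtract the envelopes returned as undeliverable from the third mailing, matching the adjusted sampling frame of 478 described above.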

Table 1:

Number and Percentage of Sampled Nonrespondents’ Reasons for Nonresponse (Postal Mail Only)

Response                                        Psychologists    LMFTs        Total
                                                n      %         n     %      n     %
Not enough time                                 21     37.5      14    25.0   35    31.3
On leave                                        5      8.9       5     8.9    10    8.9
Retired                                         7      12.5      8     14.3   15    13.4
Did not think it applied to me or my practice   12     21.4      11    19.6   23    20.5
Skeptical/negative view of EBP                  4      7.1       7     12.5   11    9.8
Skeptical of survey methodology                 1      1.8       6     10.7   7     6.3
I believe I have already responded              6      10.7      5     8.9    11    9.8
Total                                           56               56           112

Note: LMFT = licensed marriage and family therapist. EBP = evidence-based practice.

Data Collection

The social work sample, invited to participate by e-mail, was surveyed using an online survey link created in Survey Monkey that was embedded in each e-mail. At approximately two- to three-week intervals, two additional follow-up e-mails were sent. Within the body of each e-mail, the purpose and importance of the survey were explained, and potential participants were informed of the anonymous nature of data collection.

Given that e-mail addresses were not available for the LMFT and psychologist subsamples, they were surveyed by postal mail and given the option to complete and return the hard copy of the questionnaire or respond online via the Web site address included in the cover letter. Fifteen LMFTs and psychologists completed the online survey. To preserve the anonymity of postal mail responses while still tracking who had responded and who did not wish to respond or receive future mailings, a prepaid postcard (returned separately from the questionnaire) was included in each mailing; those who responded by postcard were removed from subsequent follow-up mailings. Two full postal mailings went out to the LMFTs and psychologists, followed by a third mailing that included only a prepaid postcard asking for specific reasons for nonresponse.

All potential participants received, by e-mail or postal mail, a cover letter describing the research and informing them of the anonymous and voluntary nature of their participation as well as the importance of their participation. The questionnaire included the Evidence-Based Practice Process Assessment Scale (EBPPAS)–Short Version and 10 background/demographic questions. The background/demographic questions asked about age, gender, ethnicity, prior training and exposure to the EBP process, practice licensure, education, and number of years in practice. As reported by Parrish and Rubin (2011b), the EBPPAS–Short Version (derived from the original EBPPAS; Rubin & Parrish, 2010, 2011) has excellent internal consistency reliability (α = .94), criterion validity, and construct and factorial validity. The EBPPAS–Short Version comprises 37 items rated on a five-point Likert scale, organized into four subscales: Familiarity/Self-efficacy with EBP (10 items), Attitudes toward EBP (14 items), Perceived Feasibility of EBP (five items), and Actual Self-reported EBP Behaviors (eight items). Higher scores indicate a more favorable response on each subscale and on the overall scale score, which sums the subscales to describe overall “orientation toward EBP.”

Data Analysis

SPSS 18.0 was used to run both descriptive and inferential statistics. Specified reverse-scored items were recoded, and missing data were analyzed. The extent of missing data was well within acceptable levels (0 to 4.5 percent), did not differ by discipline [χ2(2) = 1.068, p = .586], and was found to be missing completely at random, so mean imputation methods were used to preserve sample size (Tabachnick & Fidell, 2007). To minimize the chance of a Type I error, a one-way multivariate analysis of variance (MANOVA) was run to compare the three disciplines (social workers, psychologists, and LMFTs) on each of the four subscales (self-efficacy, attitudes, perceived feasibility, and behaviors) as well as the overall scale score (orientation toward EBP); Bonferroni post hoc tests were used to determine between-group differences. The assumptions of normality, homogeneity of variance–covariance matrices, linearity, and multicollinearity were satisfactory. Chi-square tests of independence were used to compare the three disciplines regarding the frequency of the eight behavioral items from the behaviors subscale. To simplify this analysis, the five-point scale was collapsed into two categories: very often/often versus the less frequent categories. Independent t tests (using a Bonferroni correction) were used to compare master's level social workers with five years or less of experience with those with six or more years of experience on subscale and overall scale scores.
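As an illustration of the chi-square test of independence described above, the statistic for gender by discipline can be reproduced in a few lines of pure Python from the counts reported in Table 2. This is a sketch for illustration only; the study itself ran these analyses in SPSS 18.0.

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    return sum(
        (obs - row_totals[i] * col_totals[j] / n) ** 2
        / (row_totals[i] * col_totals[j] / n)
        for i, row in enumerate(table)
        for j, obs in enumerate(row)
    )

# Gender counts from Table 2 (rows: female, male; columns: social work,
# psychology, LMFT). df = (2 - 1) * (3 - 1) = 2.
gender_table = [[556, 56, 44], [126, 44, 21]]
print(round(chi_square(gender_table), 2))  # 36.37, matching Table 2

# Bonferroni-corrected alpha for the five experience-group t tests
bonferroni_alpha = .05 / 5  # .01, as reported under Question 3
```

Running this reproduces the chi-square of 36.37 reported for gender in Table 2, which is a useful sanity check on the tabled counts.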

Results

Sample Characteristics

Overall, the sample was primarily female (77.4 percent) and white or Caucasian (79.3 percent), with an average age of 51 years (SD = 12.58). The three disciplines did differ with regard to age, self-reported prior courses/continuing education and exposure to EBP, years of practice experience, and gender (see Table 2). It is not surprising that the social work sample had a much higher proportion of women and reported more prior courses on EBP (many schools of social work within Texas have increased the emphasis on EBP in their curricula). Age differed between LMFTs (M = 54.40, SD = 12.68) and social workers (M = 51.26, SD = 12.03), but all three disciplines reported average ages in the early to mid-fifties.

Table 2:

Background Characteristics, by Discipline

                                                     Social Work (n = 688)   Psychology (n = 108)   LMFT (n = 69)
Characteristic                                       M        SD             M        SD             M        SD       F        p
Age                                                  51.26    12.03          53.57    11.41          54.40    12.68    3.25     .039
Number of prior continuing education courses on EBP  1.97     4.51           2.25     4.19           3.17     10.31    1.53     .217
Number of prior courses on EBP                       1.25     2.71           0.74     2.34           0.31     1.03     4.64     .01
Years of practice experience                         17.34    11.60          19.98    10.46          20.68    8.89     4.66     .01

                                                     n        %              n        %              n        %        χ2       p
Gender
 Female                                              556      81.5           56       56.0           44       67.7     36.37    <.001
 Male                                                126      18.5           44       44.0           21       32.3
Ethnicitya
 White/Caucasian                                     521      76.4           93       93.9           55       84.6
 African American                                    50       7.3                                    2        3.1
 Hispanic                                            81       11.9           5        5.1            6        9.2
 Asian/Pacific Islander                              13       1.9
 American Indian/Alaskan Native                      6        0.9            1        1.0            1        1.5
Prior continuing education on EBP
 Yes                                                 277      40.7           54       54.0           27       41.5     6.30     .043
Prior courses on EBP
 Yes                                                 275      40.2           32       32.3           13       20.3     11.27    .004
Self-reported prior training in EBP
 Quite a bit                                         101      14.8           28       28.6           11       16.9     18.31    .005
 Some                                                241      35.4           31       31.6           16       24.6
 Very little                                         244      35.8           33       33.7           25       38.5
 None                                                95       14.0           6        6.1            13       20.0

Notes: LMFT = licensed marriage and family therapist. EBP = evidence-based practice. The means in the age row differ significantly between the social work sample and the LMFT sample. The means in the number of prior courses row differ between the social work sample and the LMFT sample. The means in the years of practice experience row differ between the social work sample and the psychologist sample, and the social work sample and the LMFT sample.

aEthnicity had too many cells with an expected frequency less than 5 to report nonparametric statistics.

Within the social work (master's degree only) NASW Texas sample, the mean age was 51, compared with a mean age of 45 among all NASW members nationally, which includes students and bachelor's level social workers (NASW, 2008). The difference in age may be attributed to the fact that the study social work sample consisted of master's level social workers and did not include students and bachelor's level practitioners. In addition, 82 percent of the study social work sample was female, compared with 83 percent of national NASW members (Stoesen, 2008). Our social work sample was more diverse than the national NASW member sample, with 12 percent of our sample reporting Hispanic/Latino origins, compared with 2 percent in the national sample, which may be a function of regional differences.

The psychologist respondents had a mean age of 54, compared with a mean age of 54 reported in recent national American Psychological Association (APA) statistics (APA, 2010). The proportions of male and female psychologists reported from the national APA member database were the same as in our psychologist subsample, with 56 percent women and 44 percent men (APA, 2010). Our psychologist respondents were more likely to be white (94 percent) than was the national APA membership (65 percent). However, 28 percent of the APA membership sample was “not specified” for race/ethnicity, which made this comparison difficult. Unfortunately, there were no data that the authors could locate that described the characteristics of LMFTs on a national or state level.

Question 1: How do social workers compare with LMFTs and psychologists with regard to their orientations toward the EBP process?

A one-way MANOVA was performed on five dependent variables: self-efficacy, attitudes, feasibility, behaviors, and overall EBPPAS scale score. The independent variable was discipline (social work, psychology, and LMFT). Based on Wilks’ criterion, the combined dependent variables were significantly affected by discipline [F(8, 1710) = 10.62, p < .001]. The results of post hoc analyses of mean differences and effect sizes are listed in Table 3 (effect sizes are listed in the notes section). Although not surprising given the difference in years of training and exposure to research courses, psychologists (M = 40.50, SD = 5.83) reported higher levels of self-efficacy with the EBP process than both social workers (M = 35.70, SD = 7.75) and LMFTs (M = 36.26, SD = 7.92), and these differences were strong; social workers and LMFTs did not differ with regard to self-efficacy. Psychologists (M = 132.25, SD = 19.02) also scored significantly higher on the overall scale score (orientation toward EBP) than social workers (M = 125.09, SD = 20.07) or LMFTs (M = 124.56, SD = 20.52), although these differences were not strong. Again, social workers and LMFTs did not differ in their overall orientation toward EBP.

Table 3:

Mean Differences between Disciplines on Scale and Subscale Scores

                         Social Work (n = 688)   Psychology (n = 108)   LMFT (n = 69)
Scale                    M        SD             M        SD             M        SD       F        p
Self-efficacy            35.70    7.75           40.50    5.83           36.26    7.92     18.19    <.001
Attitudes                49.95    7.99           48.52    9.51           47.44    7.96     3.94     .020
Perceived feasibility    15.98    3.36           17.59    3.56           16.77    2.78     11.44    <.001
Self-reported behaviors  23.46    6.65           25.64    5.83           24.08    6.13     5.12     .006
Orientation toward EBP   125.09   20.07          132.25   19.02          124.56   20.52    5.97     .003

Notes: LMFT = licensed marriage and family therapist. EBP = evidence-based practice. Bonferroni post hoc tests were used to assess the differences between disciplines on each of the subscales and the overall scale. The means in the self-efficacy row differ significantly between the social work sample and the psychologist sample (d = .71) and between the LMFT and psychologist samples (d = .62). The means in the attitudes row differ significantly between the social work and LMFT samples (d = .32), but this difference was nonsignificant after controlling for demographics, prior exposure to EBP, and length of practice. The means in the perceived feasibility row differ significantly between the social work sample and the psychologist sample (d = .47). The means in the self-reported behaviors row differ significantly between the social work sample and the psychologist sample (d = .35). The overall scale scores differed significantly between the psychologist and social work samples (d = .37) and between the psychologist and LMFT samples (d = .39) with regard to overall orientation toward EBP.

With regard to attitudes, social workers (M = 49.95, SD = 7.99) had significantly more positive attitudes toward the EBP process than LMFTs (M = 47.44, SD = 7.96), but the magnitude of this difference was not large (d = 0.32), and it became nonsignificant when controlling for demographic factors, length of practice, and prior training in EBP using both a multivariate regression analysis and an analysis of covariance (p > .05). Psychologists (M = 17.59, SD = 3.56) had higher scores with regard to the perceived feasibility of EBP than social workers (M = 15.98, SD = 3.36), and the effect size for this difference was moderate (d = 0.47). Social workers and LMFTs did not differ with regard to perceived feasibility. The mean score for self-reported EBP behaviors was higher for psychologists (M = 25.64, SD = 5.83) than for social workers (M = 23.46, SD = 6.65), but with a somewhat weak effect size (d = 0.35). There was no difference between social workers and LMFTs regarding behaviors. All other significant differences between disciplines held, even after controlling for demographic factors, length of practice, and prior training in EBP.
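The effect sizes reported here are Cohen's d values. A pooled-standard-deviation version of d can be approximately recovered from the summary statistics in Table 3; the sketch below is ours, added for illustration, and small discrepancies with the published values reflect rounding and the exact d formula the authors used.

```python
from math import sqrt

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d with the pooled standard deviation in the denominator."""
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    return (m1 - m2) / sqrt(pooled_var)

# Attitudes: social workers (n = 688) vs. LMFTs (n = 69), from Table 3
d_attitudes = cohens_d(49.95, 7.99, 688, 47.44, 7.96, 69)
print(round(d_attitudes, 2))  # 0.31, within rounding of the reported d = .32
```

Conventional benchmarks treat d near 0.2 as weak, 0.5 as moderate, and 0.8 as strong, which is the scale the comparisons above use.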

Given the large differences in gender between the three disciplines, a multivariate analysis of covariance was run in which gender served as the covariate for the aforementioned analysis. Controlling for gender did not significantly alter the aforementioned results.

Question 2: How often do social workers, LMFTs, and psychologists engage in various steps of the EBP process?

The EBPPAS behaviors subscale assesses seven behaviors related to the EBP process, followed by a final question on the implementation of all steps of the EBP process together. These items and a comparison of the frequency (“very often” or “often”) with which social workers, psychologists, and LMFTs engage in these behaviors are displayed in Table 4. As shown in the table, and consistent with the aforementioned findings, psychologists tended to report reading and relying on research evidence or practice guidelines more frequently than social workers or LMFTs. However, 27.8 percent of social workers reported “relying on research evidence as the best guide for making practice decisions” often or very often (compared with 21.5 percent of LMFTs and 38.5 percent of psychologists). Approximately 15 percent of social workers and LMFTs and 22 percent of psychologists reported implementing all steps of the EBP process often or very often. Approximately half of the social work respondents reported evaluating their practice often or very often (compared with 56 percent of psychologists and 68 percent of LMFTs). In light of the relatively high percentages regarding evaluating practice, it is conceivable that this item was interpreted by some respondents to mean any type of practice evaluation, perhaps including unsystematic evaluations based on subjective clinical judgments.

Table 4:

Cross-Tabulation of “Often” or “Very Often” Responses to Behavioral Scale Items, by Discipline

                                                                                    Social Work (n = 688)   Psychology (n = 108)   LMFT (n = 69)
Response                                                                            n      %                n      %                n      %       χ2       p
I use the Internet to search for the best evidence to guide my practice decisions.  268    32.8             45     43.3             19     29.2    5.11     .078
I read about research evidence to guide my practice decisions.                      309    37.8             60     57.7             27     42.2    15.27    <.001
I read research-based practice guidelines to guide my practice decisions.           271    33.2             46     44.2             19     30.2    5.46     .065
I rely on research evidence as the best guide for making practice decisions.        226    27.8             40     38.5             14     21.5    6.78     .034
I inform clients of the degree of research evidence supporting alternative
 intervention options.                                                              208    25.6             38     36.9             21     32.3    6.81     .033
I involve clients in deciding whether they will receive an intervention
 supported by the research evidence.                                                219    27.0             29     28.4             17     26.6    0.101    .951
I evaluate the outcomes of my practice decisions.                                   398    49.1             58     56.3             44     67.7    9.53     .009
I engage in all steps of the EBP process.                                           122    15.1             22     21.6             10     15.4    2.84     .242

Notes: LMFT = licensed marriage and family therapist. EBP = evidence-based practice.

Question 3: Is there a difference between recent MSW graduates and those who have been practicing in the field longer with regard to orientation toward EBP?

Independent t tests were used to compare master's level social workers who had recently begun practicing (zero to five years, n = 151) with those who had been in practice longer (six years or more, n = 523), to determine whether those who more recently obtained their degrees are more open to the EBP process. As shown in Table 5, after a Bonferroni correction (.05/5 tests = .01), there was a significant difference with a moderate effect size (d = .66), with more recent graduates (M = 53.56, SD = 6.15) reporting more positive attitudes toward EBP than earlier graduates (M = 48.80, SD = 8.20). Recent graduates (M = 129.77, SD = 17.02) also had a more favorable overall orientation toward EBP than earlier graduates (M = 123.54, SD = 20.63, d = .33). More recent graduates, however, did not differ significantly from earlier graduates with regard to self-efficacy, perceived feasibility, or the implementation of EBP in practice.

Table 5:

Mean Differences between Social Workers, by Length of Practice

Variable M SD t (df) p d
Familiarity
 0–5 years 36.99 7.11 2.50 (672) .142 .24
 6 years or more 35.21 7.91
Attitudes
 0–5 years 53.56 6.15 7.61 (314.12) <.001 .66
 6 years or more 48.80 8.20
Feasibility
 0–5 years 15.67 3.58 –1.23 (664) .221 .11
 6 years or more 16.06 3.30
Behaviors
 0–5 years 23.56 6.12 .424 (672) .671 .04
 6 years or more 23.30 6.80
Orientation to EBP (Scale)
 0–5 years 129.77 17.02 3.69 (283.14) <.001 .33
 6 years or more 123.54 20.63

Note: EBP = evidence-based practice.
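The Attitudes comparison in Table 5 can be reproduced approximately from the summary statistics alone (the fractional df indicate a Welch unequal-variances t test, and the reported d is consistent with dividing the mean difference by the root mean of the two variances). A minimal sketch; the computed t and df differ slightly from the published 7.61 (314.12) because the tabled means and SDs are rounded:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and approximate df for unequal variances."""
    se1, se2 = s1**2 / n1, s2**2 / n2
    t = (m1 - m2) / math.sqrt(se1 + se2)
    df = (se1 + se2) ** 2 / (se1**2 / (n1 - 1) + se2**2 / (n2 - 1))
    return t, df

def cohens_d(m1, s1, m2, s2):
    """Cohen's d using the root mean of the two group variances."""
    return (m1 - m2) / math.sqrt((s1**2 + s2**2) / 2)

# Attitudes subscale, summary statistics from Table 5
t, df = welch_t(53.56, 6.15, 151, 48.80, 8.20, 523)
d = cohens_d(53.56, 6.15, 48.80, 8.20)
alpha = .05 / 5  # Bonferroni correction for the five tests in Table 5
print(round(t, 2), round(df, 1), round(d, 2), round(alpha, 3))
# ≈ 7.73, 319.3, 0.66, 0.01 — d matches the table; t and df are close
```

Because t greatly exceeds any plausible critical value at the corrected alpha of .01, the Attitudes difference survives the Bonferroni correction, as reported in the text.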

Discussion

Those who have championed the EBP process in social work education and practice during the last decade may find some grounds for encouragement in our results regarding social workers’ orientations toward and engagement in this process. For example, the social workers in our sample were at least as positive as the LMFTs regarding EBP on every measure. Also, although psychologists were more strongly oriented toward the EBP process overall than both social workers and LMFTs, the effect size was weak. The weakness of this difference is particularly noteworthy in light of the fact that master's degree level social workers were being compared with psychologists with doctoral degrees and more extensive training in research. Moreover, the psychologists did not differ from the social workers with regard to their attitudes toward EBP. The strongest effect sizes were identified in comparisons of psychologists with both LMFTs and social workers with regard to their self-efficacy with EBP, which makes sense in light of the additional years of training in research that psychologists receive within their doctoral educational programs.

Likewise, those who have called for greater utilization of research to inform practice might also be encouraged by the finding that 38 percent of the social workers in our sample reported that they read research evidence to guide practice decisions “often or very often” and that 28 percent of them reported “relying on research evidence as the best guide for making practice decisions” often or very often. This finding contrasts with the findings of past studies that concluded that social work practitioners rarely use research to guide their practice (Kirk & Reid, 2002; Mullen & Bacon, 2004). The notion that progress is being made in research utilization (and thus in the implementation of the EBP process, which emphasizes the integration of the best available research with practitioner expertise and client values/culture) was further supported by the finding that the social workers in our sample who earned their MSW degrees within the past five years were more likely to report more positive attitudes toward EBP than their counterparts who earned their degrees longer ago.

Although some might find these findings encouraging, they might also see the need for improvement, as less than half of our social work sample reported engaging in the EBP process often. Moreover, given the social work profession's valuing of empowerment, self-determination, and informed consent, some might be concerned that only a little over 25 percent of our social work sample reported that they often inform clients of the research associated with disparate interventions and involve clients in collaborative decision making regarding the selection of an intervention.

Several limitations preclude generalizing our findings to the population of social workers. Feasibility constraints required that we limit our sample to practitioners in Texas. Although practitioners in Texas may be unlike practitioners elsewhere, we know of no reason to suppose that their attitudes toward and engagement in the EBP process should be more positive than those of practitioners in most other states. Our sample was also limited by the large number of nonresponders. It is conceivable that those who responded to the survey may have been more likely than nonresponders to have favorable views of EBP and be more likely to engage in it. However, this concern is offset somewhat by the responses to our follow-up survey of nonresponders, which found that only a tiny proportion of nonresponders said that their views of EBP were the reason they did not respond to the survey. Moreover, there is evidence to suggest that response rates to surveys in general are decreasing. A recent special issue of the Public Opinion Quarterly (“Special Issue: Nonresponse Bias,” 2006) was devoted to an analysis of the decline in response rates in household surveys (Singer, 2006). Its many analyses showed that lower response rates do not always produce inaccurate estimates of study populations. Likewise, Sheehan (2001) found that response rates to e-mail surveys are decreasing, possibly due to the large number of unsolicited e-mails that individuals receive within the workplace each day.

Our findings also should be considered in light of the fact that—as with all surveys—what people say may not match what they do. However, this potential social desirability bias would have applied to older surveys of social work practitioners, as well, and those surveys found much less reported utilization of research among social work practitioners than did our study. Therefore, despite the aforementioned limitations, our study provides encouraging findings regarding the progress within the field of social work in closing the gap between research and practice and in practitioner views of and engagement in EBP.

References

Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research, 6, 61–74.
Adams, K. B., LeCroy, C. W., & Matto, H. C. (2009). Limitations of evidence-based practice for social work education: Unpacking the complexity. Journal of Social Work Education, 45, 165–186.
American Psychological Association. (2010). Demographic characteristics of APA members by membership status, 2009: Center for Workforce Studies. Washington, DC: Author.
Bilsker, D., & Goldner, E. M. (2004). Teaching evidence-based practice: Overcoming barriers. Brief Treatment and Crisis Intervention, 4, 271–275.
Bolen, R. M., & Hall, J. C. (2007). Managed care and evidence-based practice: The untold story. Journal of Social Work Education, 43, 463–479.
Code of ethics of the National Association of Social Workers. (2000).
Crisp, B. R. (2004). Evidence-based practice and the borders of data in the global information era. Journal of Social Work Education, 40, 73–85.
de Smidt, G. A., & Gorey, K. M. (1997). Unpublished social work research: Systematic replication of a recent meta-analysis of published intervention effectiveness research [Research Note]. Social Work Research, 21, 58–62.
Gibbs, L., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12, 452–476.
Gorey, K. M., Thyer, B. A., & Pawluck, D. E. (1998). Differential effectiveness of prevalent social work practice models: A meta-analysis. Social Work, 43, 269–278.
Graybeal, C. T. (2007). Evidence for the art of social work. Families in Society, 88, 513–523.
Kirk, S. A., & Reid, W. J. (2002). Science and social work: A critical appraisal. New York: Columbia University Press.
Lilienfeld, S. O. (2007). Psychological treatments that cause harm. Perspectives on Psychological Science, 2, 53–70.
Manuel, J. I., Mullen, E. J., Fang, L., Bellamy, J. L., & Bledsoe, S. E. (2009). Preparing social work practitioners to use evidence-based practice. Research on Social Work Practice, 19, 613–627.
Mullen, E. J. (2004). Facilitating practitioner use of evidence-based practice. In A. R. Roberts & K. R. Yeager (Eds.), Evidence-based practice manual: Research and outcome measures in health and human services (pp. 205–210). New York: Oxford University Press.
Mullen, E. J. (2006). Facilitating practitioner use of evidence-based practice. In A. R. Roberts & K. R. Yeager (Eds.), Foundations of evidence-based social work practice (pp. 152–159). New York: Oxford University Press.
Mullen, E. J., & Bacon, W. (2004). Implementation of practice guidelines and evidence-based treatment. In A. R. Roberts & K. R. Yeager (Eds.), Evidence-based practice manual: Research and outcome measures in health and human services (pp. 210–218). New York: Oxford University Press.
Mullen, E. J., Bledsoe, S., & Bellamy, J. (2008). Implementing evidence-based social work practice. Research on Social Work Practice, 18, 325–338.
Mullen, E. J., & Streiner, D. L. (2004). The evidence for and against evidence-based practice. Brief Treatment and Crisis Intervention, 4, 111–121.
Nelson, T. D., Steele, R. G., & Mize, J. A. (2006). Practitioner attitudes toward evidence-based practice: Themes and challenges. Administration and Policy in Mental Health and Mental Health Services Research, 33, 398–409.
O'Neill, J. V. (2003). Nearly all members linked to the Internet. NASW News, 48, 9.
Parrish, D. E., & Rubin, A. (2011). An effective model for continuing education training in evidence-based practice. Research on Social Work Practice, 21, 77–87.
Parrish, D. E., & Rubin, A. (2011). Validation of the Evidence-Based Practice Process Assessment Scale–Short Version. Research on Social Work Practice, 21, 200–211.
Pignotti, M., & Thyer, B. A. (2009). Use of novel unsupported and empirically supported therapies by licensed clinical social workers: An exploratory study. Social Work Research, 33, 5–17.
Rubin, A., & Babbie, E. (2010). Research methods for social work (7th ed.). Belmont, CA: Brooks/Cole.
Rubin, A., & Parrish, D. (2007). Challenges to the future of evidence-based practice in social work education. Journal of Social Work Education, 43, 405–428.
Rubin, A., & Parrish, D. (2010). Development and validation of the EBP Process Assessment Scale: Preliminary findings. Research on Social Work Practice, 20, 629–640.
Rubin, A., & Parrish, D. (2011). Validation of the EBP Process Assessment Scale. Research on Social Work Practice, 21, 106–118.
Sackett, D. L., Rosenberg, W.M.C., Gray, J.A.M., Haynes, R. B., & Richardson, W. S. (1996). Evidence-based medicine: What it is and what it isn't. BMJ, 312(7023), 71–72.
Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W.M.C., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM (2nd ed.). New York: Churchill Livingstone.
Sanderson, W. C. (2002). Why we need evidence-based psychotherapy practice guidelines. Medscape General Medicine, 4(4), 14.
Sheehan, K. (2001). E-mail survey response rates: A review. Journal of Computer-Mediated Communication, 6.
Shlonsky, A., & Gibbs, L. (2006). Will the real evidence-based practice please stand up? Teaching the process of evidence-based practice to the helping professions. In A. R. Roberts & K. R. Yeager (Eds.), Foundations of evidence-based social work practice (pp. 103–121). New York: Oxford University Press.
Singer, E. (2006). Introduction: Nonresponse bias in household surveys. Public Opinion Quarterly, 70, 637–645.
Special issue: Nonresponse bias in household surveys. (2006). Public Opinion Quarterly, 70, 637–809.
Stoesen, L. (2008). Member survey yields findings. NASW News, 53, 1.
Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics. Boston: Pearson Education.
Thyer, B. A. (2004). What is evidence-based practice? Brief Treatment and Crisis Intervention, 4, 167–176.
Weissman, M. M., Verdeli, H., Gameroff, M. J., Bledsoe, S. E., Betts, K., Mufson, L., et al. (2006). National survey of psychotherapy training in psychiatry, psychology, and social work. Archives of General Psychiatry, 63, 925–934.