Laura M Bogart, Chong-Min Fu, Jodi Eyraud, Burton O Cowgill, Jennifer Hawes-Dawson, Kimberly Uyeda, David J Klein, Marc N Elliott, Mark A Schuster, Evaluation of the dissemination of SNaX, a middle school-based obesity prevention intervention, within a large US school district, Translational Behavioral Medicine, Volume 8, Issue 5, October 2018, Pages 724–732, https://doi.org/10.1093/tbm/ibx055
Abstract
Few evidence-based school obesity-prevention programs are disseminated. We used community-based participatory research principles to disseminate an evidence-based middle-school obesity-prevention program, Students for Nutrition and eXercise (SNaX), to a large, primarily Latino, school district. In the 2014–2015 school year, we trained a district “champion” to provide training and technical assistance to schools and supplied print- and web-based materials (www.snaxinschools.org). In one district region, 18 of 26 schools agreed to participate. We evaluated the dissemination process using the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework. All 18 schools implemented at least one SNaX component. Of 6,410 students who attended an informational session, 1,046 registered and 472 were selected to be Student Advocates, of whom 397 attended at least one meeting. Of 60 activities observed across schools, 77% were conducted with fidelity, but local resource constraints limited most activities to a relatively small number of Student Advocates (vs. the entire student body). Qualitative data from 46 school staff and 187 students indicated positive attitudes about the program. Teachers suggested that SNaX be implemented as part of the curriculum. In the 2015–2016 school year, 6 of the original schools continued to implement SNaX, and the champion trained 94 teachers from 57 schools districtwide. Cafeteria servings overall and fruit and vegetable servings, the primary outcomes, did not increase in SNaX schools versus matched-comparison schools. Our mixed-methods evaluation of SNaX showed acceptability and fidelity, but not effectiveness. Effectiveness may be improved by providing technical assistance to community stakeholders on how to tailor core intervention components while maintaining fidelity.
Practice: School-based obesity prevention programs can be used to reach a large proportion of students, but core program components must be implemented with fidelity to the original protocol for effectiveness.
Policy: School districts that wish to implement and sustain school-based obesity prevention interventions need to allocate funds for a program “champion” and to cultivate support from multiple stakeholders at the school and district levels.
Research: Future research on dissemination of evidence-based interventions needs to ensure that adequate training and technical assistance are provided to help tailor core intervention components without compromising their effectiveness.
INTRODUCTION
Adolescent obesity prevention research often focuses on schools as a vehicle for reaching large and diverse populations of youth, and several school-based interventions have shown promise in randomized controlled trials [1,2]. However, few evidence-based school interventions have been disseminated for widespread use, and even fewer studies have examined the dissemination process for school interventions.
The few studies on dissemination of school-based obesity-prevention interventions have found high acceptability, feasibility, fidelity, reach into the school population, and sustainability (i.e., maintenance) over time in the presence of adequate long-term training and technical assistance [3–5]. The success of dissemination may depend on several program implementation factors. Higher levels of implementation (with greater fidelity to the original program) are more effective [6]. Implementation can be improved through training and technical assistance [6], which can increase staff skill levels and commitment. Active training approaches are more effective than passive ones (e.g., providing only written materials without allowing for interaction) [7,8]. The existence of a program “champion” who is respected and has the status to make changes is also important for ensuring successful implementation, as is shared decision-making, in which key community stakeholders are engaged and feel ownership over the program [6].
We disseminated an evidence-based program, Students for Nutrition and eXercise (SNaX), to middle schools in the Los Angeles Unified School District [9–11]. SNaX combines school-wide environmental changes, encouragement to eat healthy food and engage in physical activity, and peer-led education. In a randomized controlled trial, SNaX led to significant increases in the proportions of students served fruit and lunch in the cafeteria and a significant decrease in the proportion of students buying snacks at school, as well as greater obesity prevention knowledge, more positive attitudes about cafeteria food and tap water, increased intentions to drink water, and greater self-reported water consumption. Students in SNaX with obesity at baseline (in seventh grade) showed significant reductions in BMI-percentile in ninth grade compared with control students with obesity at baseline.
SNaX was disseminated to Los Angeles Unified School District middle schools in the 2014–2015 school year by actively engaging stakeholders and training a “champion” within the district to provide training and technical assistance [12]. We used mixed methods to evaluate dissemination and implementation with the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework [13]. RE-AIM has been used to improve the translation of research into practice and test whether interventions and policies are effective and sustainable in real-world settings and thus worthy of investment [14]. RE-AIM has been applied to child health and obesity prevention interventions [4,15–18]. Our approach allowed for the examination of areas in which dissemination efforts were effective or ineffective, leading to insights about potential ways to disseminate and implement evidence-based obesity prevention interventions [19].
METHODS
Community engagement and setting
SNaX was developed through a community-academic partnership [9–11]. The team met regularly with a community advisory board that included academic child obesity researchers and community stakeholders (district administrators and board members, non-profit organizations focused on child obesity, parent representatives, and public health department staff). All dissemination activities took place at the Los Angeles Unified School District. In the 2013–2014 school year, the district was 73% Latino, 10% African American, 9% White, 6% Asian/Filipino/Pacific Islander, and <1% American Indian; 1% were two or more races. The total K-12 enrollment was 651,322.
Intervention
SNaX consisted of school-wide food environmental changes and school-wide social marketing. The environmental changes included offering a variety of sliced/bite-sized fruits and vegetables and freely available chilled filtered water at lunch. Social marketing focused on school cafeteria food, healthy beverage intake, fruits/vegetables, and physical activity/inactivity, and included school-wide announcements and posters. Schools were asked to organize a kick-off event at which all students could watch a half-hour SNaX video designed to promote physical activity and healthy eating. Student Advocates were trained through role-plays to promote SNaX in a non-confrontational and encouraging manner. Student Advocates approached other students at lunchtime to encourage participation in SNaX activities (e.g., taste tests of filtered water and cafeteria food, which had been reformulated to be healthier) and passed out bookmarks with relevant messages (e.g., “Tap water has zero sugar, zero calories, and fluoride to help prevent cavities”). Teachers distributed weekly activities for students to complete with parents (e.g., a physical activity checklist for the family).
Intervention adaptation
Prior to starting dissemination, the team revised the program for greater sustainability in collaboration with the community advisory board and a teacher selected to be a full-time district SNaX Coordinator (the SNaX “champion”). In its efficacy test, SNaX was conducted as a 5-week program, in which a new group of Student Advocates participated in voluntary lunchtime clubs each week, and additional fruit and vegetable options were provided at lunchtime, based on research showing that increasing food variety leads to greater consumption [20]. The team also adapted SNaX for classroom use for the duration of the school year, extending material from each unit to fill multiple classroom sessions. To facilitate use, the team created new video shorts for classroom discussion that continued the storyline of the video shown to all students at the SNaX kick-off event; mobile apps (with games and short exercise videos) to engage students outside of school; a teacher website (teachers.snaxinschools.org) with print-ready materials and instructions about how to use the program; and a student website (www.snaxinschools.org) containing all videos and parent activities. A lunchtime obstacle course was added to promote physical activity. Teachers had the flexibility to tailor SNaX to their school with supplementary activities (e.g., walking clubs).
Several additional changes were made to increase feasibility, due to budgetary and staffing constraints. Cafeteria staff did not provide additional fruits and vegetables, and instead Student Advocates promoted existing fruit and vegetable offerings. In addition, in the efficacy test, a new group of Student Advocates participated in the club sessions each week, and Student Advocates were asked to bring a peer with them to an activity, because we hypothesized that diffusion of messages throughout the student body would lead to behavior change. In contrast, because engagement of a large number of students was not feasible for most schools in the dissemination test, a smaller number of Student Advocates participated per school (e.g., one group of students in a class or club).
Sample
Figure 1 shows the numbers of schools and students who participated in the dissemination evaluation. The evaluation included the 18 schools in the district’s northern region that agreed to participate in SNaX, the remaining 8 schools in the northern region that did not participate (but that were invited), and 47 comparison middle schools that were not offered SNaX (because they were in a different district region).

Procedure and measures
The outcomes and data sources used to evaluate the dissemination process, as well as definitions of each RE-AIM component, are described below and in Table 1.
| RE-AIM element | Data source |
|---|---|
| Reach: % and socio-demographic characteristics of northern region middle school students who received SNaX, of northern students in eligible schools; student reports of awareness of SNaX | Publicly available school district data; SNaX Student Advocate Survey |
| Effectiveness: changes in cafeteria servings in SNaX schools vs. matched comparison schools | School-wide cafeteria servings data |
| Adoption: # and % of selected schools that attended training and conducted SNaX, of schools in district’s northern region | Training log (kept by SNaX Coordinator) |
| Implementation: % of SNaX components implemented with fidelity; % of schools implementing each component; # of SNaX teacher website users; implementation barriers/facilitators | Observational fidelity checklist; teacher surveys; website usage data; interviews/focus groups with school staff and Student Advocates |
| Maintenance: % of northern schools implementing SNaX in the next school year; suggestions for sustainability | Surveys of teachers; interviews/focus groups with school staff and Student Advocates |
Reach: school-wide socio-demographic data
Reach was assessed by comparing publicly available student-body socio-demographic data (percentage of English learners, National School Lunch Program-eligible students, and physically fit students; racial/ethnic distribution; and total enrollment; http://achieve.lausd.net/Page/8037) for SNaX program versus non-program schools in the northern region.
Reach/implementation: student advocate surveys
On surveys administered at the end of 2014–2015, Student Advocates reported socio-demographic characteristics and rated each SNaX component from 1 = Poor to 5 = Excellent, with an option “Was not aware of this/Did not participate.”
Effectiveness: school-wide cafeteria data
The district provided school-level daily cafeteria records (number of students obtaining lunch, number of fruit and vegetable servings) for the pre-program period (2013–2014) and program period (2014–2015) for the program schools and the set of comparison schools. We analyzed daily cafeteria records by matching each program school with a non-program (comparison) school on socio-demographic characteristics, calculating Euclidean distances between all possible pairs to find the closest matches. Each program school was first matched with the comparison school at the smallest Euclidean distance; remaining comparison schools were then assigned to whichever program school was closest to them. The number of comparison schools per program school ranged from 1 to 9, and no comparison school was used twice.
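The two-step matching procedure can be sketched as follows. This is an illustrative sketch only: the school names and profile values are hypothetical, and in practice the socio-demographic features would be standardized before computing distances.

```python
from math import dist

# Hypothetical standardized profiles (illustrative values only):
# [% English learners, % NSLP-eligible, % physically fit, enrollment/1000]
program_schools = {"P1": [0.25, 0.70, 0.49, 1.23],
                   "P2": [0.30, 0.80, 0.40, 1.10]}
comparison_schools = {"C1": [0.26, 0.72, 0.50, 1.20],
                      "C2": [0.10, 0.50, 0.60, 1.60],
                      "C3": [0.31, 0.79, 0.41, 1.05]}

# Step 1: pair each program school with its closest unused comparison school
# (no comparison school is used twice).
matches, available = {}, set(comparison_schools)
for p, pv in program_schools.items():
    best = min(available, key=lambda c: dist(pv, comparison_schools[c]))
    matches[p] = best
    available.discard(best)

# Step 2: assign each remaining comparison school to its nearest program
# school, forming groups of one program school plus 1+ comparison schools.
groups = {p: [c] for p, c in matches.items()}
for c in available:
    nearest = min(program_schools,
                  key=lambda p: dist(program_schools[p], comparison_schools[c]))
    groups[nearest].append(c)
```

The grouping in step 2 is what allows the number of comparison schools per program school to range above one, as reported.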
Multivariate linear regressions were used to conduct difference-in-differences tests for the percentage of fruits, vegetables, and lunches served per day per student in attendance. Predictors included an indicator for the program period (during vs. pre), school type (program vs. comparison), and the program-period-by-school-type interaction (i.e., the average change in program and comparison schools over time, before and after the intervention start-date). Models included indicators for school membership into one of the 18 groups of schools, each with one program school and up to 9 comparison schools.
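The interaction term in those models is the difference-in-differences estimate: the pre-to-during change in program schools minus the same change in comparison schools. A minimal sketch of that arithmetic, using hypothetical school-level means rather than the study's data:

```python
# Hypothetical mean fruit servings per student in attendance (illustrative):
program_pre = [0.40, 0.42, 0.38]   # program schools, 2013-2014
program_dur = [0.45, 0.44, 0.41]   # program schools, 2014-2015
comp_pre    = [0.39, 0.41, 0.40]   # comparison schools, 2013-2014
comp_dur    = [0.42, 0.43, 0.42]   # comparison schools, 2014-2015

def mean(xs):
    return sum(xs) / len(xs)

# Difference-in-differences: change in program schools minus change in
# comparison schools; this equals the program-period-by-school-type
# interaction coefficient in a saturated regression.
did = (mean(program_dur) - mean(program_pre)) - (mean(comp_dur) - mean(comp_pre))
```

The regression formulation additionally includes the school-group indicators described above, so each program school is effectively compared within its matched group.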
Adoption: training log
The district SNaX Coordinator kept a log of all schools that received training in 2014–2015 and 2015–2016.
Implementation/maintenance: teacher surveys
Teachers were sent a web-based, anonymous survey in 2014–2015 (to assess implementation) and 2015–2016 (to assess maintenance) with a checklist of SNaX activities. If teachers did not continue SNaX in 2015–2016, they were asked why. Furthermore, independently of the present project, the district sponsored a training on SNaX in August 2015 for 94 teachers representing 57 schools. In June 2016, a web-based, anonymous survey was sent to one trained teacher per school to assess maintenance, that is, their use of SNaX in 2015–2016.
Implementation: website data
To assess implementation, we collected data on the number of users of the SNaX teacher website from February 2015 (when the website was fully online) to June 2016 (the end of the maintenance period). Because the website was public, users could include individuals not associated with the school district.
Implementation fidelity: SNaX session observational checklists
The district SNaX Coordinator rated fidelity (how closely SNaX activities adhered to the manual) by intermittently visiting each school and observing whether core components of each activity were completed “not at all,” “somewhat,” or “completely.” The Coordinator also recorded the setting of Student Advocate sessions (e.g., in class, during mealtimes).
Implementation barriers and facilitators: qualitative data
Teachers, principals, and Cafeteria Managers were interviewed about the adequacy of the training and technical assistance, ease of implementation, and suggestions for improvement. Student Advocate focus groups were asked about their overall impressions of SNaX and its activities. Interviews and focus groups were recorded and transcribed. Across the 17 schools, 16 teachers, 16 principals, 14 Cafeteria Managers, and 154 students participated.
Qualitative data were managed in Dedoose (v6.1.18). Three team members read all transcripts and independently developed a list of positive and negative perceptions of SNaX (overall and for each component), and ideas for sustainability, resulting in 27 coding categories. After attainment of inter-rater reliability between two coders (using a random sample of 20% of transcripts), one coder systematically applied the codebook to all interviews. Kappas across categories ranged from 0.66 to 1.0 (mean = 0.92).
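Inter-rater reliability of the kind reported above is commonly computed as Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A stdlib sketch with hypothetical binary code applications (not the study's data):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders applying one code to the same items."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal label frequencies.
    ca, cb = Counter(coder_a), Counter(coder_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical: two coders applying a "positive perception" code to
# 10 transcript excerpts (1 = code applied, 0 = not applied).
a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
kappa = cohens_kappa(a, b)  # observed agreement 0.90, chance agreement 0.54
```

A kappa near the study's mean of 0.92 indicates agreement well beyond chance, which is why a single coder could then apply the codebook to the remaining transcripts.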
RESULTS
Reach
Eighteen of the 26 middle schools in the district’s northern region agreed to participate and sent teacher representatives to an in-person half-day training in August 2014 (Fig. 1). One school dropped out after the training due to staffing issues. Thus, 17 middle schools representing 22,311 students implemented SNaX in 2014–2015.
The 18 schools that agreed to participate did not differ from the 8 schools that did not on most socio-demographic characteristics. Schools that did not agree to participate had higher enrollment, M (SD) = 1,688 (451) versus 1,232 (371), t (24) = 2.71, p = .01. Because intervention setting may affect reach and general implementation, we compared the 18 schools that agreed to participate to the 5 intervention schools in the original efficacy test. The racial/ethnic distribution of students in the two sets of schools was generally similar, but the dissemination schools had a significantly higher percentage of White students [14% vs. 4%; t (20.9) = 2.6, p = .02]. The dissemination schools also had a higher percentage of physically fit students [49% vs. 18%; t (21) = 4.6, p = .0001] and a lower percentage of students in the National School Lunch Program (70% vs. 86%; t (21) = 2.1, p = .046).
Across the 17 program schools, 6,410 students saw the half-hour video (16 schools showed it in an assembly, 1 in classrooms); of these students, 1,046 signed up for SNaX and 472 were invited to be Student Advocates (resources were too limited to accommodate all interested students). Of these 472 students, 397 attended the initial SNaX meeting and 242 were still enrolled at the end of the school year. A total of 187 of the 242 Student Advocates across the 17 schools completed surveys at the end of the school year in which they participated in SNaX. Overall, 65% of Student Advocates were female; 67% were Latino, 18% White, 15% Asian/Pacific Islander, 8% Black, 2% American Indian/Alaskan Native, and 9% “other” (students could select multiple races/ethnicities).
In Student Advocate surveys (n = 187), 65% of students reported being aware of all SNaX components, with high levels of awareness for the videos (99%), lessons (98%), posters (98%), lunchtime activities (96%), hydration station (90%), and kick-off video/assembly (88%). The components with the least awareness were the app (59%) and website (63%). Two-thirds said that they participated in at least three of the five SNaX units.
Effectiveness
None of the multivariate linear regressions indicated significant results for program versus comparison schools on the cafeteria outcomes [i.e., fruits, vegetables, and lunches served per day per student in attendance: b (se) = 2.04 (1.55), b (se) = 2.65 (3.73), and b (se) = 3.90 (2.05), respectively; all p-values >.05].
Adoption
As noted above, 18 schools participated in the 2014 SNaX training, and 17 adopted SNaX. The school that did not adopt SNaX did not significantly differ from the 17 schools that adopted SNaX on socio-demographic characteristics.
Implementation
Observations
Of 60 activities observed across the schools, 46 (77%) were rated as being completely or somewhat conducted with fidelity. The average number of observations per school was 3.33 (range = 1–6). Sixteen schools conducted sessions during non-class times (11 during lunch, 2 during breakfast or the morning nutrition break, 1 during homeroom, 2 during nutrition break/breakfast and homeroom). One school conducted the sessions during a culinary arts class, and one of the schools that conducted the sessions during lunch also did some activities in English class.
Teacher survey
The average number of activities implemented per school was 11.5 (SD = 7.2, range = 3–30). Teachers reported implementing an average of 7.5 of 8 activities (84%) in Unit 1, 6.8 (85%) in Unit 2, 6.5 (81%) in Unit 3, 6.1 (69%) in Unit 4, and 5.4 (62%) in Unit 5. Nearly all schools implemented most activities (Table 2). Two schools did not conduct Unit 4, and three did not conduct Unit 5, because the school year ended. The videos were the most well-implemented, and lunchtime taste tests were the least, possibly because this activity required coordination with the cafeteria.
Values are # (%) of schools completing each activity.

| Activity | Unit 1: water | Unit 2: cafeteria food | Unit 3: fruits and vegetables | Unit 4: physical activity | Unit 5: physical inactivity |
|---|---|---|---|---|---|
| SNaX short videos | 17 (100) | 17 (100) | 17 (100) | 15 (88) | 14 (82) |
| Parent activity distribution | 17 (100) | 15 (88) | 16 (94) | 14 (82) | 12 (71) |
| Hydration station promotion | 16 (94) | N/A | N/A | N/A | N/A |
| Physical activity | N/A | N/A | N/A | 12 (71) | N/A |
| Game show | 16 (94) | 15 (88) | 15 (88) | 12 (71) | 12 (71) |
| Role-plays | 16 (94) | 16 (94) | 16 (94) | 13 (77) | 13 (77) |
| Lunchtime bookmarks | 14 (82) | 14 (82) | 14 (82) | 12 (71) | 10 (59) |
| Lunchtime taste tests | N/A | 10 (59) | 7 (41) | N/A | 10 (59) |
| Parent activity debriefing | 15 (88) | 13 (77) | 14 (82) | 12 (71) | 10 (59) |
| School-wide announcements | 16 (94) | 15 (88) | 11 (65) | 13 (77) | 11 (65) |
Website data
From February 2015 to June 2016, the average number of SNaX website users per month was 520 (SD = 580, range = 22–2,210). The number of users increased steadily in the semester when SNaX was implemented, starting at 205 in February 2015 and peaking at 2,210 in June 2015. Fewer users accessed the website during the maintenance period, decreasing from 845 in August 2015 to 148 in June 2016.
Student advocate survey
All 187 Student Advocates surveyed were aware of SNaX, and 91% rated SNaX “in general” as excellent or very good. Students also gave ratings of excellent or very good to the SNaX posters (73%), videos (72%), assembly (75%), club sessions (86%), lunchtime activities (80%), hydration station (89%), website (59%), and apps (52%). Most (73%) completed at least one parent activity; 80% thought that SNaX should be continued in their school; and 85% recommended that SNaX be provided at other schools.
Qualitative data
Stakeholders had an overall positive view of SNaX, with many describing it as fun, interesting, and engaging, as well as educational. As one student said, “It was a good program because we learned how to eat healthy and what not to eat…it made us aware of what was really bad for us.” Teachers said that the manual was easy to follow, the activities were easy to implement and, in most schools, students showed very high interest in participating. For example, one teacher said: “I felt pretty confident about implementing the lessons…I found myself opening the curriculum fifteen minutes before the club started, and it was really easy to go through.” Many school staff felt that, because the district supported the program and the program was consistent with the district health and wellness policy (see http://achieve.lausd.net/healthandwellness), the program was more easily adopted: “Cafeteria was helpful and willing. Front office was helpful and willing. Any teacher that we wanted to join in with their activity was open and willing…It was like an open door for me. It was just a matter for me figuring out what goes when and where…but there was no obstacles for me at all.”
Staff and students provided insights about why some components were implemented more than others. When asked to specify which components contributed to their positive assessment, both students and school staff mentioned the videos and the hydration station. As one school staff member said, “The kids loved [the video], and the message was clear. You have a lot of English learners in the audience…the visuals helped to communicate the message.” Another commented on the hydration station: “We had a good turnout…both the upper grade and lower grades… . Normally the upper grade is like, ‘Nah, we too grown for this stuff…’. But they came and they drank the water.”
Although not all students were aware of the videos and apps, those who had seen the video or used the app generally had positive feedback, describing the videos as “funny,” “interesting,” and “amusing” while being “educational,” and the app as “fun.” Teachers said that students liked the taste tests. As one teacher observed, “I think the tasting is what really does seem to stand out…[The Cafeteria Manager] put [the samples] in these little packages and she sealed them and the kids ate it...just because it was presented differently, they ate the food.” Students also liked the giveaways (“I liked the wristband because I could take it everywhere and keep it on”). Staff noted that students brought the messages home to their families, working with parents on the take-home activities and advocating for change within their homes: “The kids loved doing it at home…I’ve had a couple of kids tell me that because they’ve gone over it with their mom when they go shopping, they know what not to get anymore…I went to the gym with my mom, or I went running, or I don’t eat that with my mom anymore, we don’t eat that.”
Some components were viewed less favorably. Student Advocates sometimes had difficulty engaging fellow students when distributing bookmarks: “I wasn’t that comfortable passing them out…some people would rip it up and put it on the floor.” Teachers also felt that some students were uncomfortable with the bookmark activity (“when they knew that they had to start talking to other kids…they just shied away”).
Teachers and Cafeteria Managers also noted challenges. Some students did not have water bottles to use when cups were not available at the hydration station (“water bottles were the limiting factor”), and the water jugs broke, were too small for a whole school, and were difficult to keep clean over time. The hydration station required a dedicated staff member or parent volunteer to supervise (“The kids want to play with the water and the water containers”). Furthermore, some teachers did not use the website due to functionality issues or lack of access to computers in the classroom.
Other implementation barriers were discussed by some teachers, including the perception that SNaX was logistically challenging, complex, and labor-intensive. They also found the whole-school kick-off assembly hard to organize and said that it was difficult for SNaX to compete with other activities and clubs offered at the same time. Although most staff felt the school supported the program, some principals and Cafeteria Managers were more involved than others. In these schools, principals and Cafeteria Managers said that they could have helped more had there been better communication with the teachers: “I think the greatest barrier was the fact that the teacher went to the training, so the teacher had the big vision, whereas the administrator got bits and pieces, and didn’t really have clarity on what the expectation was for support of the program, and the Cafeteria Manager didn’t have that, either. So I think in the future, although the principal and the Cafeteria Manager may not need the 8 hours training… even if they were to be brought in…for the first hour or so, just to get the big picture—the teacher needs to have a cheerleader, and when you don’t have an administrator on the site who really understands the program, it’s just another program... .”
Maintenance
Of the teachers in the 17 schools that implemented SNaX in 2014–2015, 14 (82%) completed the web-based survey that assessed maintenance in 2015–2016. Six (35%) of the original 17 schools had continued SNaX. All six hung SNaX posters and/or banners, five held club meetings, five distributed giveaways (bookmarks, buttons), five set up the hydration station, and four showed videos. Two schools distributed the parent activities, two made school-wide announcements, and one provided the lunchtime physical activity promotion. Four distributed water samples, and none organized taste tests. Teachers in four schools used the SNaX website. Of the eight other schools that responded to the survey, three did not continue SNaX due to time constraints and three due to staffing limitations; one school reported that it was not asked to continue SNaX, and the other school did not answer the question.
Of the 94 teachers trained by the district to implement SNaX in 2015–2016, 26 (from 46% of the schools) responded to the web-based survey, indicating that 14 of these schools conducted SNaX activities in the 2015–2016 school year. Six teachers said that they showed the videos, six said that they hung posters and/or banners, five engaged in lunchtime activities (with water samples, taste tests, or physical activity promotions), and four distributed parent activities. Only three distributed giveaways (bookmarks, buttons) and only two set up the hydration station. Eight said that they used the SNaX website.
Twelve teachers noted reasons why their school did not implement SNaX. The most common reason was insufficient time to cover SNaX in addition to curriculum requirements, especially if they had already covered nutrition in their class (n = 6). Two said that they had planned to conduct SNaX but had not yet been able to do so. Two did not feel that the training was sufficient. One teacher thought that the reading level was too high for students, and another was no longer teaching a related subject.
Insights regarding how to improve the program for greater sustainability were drawn from the 2014–2015 qualitative interviews. Many staff thought SNaX could be scaled up to be a whole-school activity, in which more teachers and students could be involved as part of a school wellness initiative. A principal said, “If they want it to have an impact, it just can’t be viewed as a club. It needs to be something implemented throughout the whole school.” Teachers who conducted SNaX sessions during lunchtime felt that a longer period of time was necessary, suggesting that SNaX be conducted during homeroom, as an elective, or as an after-school club (e.g., “build this program into the curriculum, and then have the fun activities connected with the club…”). Staff also suggested creating a learning network of teachers in different schools who could share their experiences and successes. Teachers suggested developing training videos and sponsoring online chat rooms or in-person sharing sessions for teachers throughout the semester (e.g., “a program is going to blossom if there’s a connection”).
DISCUSSION
In this study of the dissemination of SNaX, we used RE-AIM to examine real-world challenges in translating evidence-based interventions into practice in schools. To facilitate implementation, we used teacher training and technical assistance by a full-time paid district champion [6], as well as the provision of all program materials. SNaX was acceptable to multiple types of stakeholders and feasible to implement. Nearly all of the schools implemented most of the core program activities. Nevertheless, relatively few schools sustained SNaX, possibly because few institutional supports were provided to continue the program. In addition, SNaX did not show significant overall effects on school-wide cafeteria servings, as had been shown previously [11].
One reason for the observed lack of effectiveness may be the difference in reach between the efficacy test and the dissemination test. In the efficacy test, a greater number of seventh graders in each school participated as Student Advocates. In the dissemination test, a smaller number of Student Advocates participated per school, possibly because, consistent with community-based participatory research principles, we encouraged schools to tailor the program to their circumstances and resources. In addition, the efficacy test took place in schools with a higher percentage of disadvantaged students (in the National School Lunch Program) and a lower percentage of physically fit students. Students who receive free and reduced-price lunch may have been more likely to eat lunch in the cafeteria and thus to be exposed to program activities (such as the lunchtime taste tests), potentially contributing to greater program effectiveness. Furthermore, the efficacy test showed long-term effects on BMI only for students with obesity, perhaps because SNaX messages were more likely to resonate with this group; thus, the lower proportion of students with obesity in the dissemination schools may have reduced intervention effectiveness. An additional reason for the lack of effects may be that schools did not provide extra fruit and vegetable servings, as they had in the efficacy test.
Limited district and school resources may have contributed to lower fidelity because, as noted above, key intervention elements thought to drive behavior change were revised due to feasibility concerns. To guard against drift between an original protocol and a disseminated program, developers could design the original trial to identify which core intervention components must be delivered with fidelity during dissemination and which can be tailored and adapted to individual settings, especially in the face of fewer resources. The need for fidelity to key components of the protocol could then be emphasized during dissemination training with stakeholders, and technical assistance could support adaptation to local contexts and problem-solving around how to maintain fidelity. In this way, stakeholders who wish to implement the program would be fully informed and trained on the essential program elements, and could weigh the cost of implementing with fidelity against the greater likelihood of community benefit.
In the case of SNaX, one way to ensure that the program is delivered to a greater number of students is to integrate it into the curriculum, as suggested by principals and teachers in the present study. Curriculum-based obesity prevention programs have shown effectiveness in dissemination—and also are better positioned for long-term maintenance [3–5]. Integration of a program into the curriculum is likely to be more successful if it aligns with existing curriculum requirements [3–5]—and SNaX is consistent with district wellness policy that requires, for example, nutrition education.
A key SNaX component, providing chilled filtered water to students at mealtimes, is being sustained through CA SB1413, which requires districts to provide free drinking water during mealtimes in cafeteria areas, and CA SB496, which provides funding for school water stations. SNaX’s formative research about students’ lack of access to clean, working fountains prompted members of the community advisory board to help draft CA SB1413, which became a model for the federal Healthy, Hunger-Free Kids Act.
Several limitations should be noted. Our dissemination evaluation was limited to one region of one large urban school district. We could not conduct a valid test of effectiveness without a randomized control group; we also did not have the resources to conduct a full nutrition and physical activity assessment. For example, the hydration station was well-implemented in the dissemination study, but we did not assess water consumption—a behavior on which we found effects in the efficacy test.
In summary, the present evaluation showed that SNaX is acceptable, but only a handful of schools had the capacity to maintain the program, and the disseminated program did not show effects on eating behaviors in the school as a whole, possibly because only a relatively small number of students were actively involved. Adequate training and technical assistance are likely key in helping schools to appreciate the importance of maintaining fidelity to core intervention components linked to behavior change and, when necessary, to tailor such components without compromising their effectiveness.
Compliance with Ethical Standards
Conflict of Interest: None of the authors has any potential conflicts of interest to report.
Primary Data: The findings reported have not been previously published and the manuscript is not being simultaneously submitted elsewhere. The data have not been previously reported. The authors have full control of all primary data and agree to allow the journal to review their data if requested.
Funding: This work was funded by the National Institute on Minority Health and Health Disparities (R24 MD001648).
Ethical Approval: All study procedures were performed in accordance with the ethical standards of the RAND Human Subjects Protections Committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed Consent: Informed consent was obtained from all individual participants included in the study. No animals were used in this study.