Achieving Successful Implementation of a Blueprints Program

A Letter from Our Director

There is an age-old saying, “It is not what you do, but how well you do it that counts.” This statement is a gentle reminder that we tend to emphasize the outcome of our efforts over the process. Much attention has focused on identifying effective research-based programs; far less has been paid to the factors needed to implement such programs successfully. In other words, we now know what to implement, but we know very little about how.

The importance of the implementation process cannot be overstated. In fact, as the adage suggests, the process of implementation shapes the product. While programs are often thought of as a uniform set of elements delivered to clients in a consistent manner, in practice there can be great variability in how programs are delivered. For example, adopting sites are likely to vary in their level of support from key staff; their organizational capacity to plan for and support the program; and the availability of capable, trained staff to conduct the program. Deficits in any of these areas may undermine program effectiveness. Improving the health and well-being of youth therefore requires us to pay close attention to how programs are implemented.

Although schools have become a primary locus of prevention efforts, the National Study of Delinquency Prevention in Schools concluded that the quality of school prevention activities is generally poor and that these activities are not implemented with sufficient strength and fidelity to produce a measurable difference in the desired outcomes (Gottfredson et al., 2000).

These findings underscore the need for greater attention to the quality of implementation. Below, we highlight strategies used to help teachers implement with high fidelity (e.g., yearly training, on-site visits, classroom observations) and lessons learned from a large-scale Blueprints dissemination project of the LifeSkills Training program (featured in this newsletter).

The Center for the Study and Prevention of Violence (CSPV) was awarded corporate funding to disseminate and monitor the fidelity of a Blueprints-certified Model Plus program, LifeSkills Training (LST). The 2016-2017 academic year marked the first year of a three-year funding cycle, during which LST was delivered to 129,051 middle school students in 148 school districts across 14 states. Throughout LST implementation, CSPV provides yearly teacher training, training of trainers, and sustainability training. Process evaluation data are collected to monitor and reinforce program fidelity through annual site visits, classroom observations, and site personnel surveys. Site visits entail semi-structured interviews with LST instructors and school administrators as well as observations of LST lessons; their goal is to identify and rectify challenges during implementation and prevent these issues from recurring. Additionally, throughout program delivery, on-site classroom observers conduct intermittent, unscheduled observations using a standardized worksheet that assesses the thoroughness and quality of each LST lesson. To ensure the reliability of these observations, observers are trained by CSPV prior to program implementation and complete a co-observation with CSPV during the site visit. Lastly, after the curriculum is delivered, teachers provide feedback through an online questionnaire on implementation fidelity, dosage, and student response.

Using these methods, CSPV obtained favorable fidelity results in 2016-2017. Ninety-three percent of eligible students within the grant-funded school districts received LST. The majority of instructors reported implementing the entire core curriculum (84%) and delivering LST at least once per week over consecutive weeks (88%), in accordance with fidelity guidelines. Observation data showed that instructors covered an average of 87% of key lesson points and objectives in the portion of the lesson taught in a given session. Classroom observers also indicated that instructors amply incorporated the interactive teaching techniques recommended by the program developer, including discussion (40% of the time, on average), behavioral rehearsal/practice (21%), and coaching/demonstration (13%).

These findings demonstrate that practical strategies exist to address concerns about the fidelity of prevention program delivery. Monitoring of implementation by an outside agency helps maintain program momentum and implementation quality, which ultimately contribute to successful outcomes.

Gottfredson, G.D., Gottfredson, D.C., & Czeh, E.R. (2000). Summary: National study of delinquency prevention in schools. Ellicott City, MD: Gottfredson Associates, Inc.
