
Re-Thinking Evidence-Based Practice

May 25, 2021

A number of years ago, at a conference, I presented a systematic evidence review of high school dropout prevention programs.

The gist of the talk was that there was a great variety of programs available, many worked (though some did not), and—most important—the type of program was not associated with the size of the impacts. That is, a number of different program models had positive impacts on dropout and I could not point to a particular program model that worked better than any other. During the Q&A after the talk, a guy stood up and said something to the effect of “Dropout is a real problem in my school district and I need to do something about it. So, if you had to bet your retirement on one program, which one would you pick?”

This was not an unexpected question. Figuring out which programs work best is, after all, the point of doing an evidence review. At the time, I didn't have a good answer, but the question got me thinking about how we typically approach evidence-based practice and what we might do differently to inform dropout prevention efforts. Like a number of other researchers thinking along these lines, I came to see that identifying effective program models might not be the only use for systematic evidence reviews; we might focus on effective practices instead of (or in addition to) effective programs.

The most common approach to evidence-based practice focuses on identifying distinct model programs that have demonstrated positive impacts. Such model programs usually have a brand name (e.g., Triple P Positive Parenting Program, Multisystemic Therapy), are generally accompanied by a manual, and sometimes offer training or certification by the program developer. These programs typically receive the "evidence-based" designation from an evidence clearinghouse, such as the What Works Clearinghouse or the Prevention Services Clearinghouse, when a rigorous evaluation shows a statistically significant positive impact. Yet despite the considerable investment required to develop and rigorously test model programs, and the additional investment required to review, organize, and post them on the various clearinghouses, the large majority of programs operating in the field are not model programs at all; they are homegrown or locally developed programs.

If most agencies are already delivering programs and are not seeking to adopt a new model program, how can program effectiveness research and systematic evidence reviews support them? The developing area of core components research offers one answer. This alternative approach to evidence-based practice hinges on the fact that most programs aren't monolithic. Programs are often made up of multiple components (e.g., afterschool programs may have a tutoring or homework segment, arts or recreational elements, and a social skills component). The impacts of any program may also be partly a function of other factors, including the setting, participants, providers, dosage, and fidelity. In our meta-analysis work, factors such as format, dosage, or personnel can be as important as the particular program in predicting the size of program impacts (e.g., Wilson, Tanner-Smith, Lipsey, Steinka-Fry, & Morrison, 2011). The idea behind core components approaches is to treat programs not as silos but to dig into their components and the contexts of their delivery to identify the practices that are effective.

Our application of this alternative approach uses meta-analysis to empirically identify the program characteristics that are strongly associated with positive program effects and then translates those core components into specific, actionable guidance for improving practice in the field. The resulting practice guidelines for use by provider agencies are flexible and adaptable to a range of settings, may be more cost-conscious, and permit providers more control over their own processes of improvement than the model programs approach does.
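For readers curious about the mechanics, here is a minimal sketch of the kind of moderator analysis this involves. Each study contributes an effect size, its sampling variance, and coded program characteristics, and an inverse-variance-weighted regression estimates which characteristics predict larger impacts. The data, variable names, and the use of Python's statsmodels are all hypothetical illustrations, not the actual coding scheme or software behind our guidelines; a real analysis would also use a random-effects model and far richer moderator coding.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical per-study inputs: a standardized effect size (e.g., Cohen's d),
    # its sampling variance, and two coded program characteristics (moderators).
    effect_size = np.array([0.30, 0.12, 0.45, 0.20, 0.08, 0.38])
    variance    = np.array([0.02, 0.05, 0.03, 0.04, 0.06, 0.02])
    manualized  = np.array([1, 0, 1, 1, 0, 1])        # 1 = manual/lesson plans used
    sessions_per_week = np.array([2, 1, 3, 2, 1, 2])  # delivery frequency

    # Inverse-variance weights: more precise studies get more influence.
    weights = 1.0 / variance

    # Fixed-effect meta-regression of effect sizes on program characteristics.
    # (A production analysis would use a random-effects model and treat the
    # sampling variances as known rather than estimating a residual scale.)
    X = sm.add_constant(np.column_stack([manualized, sessions_per_week]))
    results = sm.WLS(effect_size, X, weights=weights).fit()

    # Characteristics with consistently positive, precisely estimated
    # coefficients flag candidate core components: features associated
    # with larger program impacts.
    print(results.summary())

A positive, reliable coefficient on a characteristic such as manualization or delivery frequency is the statistical signal behind recommendations like those in the list below.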

In our work with partners at the Office of the Assistant Secretary for Planning and Evaluation (ASPE) within the U.S. Department of Health and Human Services, we used a comprehensive meta-analysis of hundreds of rigorous studies of youth programs to develop recommendations for reducing behavior problems and improving social competence. Here are three examples of how core components were translated into practice recommendations:

  • For programs that seek to reduce behavior problems through a skills-focused approach, the evidence indicates that using a manual or dedicated lesson plans produces greater reductions in participants’ problem behavior than using an intervention that is not manualized.
  • For programs that seek to improve social competence through a family or parenting approach, the evidence indicates that incorporating opportunities for individualized interactions with participants and families produces greater improvements in social competence.
  • For programs that seek to improve social competence through a skills-focused approach, the evidence indicates that delivering the intervention more than once per week produces greater improvements in social competence than delivering it less frequently.

For programs that aren’t already using the recommended practices, our guidelines offer practitioners some feasibility considerations that can help them determine if implementing a particular practice is right for their circumstances. If it is, our guidelines include several action steps to assist in implementation. The guidelines are intended to be flexible and modular so that practitioners can use evidence for program improvement without having to adopt an entirely new evidence-based model.

The next phase of this work would be to examine whether programs that already use the recommended practices achieve better outcomes, and to test whether programs that use the guidelines to change their practices actually see their outcomes improve as a result.

Find the full practice guides and corresponding technical reports here.

 