Online Tool for Systematic Evidence Review
Many social service environments, including child welfare, have seen a clear movement toward evidence-based practice as a way to improve the outcomes for children, youth, and families. Evidence clearinghouses help determine the quality and credibility of the evidence about particular programs, performing a valuable service for policymakers, program administrators, and researchers. But reviewing the evidence is a challenging task.
One challenge of carrying out a systematic evidence review is the large volume of studies on programs and services. Another is the review process itself: trained reviewers must individually rate studies on multiple dimensions of study quality and record details about the outcomes and findings of studies that meet minimum quality standards, and trained reconcilers then merge the individual reviews into a master review. The work of the Title IV-E Prevention Services Clearinghouse is a case in point. The Clearinghouse reviews programs and services intended to support children and families and prevent foster care placements. To arrive at program and service ratings, it considers all studies of a program that meet design and execution standards and uses a transparent process to assign one of four ratings: well-supported, supported, promising, and does not currently meet criteria.
In the past, most evidence clearinghouses performed this process using separate spreadsheets or forms for each study. Reconciling the forms and aggregating the findings involved additional data entry and recordkeeping to keep track of the forms.
Abt Associates developed an innovative approach to automate parts of the review process: the Online Review Tool (ORT). The ORT cuts review time while increasing the quality and consistency of the review process. The tool supports the entire systematic evidence review process and feeds directly into a public web interface that disseminates the findings from the reviews. A custom Drupal application, the ORT can integrate with any existing Drupal system as well as other systems.
The automated process is straightforward. Abt staff import studies and document references into an eligibility screening module, which facilitates double screening and priority scoring of each study. Eligible studies move into the study review module for double-blind review, where reviewers assess study quality, capture comparison groups and target outcomes (e.g., child safety, child and adult well-being), and record study findings. Most data entry forms use either custom Twig templates or Vue.js components with validation checks and skip logic to ensure high-quality data.
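The article does not publish the ORT's form logic, so as a rough illustration of how validation checks combined with skip logic can work, here is a minimal sketch. The field names and rules are hypothetical, and the actual Twig/Vue.js implementation will differ:

```python
# Minimal sketch of form validation with skip logic.
# Field names and rules are hypothetical; the ORT's actual
# Twig/Vue.js implementation is not shown in the article.

FIELDS = {
    "design": {
        "validate": lambda v: v in {"RCT", "QED"},
        "error": "design must be RCT or QED",
    },
    "randomization_method": {
        # Skip logic: only relevant when the study is an RCT.
        "skip_if": lambda resp: resp.get("design") != "RCT",
        "validate": lambda v: isinstance(v, str) and len(v) > 0,
        "error": "randomization method is required for RCTs",
    },
    "sample_size": {
        "validate": lambda v: isinstance(v, int) and v > 0,
        "error": "sample size must be a positive integer",
    },
}

def validate_form(response: dict) -> list:
    """Return a list of validation errors, honoring skip logic."""
    errors = []
    for name, rules in FIELDS.items():
        skip = rules.get("skip_if")
        if skip and skip(response):
            continue  # field hidden by skip logic; do not validate it
        if not rules["validate"](response.get(name)):
            errors.append(f"{name}: {rules['error']}")
    return errors

# A QED study skips the randomization question entirely:
print(validate_form({"design": "QED", "sample_size": 120}))  # []
```

The key design point is that skip conditions run before validators, so a question hidden from the reviewer can never trigger a spurious error.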
The outcomes and study findings are recorded in a spreadsheet-like grid for rapid data entry across multiple records for each study review. To improve efficiency, the grid groups columns so that rarely used sections of data can be hidden. Abt staff can make multiple copies of any record in the grid with the click of a button. Study findings are recorded using data from the study documents and are then subjected to extensive calculations that generate effect sizes, p-values, and numerous other computed values, including a suggested overall rating for each finding.
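The article does not specify which formulas the ORT applies. One common choice for continuous outcomes is the standardized mean difference (Cohen's d) with a normal approximation for the p-value; the sketch below shows that calculation only as an illustration of the kind of computed values described above:

```python
# Sketch of one common effect-size computation (standardized mean
# difference) with an approximate two-sided p-value. The article does
# not specify which formulas the ORT uses; this is illustrative only.
import math
from statistics import NormalDist

def cohens_d(m_t, sd_t, n_t, m_c, sd_c, n_c):
    """Standardized mean difference using the pooled standard deviation."""
    pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                       / (n_t + n_c - 2))
    return (m_t - m_c) / pooled

def p_value(d, n_t, n_c):
    """Two-sided p-value from a normal approximation to the d statistic."""
    se = math.sqrt((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
    z = d / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Treatment mean 10 vs. comparison mean 8, common SD 4, 50 per group:
d = cohens_d(m_t=10.0, sd_t=4.0, n_t=50, m_c=8.0, sd_c=4.0, n_c=50)
print(round(d, 3))                 # 0.5
print(p_value(d, 50, 50) < 0.05)   # True
```

In practice a clearinghouse would also apply small-sample corrections (e.g., Hedges' g) and handle binary outcomes differently; those refinements are omitted here.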
Once the study reviews for a program or service are approved, Abt staff enter information about the program's characteristics and implementation details. As an additional quality check, the program edit page contains an embedded Drupal view that displays all findings rated high or moderate, along with outcome domain information. Abt staff use this view to establish the program rating and to check findings for consistency and accuracy. A “Save & Preview” mechanism gathers all findings and details about the studies and saves the aggregated data back to the program record.
The tool then generates a series of outcome domain-level and individual-study findings records, along with a preview version of the program page. The preview enables Abt staff and our federal clients to view the program before it is published on the public site. Upon publication, the system captures a snapshot of the program and its associated findings.
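To make the aggregation step concrete, here is a small sketch of how individual-study findings might be rolled up into outcome domain-level records. The grouping keys and summary fields are assumptions, not the ORT's actual schema:

```python
# Sketch of aggregating study-level findings into outcome
# domain-level records, as the "Save & Preview" step might do.
# The grouping keys and summary fields are assumptions.
from collections import defaultdict

def aggregate_by_domain(findings):
    """Group individual-study findings by outcome domain and summarize."""
    by_domain = defaultdict(list)
    for f in findings:
        by_domain[f["domain"]].append(f)
    return {
        domain: {
            "n_findings": len(items),
            "n_favorable": sum(1 for f in items if f["effect_size"] > 0),
            "studies": sorted({f["study_id"] for f in items}),
        }
        for domain, items in by_domain.items()
    }

findings = [
    {"study_id": "S1", "domain": "child safety", "effect_size": 0.32},
    {"study_id": "S2", "domain": "child safety", "effect_size": -0.05},
    {"study_id": "S2", "domain": "adult well-being", "effect_size": 0.21},
]
summary = aggregate_by_domain(findings)
print(summary["child safety"]["n_findings"])  # 2
```

Saving such domain-level summaries back to the program record is what lets the public program page render aggregated results without recomputing them from every study on each request.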
Abt staff used a stakeholder-driven, human-centered design process to gather input from practitioners and inform the requirements for the website, which includes a faceted search interface. We then refined the requirements for the public front-end interface, which displays the program data. The public interface uses Drupal's content management modules and a clean visual design. We implemented Google Analytics dashboards to make results easy to view, applied search-engine optimization techniques to increase site visits, and established a MailChimp email notification system to deliver important site updates. To protect data, the system has a FISMA Low Authorization to Operate in an Amazon Web Services production environment consisting of auto-scaled web servers with automated recovery using CodeDeploy.
Planned enhancements may include:
dashboards for internal use that will enable us to automate reporting to clients;
a query tool for internal use to extract data on a variety of program and study characteristics to enable alternative analyses; and
data visualizations on the public website to compare domain effect sizes and other data across programs.