
Conference 5/2-3: 2018 UKES Annual Evaluation Conference

Wednesday, May 2, 2018 – Thursday, May 3, 2018

Abt Associates evaluation experts will present at the 2018 UK Evaluation Society (UKES) Annual Evaluation Conference. This year's theme is "the quality of evidence from evaluation: demand, supply and use" and focuses on quality of evaluation throughout the evaluation cycle.

Abt staff will present the following:

Parallel Session 2.3: Quality of Evidence

EVIRATER – A System for Rating Research Evidence
Wednesday, May 2, 2:45 p.m.

Cristofer Price, Abt Associates
This session will focus on the development and application of EVIRATER, a promising tool for evaluating the rigor of research evidence. Participants will be introduced to the purpose, characteristics and function of the EVIRATER tool. After a brief orientation to the EVIRATER review protocol, they will apply the tool to several example research designs to gain an understanding of how the tool works and the kinds of ratings it produces. Feedback from session participants will contribute to future versions of the tool.

Parallel Session 4.1: Improving the Demand for High Quality Evaluations and Building Trust in Evaluation Evidence

Challenges and Facilitators in Gaining Buy-In for High Quality Evaluations: General Lessons Drawn from the Example of AgResults

Thursday, May 3, 9:30 a.m.
Dr. Tulika Narayan, Abt Associates; Andrew Shaw, UK DFID
The discussion panel will explore the challenges that programs face in obtaining high quality evaluations, and when high quality evaluations are particularly needed. Drawing from the experience of DFID generally, and AgResults specifically, the presenters will share the key elements that have helped generate strong demand for quality evaluations and trust in evaluation results. Through a discussion, the panelists will draw out details from evaluations of AgResults pilots and the key lessons learned from the evaluation that led to influential changes in the program itself. In doing so, they will also discuss the challenges of conducting an evaluation when its early lessons result in program changes, and whether any conflicts arise from that.

Parallel Session 4.5: Approaches to Conducting High Quality Evaluations and Generating Useful Evidence

Lessons for Experimental Evaluation in Practice
Thursday, May 3, 9:30 a.m.
Dr. Laura Peck, Abt Associates
This presentation will address recent developments in experimental evaluation research with respect to getting inside the so-called black box. The "black box" refers to program implementation and operations, which are often thought to be ignored in impact evaluations; but they need not be. From a design perspective, "black box opening" methods include variants of the standard treatment-control evaluation design, such as multi-armed, multi-stage and multi-dimensional (factorial) designs. These design options offer flexible solutions to practitioners interested in learning more about various aspects of their programs' operations. Both established and new ideas will be brought together to give fresh attention to how experiments can be designed to be more informative and to meet future program and policy needs.

When: May 2-3, 2018

Where: London, UK

For more information, visit the conference website.