Evaluation and Monitoring Methods

The Evaluation and Monitoring Methods Center reinforces and improves the quality of evaluation work at Abt Associates by offering short courses, one-on-one consulting, and support groups on evaluation methods. 

These efforts focus on three levels of expertise:
  • Basic competency – short courses on current but standard methods for staff at a variety of levels. This is the knowledge our staff need for the day-to-day work of evaluation; 
  • State of the art – reading groups, external speakers, design review seminars, one-on-one consulting, and other forums that bring staff up to the most current, most appropriate, and most innovative work in a field; and 
  • New methods development – which focuses in part on developing a solid understanding of state-of-the-art methods, because that is the foundation on which new methods usually are built. The Evaluation Methods Center also supports pre-submission reviews of articles and other innovation support. 
Social Policy Impact Pathfinder

The Social Policy Impact Pathfinder (SPI-Path) is a suite of tools that lets clients learn more from the experimental impact evaluations in which they invest, addressing not just overall effectiveness but the key question of “what works?”

Center Director

Jacob A. Klerman is one of Abt’s eight Senior Fellows and a Principal Associate in Abt’s Social & Economic Policy division. He has more than 25 years of experience in social policy research, including social welfare policy, labor markets, nutrition policy, military manpower, data quality issues, and applied econometrics. He is an expert in both experimental (i.e., random assignment) and quasi-experimental (difference-in-differences, regression discontinuity, instrumental variables) evaluation of social programs. 
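To give a sense of one quasi-experimental method named above, here is a minimal difference-in-differences sketch. The data and function name are purely illustrative (not drawn from any Abt study): the method estimates a program's impact as the treated group's change over time minus a comparison group's change over the same period.

```python
def diff_in_diff(treat_pre, treat_post, comp_pre, comp_post):
    """Difference-in-differences impact estimate from four outcome samples.

    Impact = (treated post - treated pre) - (comparison post - comparison pre).
    The comparison group's trend nets out change that would have happened anyway.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(comp_post) - mean(comp_pre))

# Hypothetical outcome data: the comparison group drifts up by 2 units over time,
# while the treated group rises by 7 — so the estimated program impact is 5.
impact = diff_in_diff(
    treat_pre=[10, 12, 11], treat_post=[17, 19, 18],
    comp_pre=[10, 11, 12],  comp_post=[12, 13, 14],
)
print(impact)  # → 5.0
```

In practice this comparison is run as a regression with covariates and clustered standard errors, but the arithmetic above is the core of the identification strategy.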

Klerman has published widely in refereed journals, including the American Economic Review, Journal of the American Medical Association, Health Affairs, Journal of Human Resources, Journal of Labor Economics, Journal of Policy Analysis and Management, Demography, Health Services Research, and Evaluation Review (where he serves as an Associate Editor).

Klerman received a B.S. in Economics-Applied Mathematics from Brown University and his M.A. in Economics from the University of Chicago. 


Abt uses evaluation research methods in many aspects of our work. Below are some Abt methodological papers that the Evaluation Methods Center shares as strong examples of our colleagues’ methods work.

Bell, Stephen H., & Laura R. Peck (2012). “Obstacles to and Limitations of Social Experiments: 15 False Alarms.” Bethesda, MD: Abt Thought Leadership Paper on Improving Policy Evidence Through Research Methods Innovation, #2012-1.
Epstein, Diana, & Jacob A. Klerman (2012). “When Is a Program Ready for Rigorous Impact Evaluation?” Evaluation Review, 36(5), 373-399.
Judkins, D., Morganstein, D., Zador, P., Piesse, A., Barrett, B., & Mukhopadhyay, P. (2007). “Variable Selection and Raking in Propensity Scoring.” Statistics in Medicine, 26, 1022-1033.
Klerman, Jacob A. (2010). “Contracting for Independent Evaluation: Approaches to an Inherent Tension.” Evaluation Review, 34(4), 299-333.
Olsen, Robert B., Larry L. Orr, Stephen H. Bell, & Elizabeth A. Stuart (Winter 2013). “External Validity in Policy Evaluations that Choose Sites Purposively.” Journal of Policy Analysis and Management.
Olsen, R.B., Unlu, F., Jaciw, A.P., & Price, C. (2011). Estimating the Impacts of Educational Interventions Using State Tests or Study-Administered Tests (NCEE 2012-4016). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Peck, Laura R. (2005). “Using Cluster Analysis in Program Evaluation.” Evaluation Review, 29(2), 178-196. DOI: 10.1177/0193841X04266335.
Peck, Laura R. (forthcoming). “On Analysis of Symmetrically-Predicted Endogenous Subgroups: Part One of a Method Note in Three Parts.” American Journal of Evaluation, 34(2).
Puma, M.J., Olsen, R.B., Bell, S.H., & Price, C. (2009). What to Do When Data Are Missing in Group Randomized Controlled Trials (NCEE 2009-0049). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.