Methods Centers

Evaluation and Monitoring Methods

The Evaluation and Monitoring Methods Center reinforces and improves the quality of evaluation work at Abt Associates by offering short courses, one-on-one consulting, and support groups on evaluation methods. 

These efforts focus on three levels of expertise:
 
  • Basic competency – including short courses on current but standard methods for staff at a variety of levels. This is the knowledge our staff need for the day-to-day work of evaluation;
  • State of the art – in which reading groups, external speakers, design review seminars, one-on-one consulting, and other forums bring staff up to the most current, most appropriate, and most innovative work in a field; and
  • New methods development – which focuses in part on developing a solid understanding of state-of-the-art methods, because that is the foundation on which new methods usually are built. The Evaluation and Monitoring Methods Center also offers pre-submission reviews of articles and other innovation support.
Social Policy Impact Pathfinder

The Social Policy Impact Pathfinder (SPI-Path) is a suite of tools that offers clients the opportunity to learn more from the experimental impact evaluations in which they invest, addressing not just overall effectiveness but the key question of “what works?”

Center Directors

Jacob A. Klerman is one of Abt’s eight Senior Fellows and a Principal Associate in Abt’s Social & Economic Policy division. He has more than 25 years of experience in social policy research, including social welfare policy, labor markets, nutrition policy, military manpower, data quality issues, and applied econometrics. He is an expert in both experimental (i.e., random assignment) and quasi-experimental (difference-in-differences, regression discontinuity, instrumental variables) evaluation of social programs.

Klerman has published widely in refereed journals, including the American Economic Review, Journal of the American Medical Association, Health Affairs, Journal of Human Resources, Journal of Labor Economics, Journal of Policy Analysis and Management, Demography, Health Services Research, and Evaluation Review (where he serves as an Associate Editor).

Klerman received a B.S. in Economics-Applied Mathematics from Brown University and an M.A. in Economics from the University of Chicago.

Minki Chatterji, a Principal Associate for International Health, has extensive experience in evaluation and business development. Since 2011, Chatterji has been the Research Director for the USAID-funded Strengthening Health Outcomes through the Private Sector (SHOPS) project, where she set an ambitious research agenda to answer key programmatic questions related to private sector health in developing countries. The agenda includes more than 25 research studies, process evaluations, and impact evaluations, with a focus on family planning, HIV and AIDS, and maternal and child health. Her countries of experience include Angola, Ethiopia, Ghana, Haiti, India, Indonesia, Jamaica, Jordan, Kenya, Nigeria, Rwanda, Tanzania, Thailand, Uganda, the United States, and Zambia.

Prior to joining Abt, Chatterji contributed to 24 international evaluation bids at Mathematica Policy Research, Inc., for a variety of clients, including the Millennium Challenge Corporation, USAID, the Department for International Development, UNICEF, the World Bank, the Gates Foundation, and Merck. These bids spanned a wide range of content areas, including health, education, early childhood development, infrastructure, agriculture, and democracy and governance, and resulted in several new international evaluation contracts for Mathematica.

Chatterji holds a Ph.D. in Demography from the University of California, Berkeley, an M.A. in Southeast Asian Studies from Cornell University, and a B.A. in History and Asian Studies from Colgate University.

Publications

Abt uses evaluation research methods in many aspects of our work. Below are some methodological papers that the Evaluation and Monitoring Methods Center shares as strong examples of our colleagues’ methods work.

Bell, Stephen H., & Laura R. Peck. (2012). “Obstacles to and Limitations of Social Experiments: 15 False Alarms.” Bethesda, MD: Abt Thought Leadership Paper on Improving Policy Evidence Through Research Methods Innovation, #2012-1.

Epstein, Diana, & Jacob A. Klerman. (2012). “When Is a Program Ready for Rigorous Impact Evaluation?” Evaluation Review, 36(5), 373-399.

Judkins, D., Morganstein, D., Zador, P., Piesse, A., Barrett, B., & Mukhopadhyay, P. (2007). “Variable Selection and Raking in Propensity Scoring.” Statistics in Medicine, 26, 1022-1033.

Klerman, Jacob A. (2010). “Contracting for Independent Evaluation: Approaches to an Inherent Tension.” Evaluation Review, 34(4), 299-333.

Olsen, R.B., Unlu, F., Jaciw, A.P., & Price, C. (2011). Estimating the Impacts of Educational Interventions Using State Tests or Study-Administered Tests (NCEE 2012-4016). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Olsen, Robert B., Larry L. Orr, Stephen H. Bell, & Elizabeth A. Stuart. (2013). “External Validity in Policy Evaluations That Choose Sites Purposively.” Journal of Policy Analysis and Management, 32(1).

Peck, Laura R. (2005). “Using Cluster Analysis in Program Evaluation.” Evaluation Review, 29(2), 178-196. DOI: 10.1177/0193841X04266335.

Peck, Laura R. (forthcoming). “On Analysis of Symmetrically-Predicted Endogenous Subgroups: Part One of a Method Note in Three Parts.” American Journal of Evaluation, 34(2).

Puma, M.J., Olsen, R.B., Bell, S.H., & Price, C. (2009). What to Do When Data Are Missing in Group Randomized Controlled Trials (NCEE 2009-0049). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.