DATES: SEP 25 - OCT 6, 2017    
       
TUITION: $3950    
       
Overview

This seminar familiarizes participants with project monitoring and evaluation (M&E) systems and tools that focus on results in international development. The seminar offers participants both a conceptual framework and practical skill development.


Course Outline 

Results-Based Management (RBM) in International Development

  • Understanding and distinguishing between monitoring and evaluation in the context of RBM
  • Problem identification
  • Development of causal hypotheses (inputs, outputs, outcomes and impacts)
  • Feeding monitoring and evaluation findings into decision-making
  • Role of partners and stakeholders
  • Significance of "soft" assistance

Planning for and Executing the Monitoring and Evaluation Processes

  • Key principles for overall work planning
  • Purpose and timing (including ex-post) of monitoring and evaluation
  • Involving key partners and stakeholders
  • Building teams with defined roles and strong capabilities
  • Establishing a hierarchy of project objectives
  • Defining the scope of monitoring and evaluations
  • Selecting analytical tools, methodologies or approaches enabling measurement and attribution
  • Importance of data quality and collection, and baseline data
  • Developing indicators to measure progress and identify gaps
  • Development and selection of evaluation questions and teams
  • Budgeting for monitoring and evaluation
  • Managing monitoring and evaluation processes
  • Anticipating and resolving problems

Tools, Methods and Approaches Facilitating Monitoring and Evaluation

  • Performance indicators and common rating systems 
  • Logical framework approach (LogFrame) and results framework approach
  • Qualitative and quantitative data collection methods
  • Formal surveys
  • Rapid appraisal methods
  • Participatory methods
  • Field visits
  • Public expenditure tracking surveys
  • Economic analysis, including cost-benefit and cost-effectiveness analysis
  • Performance and process evaluation design 
  • Impact evaluation design and purpose
  • Evaluation and tracking plans
  • Annual reviews and reports
  • Comparative overview of other tools, methods and approaches used by leading global institutions

Knowledge and Learning

  • Learning from evaluative evidence and applying recommendations from feedback
  • Improving evaluation feedback
  • Knowledge management
  • Institutionalization of learning

Course Advisor

Ms. Danielle de García is the Director of Performance Evaluation, Innovation, and Learning at Social Impact (SI). She has 12 years' experience with monitoring and evaluation (M&E), organizational capacity building, and participatory methodologies in more than 25 countries. As a facilitator, Ms. de García has developed curricula and trained hundreds of U.S. Agency for International Development (USAID), U.S. Department of State (DOS), Millennium Challenge Corporation (MCC), and non-governmental organization (NGO) personnel in results-based management and M&E. Her recent work includes designing, developing, and delivering M&E trainings for the U.S. Institute of Peace, USAID, the International Law Institute, and MCC; providing Managing for Results training and Country Development Cooperation Strategy assistance to USG staff globally; providing strategic planning and project alignment support for the World Bank; and serving as a team member or team leader on numerous assessments and evaluations for Carter Center, IREX, USAID, MCC, MasterCard Foundation, and MacArthur Foundation initiatives around the world. Beyond her evaluation work, she also advises national and international organizations on the development of M&E systems and provides related technical assistance. Ms. de García holds an MPA in International Management and a certification in Development Project Management, and she is a Certified Performance Technologist for human and institutional capacity development.