Request for Proposals – Evaluation Consultant
Thematic Evaluation of the Impact of Leadership Training in Transition Contexts
I. Abstract
The Center for International Private Enterprise (CIPE) is seeking an independent evaluation consultant to conduct a formative and impact evaluation comparing the effectiveness of entrepreneur and leadership training across four different program models.
II. CIPE Overview
The Center for International Private Enterprise (CIPE) is a non-profit affiliate of the U.S. Chamber of Commerce and a core institute of the National Endowment for Democracy. CIPE’s mission is to strengthen democracy around the globe through private enterprise and market-oriented reform, fulfilling a vision of a world where democracy delivers the freedom and opportunity for all to prosper. Founded in 1983 and headquartered in Washington, DC, CIPE partners with local business associations, chambers of commerce, universities, think tanks, and advocacy groups to advance democratic and economic reforms worldwide. CIPE has offices and representatives in more than 20 countries and maintains a network of more than 130 partner organizations spanning over 70 countries.
CIPE continually updates its strategies and approaches to deliver the most effective programming possible. In 2019, CIPE adopted an evidence-based and data-driven approach to how evaluation supports programming. This approach is intended to go beyond project-based donor reporting and accountability toward a better understanding of the outcomes and impact of distinct approaches, demonstrating innovation and “what works.” To support this initiative, the CIPE Evaluation Department is developing more standardized approaches to its use of program data across projects to guide improvement, with particular focus on how CIPE provides entrepreneur and leadership training in transitioning economies.
III. Purpose
The key purpose of this evaluation is to provide important, actionable lessons learned for identifying and validating evidence-based program approaches and for assessing their replicability and scalability, based upon an identification and analysis of the results (or likely results), whether intended or unintended, positive or negative. A secondary purpose is to improve understanding and learning about CIPE’s value-added contribution in working locally and with partners. Evaluation users include the CIPE Board, management, and program and evaluation staff.
IV. Context
Leadership is a fundamental part of how CIPE works to expand democracies that deliver. This approach comprises a significant portion of how CIPE works in all regions of the globe using its partner-driven model. However, CIPE lacks good information about which approach works best in which contexts. Comparing these four models is complex because the teaching methods differ (informal workshops, standard curricula, and problem-solving and case-study methods) and the target beneficiaries differ (from refugees to youth, college students, aspiring and current entrepreneurs, and mid-to-senior cross-sectoral professionals). Some programs utilized a Training-of-Trainers (TOT) model for multiplier and sustainability purposes, while others were taught directly by existing experts drawn from universities and the private sector. Finally, the level of evaluation data available varies considerably, with existing program data available for only two projects (the Papua New Guinea traditional model and the Turkey project using the William Davidson model with Syrian refugees). CIPE is interested in better understanding how to support evidence-based programming with more rigorous monitoring and evaluation approaches.
V. Description of Project/Activity to be Evaluated
This evaluation will assess and compare four program approaches to democratic leadership that have been used in a variety of CIPE projects in all regions of the world. The four program approaches include:
- Traditional entrepreneurship training using local or peer speakers in informal workshops based upon speaker-developed materials.
- In-country partner model using formal entrepreneurship and leadership curricula developed by our local partner with CIPE guidance and input.
- Global best practices model using a formal entrepreneurship curriculum developed in partnership with CIPE by the William Davidson Institute at the University of Michigan.
- Global best practices model using a leadership curriculum led by Stanford University’s Center on Democracy, Development and the Rule of Law (CDDRL) Leadership Academy for Development.
VI. Objectives and Scope of the Evaluation
The evaluation will be a comparative non-experimental evaluation focused on impact or potential impact. Evaluation questions will be determined in consultation with the consultant to meet the following four objectives:
- To determine the level of evidence supporting results achieved by the four approaches;
- To identify the evaluability and replicability of each of the four models;
- To draw conclusions about the level of sustainable impact in leadership development in the transition context; and
- To identify lessons learned about CIPE’s value-added in contextualizing and developing leadership programs.
CIPE anticipates that the evaluation questions will include an evaluability and likely-impact assessment of the four models using the OECD-DAC standards and the American Evaluation Association Guiding Principles, with emphasis on the five dimensions of impact. As defined by the consultant in consultation with CIPE, the evaluation will compare the four models based upon the level of evidence and the strength of the results obtained, in order to ascertain which model(s) are unproven, promising, or proven in terms of evidence-based approaches. The consultant will propose methods, consistent with the level of effort and budget, to generate the highest quality and most credible evidence appropriate for answering the evaluation questions. Data collection will include document analysis, questionnaires/surveys of Leadership Academy projects, and key informant interviews. Data analysis will include triangulation and validation of information as defined by the Impact Management Project.
VII. Evaluator Roles and Accountability
The evaluator (or firm) will report to the Director of Evaluation. Evaluation at CIPE uses an internal independent model in which the Evaluation Department operates independently of the CIPE Program Division and reports to the Managing Director for Planning and Human Resources. On evaluation issues, the CIPE Evaluation Team reports to the CIPE Board Committee on Evaluation. The consulting individual (or firm) should adopt a consultative and participatory approach while maintaining an independent perspective consistent with OECD-DAC and American Evaluation Association standards.
VIII. Deliverables and Timeline
Deliverables include an inception report (including the research methodology), draft and final reports (following two feedback rounds), and a learning presentation. CIPE anticipates selecting the evaluation consultant in February 2020 and requires the final report to be completed in July 2020.
IX. Team Composition and Evaluator Competencies
The individual or consultancy firm must be a legally licensed international or national organization with a commendable track record and at least 10 years of consultancy experience. Required competencies include: evaluation skills and expertise; an understanding of the theory and practice of leadership programs and of the role of entrepreneurship programs in developing problem-solving skills; experience evaluating training using the Kirkpatrick model; and experience in the development context. The evaluator must maintain an independent perspective consistent with OECD-DAC and American Evaluation Association standards.
X. Scheduling and Logistics
To be determined in consultation with the consultant.
XI. Level of Effort
The level of effort is expected to be approximately 32 to 45 working days between February 2020 and July 2020. The evaluation will include a review of project documents and analysis of a survey of Leadership Academy participants. No travel is required.
XII. How to Apply
Interested parties should request the full Leadership Statement of Work from Dr. Denise Baer, Director of Evaluation via email to dbaer@cipe.org.
Please note that for full consideration, applications must be received by 5 p.m. EST, January 15, 2020. Applications will be reviewed on a rolling basis, and earlier applications will be prioritized; later applications may be considered based upon quality. Consultants should send a resume (or resumes), a cover letter, and a proposed scope of work of 5-7 pages for the evaluation.