This post is about a mentoring approach to capacity development in the field of evaluation, specifically using Utilization-Focused Evaluation (UFE). What we detail here is noteworthy in that the funder (Canada's IDRC) has allowed its partners the leeway to design their own evaluations without prescribing the purposes or methods. Our experience contrasts with the issues raised in Goran Buldioski's blog post of Nov. 11th (in this series), where he ponders the benefits of Think Tanks performing in the absence of donor funding. In our experience, a preferable avenue opens up when donor funding remains in place with fewer strings attached.
Our experience began in 2009/10, when we mentored five research projects in Asia and helped them test-drive UFE. UFE is an approach that emphasizes evaluations that get used. As simple as this sounds, we discovered that the approach has enormous potential and is, strangely, not widely applied.
When we started, we soon discovered that UFE was not new: it has been applied and described in detail by its author, Michael Quinn Patton (the latest version being Essentials of Utilization-Focused Evaluation, Sage, 2013). Nevertheless, it was new to us. We completed the first phase of our project, 'Developing Evaluation Capacity in ICTD (DECI)', by producing five case studies and a UFE Primer for Evaluators that is available for free in English, French and Spanish. What we describe in this post is the mentoring approach that we developed and continue to apply in a second phase of the same project (DECI-2), which is funded by IDRC.
For starters, our initiative was, and continues to be, a research project. It combines three pillars: methodological innovation, capacity development, and in-service learning for research partners. This combination drives our research and our practice in evaluation capacity development. We provide capacity development to several research networks (supported by IDRC's Information & Networks Program) through regional mentors. These mentors have a strong background in evaluation, but they were new to UFE. They, in turn, provided a pre-determined number of person-days of mentoring to the research projects. Since we were all new to UFE, we provided the training less as experts and more as peers interested in learning by doing. The projects and our team agreed on a common calendar for delivering the mentoring. By scheduling the mentoring around the projects' needs, we were able to offer our advice during timely 'windows of learning', when our partners were ready for the assistance.
This approach contrasts with conventional workshops, where standardized content is delivered at an event scheduled independently of the participants' particular needs or timelines. We had designed plenty of those sessions in the past, yet if we are honest about their outcomes, we have little to show. The main difference between standard workshops and mentoring lies in the concept of 'readiness', which is part and parcel of the UFE approach. Readiness is about being able to wait for the time and space when a partner project team can actually use your advice. Readiness also means having partners with staff who are hungry to learn the new skills, and project managers who buy in and allocate the time and resources required at the outset of the work. We feel that this readiness was central to the successful mentoring of the first five projects we supported. The individual project evaluation reports were not only completed; we have ample evidence that they were utilized. Based on our experience, we have summarized a set of readiness requirements as a prerequisite for our support.
We are currently implementing the follow-up DECI-2 project. Our partners are IDRC-supported research projects based in Africa, Asia and Latin America. Our mentoring now integrates UFE with research communication; this combination is new to us (readers with previous experience combining them are invited to share their insights). However, it builds on our recent UFE experiences. Our mentor teams comprise evaluation and communication professionals based in South America, Asia and East Africa. During our first year of operation, we spent significant time and resources ascertaining and promoting readiness among our partners. At times, this step meant simply sitting back and awaiting their communiqués. Having the flexibility to wait for the appropriate moment, rather than delivering a workshop at a pre-set time, has been welcomed by the projects.
In our experience, it is rare to have a capacity development opportunity like this one, especially one where the first phase focuses on reaching a Memorandum of Understanding with partner projects and determining their readiness with them. As a result of this experience, we are shifting away from the workshop 'formula'. Our challenge now is to find mechanisms to share our learning with a User Circle willing to accompany this learning experiment. Earlier posts on this site have already given us some ideas.