MandE NEWS re methods, tools, systems, approaches, innovations,.. >.. measure, monitor, track, audit, evaluate, assess.. >.. results, impact, achievements, outcomes, changes, indicators, performance ..>.. projects, programs, strategies, plans, policies, models, theories..>.. people, participation, poverty, social development ..>..adjustment, improvement, learning, reporting, accountability.. >.. aid organisations, development organisations, NGOs, bilaterals, multilaterals
Edited by Rick Davies, Cambridge, UK. | Email the Editor | First content: 1997 | Last Edited: 28th January 2008 | Going carbon negative
PLEASE NOTE: The MandE NEWS website has recently undergone a major reconstruction. The new (May 2008) site is now located  here.
Funded from 1997 to 2005 by: Oxfam (GB) ,  Save the Children Fund (UK) , ActionAid (UK) , Christian Aid , CAFOD (UK), CIIR (UK), IDRC (Canada), Water Aid (UK), World Vision (UK), WWF (UK) and Exchange via BOND
Search the Archives (630+ items) and the www.mande.co.uk/docs/ directory (190+ items) via Google Custom Search

1. COMING EVENTS  (please also check the Open Forum)

September 2008
  • The Third High Level Forum on Aid Effectiveness (HLF 3) will be hosted in Accra by the Government of Ghana on 2-4 September 2008. The HLF 3 builds on several previous high-level international meetings, most notably the 2003 Rome HLF, which highlighted the issue of harmonisation and alignment, and the 2005 Paris HLF, which culminated in the endorsement of the Paris Declaration on Aid Effectiveness by over 100 signatories from partner governments, bilateral and multilateral donor agencies, regional development banks, and international agencies. The primary intention of the HLF 3 is to take stock of and review the progress made in implementing the Paris Declaration, and to broaden and deepen the dialogue on aid effectiveness by giving ample space and voice to partner countries and newer actors (such as civil society organisations and emerging donors). It is also a forward-looking event which will identify the action needed, and the bottlenecks to overcome, in order to make progress in improving aid effectiveness for 2010 and beyond. The HLF 3 will be organised as a three-tier structure: the Marketplace, which will provide an opportunity for a wide range of actors to showcase good and innovative practices and lessons from promoting aid effectiveness; Roundtable meetings, which will provide an opportunity for in-depth discussion of selected key issues to facilitate and support decision taking and policy endorsement on aid effectiveness; and a Ministerial-Level Meeting, which is expected to conclude the HLF 3 with the endorsement of a ministerial statement based on high-level discussions and negotiation around key issues.

May 2008
  • The Canadian Evaluation Society is excited to be holding its Annual Conference in Québec City, May 11-14, 2008. Conference objectives:
    1. Share your recent evaluation experiences, with a focus on methodology and practices. 2. Take an objective look at your practices in order to promote evaluation development. Some of the issues inspired by the Sharing Heritages theme include the following: A. Taking stock of your evaluation practices; B. Reviewing trends in evaluative thinking, the factors that influence it, and modes of action; C. Qualifying, comparing, and imagining evaluation support structures; D. Defining a world evaluation heritage agenda. (15/12/07)


March 2008
  • Advanced M&E Workshop, Pretoria, South Africa, organised by IMA International. This 5-day course provides a step-by-step process for designing sustainable M&E systems at programme level, addressing common challenges such as the needs of multiple stakeholders; operating within specific organisational and social cultures; and dovetailing the requirements of project, area, regional and national levels. The course aims to support experienced M&E practitioners looking to develop sustainable M&E systems. Date: 10-14 March 2008. For further information, please visit http://www.imainternational.com or contact post@imainternational.com  (05/12/07)

  • Workshop on Rethinking Impact: Capturing the Complexity of Poverty and Change, March 26-28, 2008, Headquarters of the International Center for Tropical Agriculture (CIAT), Cali, Colombia. Objective: The objective of the workshop is to draw from the experiences of professionals from multiple disciplines of the natural and social sciences regarding evaluation of research aimed at poverty reduction, social inclusion and sustainable development. We are particularly interested in new methods and metrics that have been developed and tested in ongoing evaluations and impact assessment efforts supporting learning. Workshop themes:
    1. Practical case studies with lessons learned in relation to, or that provide empirical evidence of, reductions in poverty, and analysis of how that impact was achieved, with a focus on one or more of the following: system dynamics; roles of different players; innovation and markets; research-to-development processes; learning processes as they affect performance.
    2. Impact assessment and evaluation approaches that address issues such as: assessing contributions in complex partnerships; interdisciplinary research; combining quantitative and qualitative data; linking the contribution of processes to outcomes and impact; innovation systems analysis; and new metrics for understanding and measuring outcomes and impacts.
    3. Institutionalization of new approaches for research management and impact assessment: communication challenges; training and capacity development for poverty-oriented research and impact assessment; policy and operational environments (including institutional culture). For more detailed information about the partners, drivers and assumptions behind this workshop, please visit the workshop website: www.prgaprogram.org/riw  (06/11/07)

  • Workshop on Rethinking Impact: Capturing the Complexity of Poverty and Change, Cali, Colombia, March 26-28, 2008. Have you been working with interesting and innovative teams that have been effective in achieving a better understanding of how agricultural innovations, coupled with linkages between research and policy communities and scientific understanding of human-environment systems, can have more impact on poverty reduction and social inclusion, while protecting the environment? If so, please consider joining us (see organizing committee below) at a 3-day workshop, Rethinking Impact: Capturing the Complexity of Poverty and Change, March 26-28, 2008, at the headquarters of the International Center for Tropical Agriculture (CIAT) in Cali, Colombia. Objective: The objective of the workshop is to draw from the experiences of professionals from multiple disciplines of the natural and social sciences regarding evaluation of research aimed at poverty reduction, social inclusion and sustainable development. We are particularly interested in new methods and metrics that have been developed and tested in ongoing evaluations and impact assessment efforts supporting learning. (posted 16/10/07)

  • EASY-ECO Vienna Conference 2008, Governance by Evaluation: Institutional Capacities and Learning for Sustainable Development, 11-14 March 2008, Vienna, Austria. The call for papers for the EASY-ECO Vienna Conference is available here: http://www.sustainability.eu/easy/pdf/vienna/CfP_Vienna_070809.pdf With this call for papers we encourage researchers from all disciplines (including young researchers), professionals from related fields of work, commissioning agents, evaluation users and other stakeholders to submit abstracts for presentations at the Vienna Conference. Submissions can address one or more of the following key topics, in general terms or in the context of a case study: * Institutional aspects of sustainable development evaluations * Policy learning and sustainable development evaluations * Methodological challenges and innovations with regard to sustainable development evaluations. The EASY-ECO Vienna Conference will be hosted by the Research Institute for Managing Sustainability (RIMAS) at the Vienna University of Economics and Business Administration. EU grants are available for young researchers (with less than ten years' research experience) to cover all costs (travel expenses, participation fees, accommodation and living allowance). The call for papers, together with the application for EU grants, is open until 10 October 2007. Details can be found at the project website www.easy-eco.eu. For any enquiries please contact judith.galla@wu-wien.ac.at (01/10/07)

February 2008
  • Monitoring and Evaluation in Development, Pretoria, South Africa, organised by IMA International. This 2-week intensive practical course provides a structured approach to developing, maintaining and implementing an M&E system, including SMART indicators, data collection, analysis and communicating results, with an emphasis on participatory methods to promote stakeholder commitment for sustainability. The course is suitable for any development professional with M&E responsibilities. Date: 25 February - 7 March 2008. For further information, please visit http://www.imainternational.com or contact post@imainternational.com  (05/12/07)

  • TrainEval - Training for Evaluation in Development (February-June 2008). TrainEval is an advanced training programme for evaluation in development, specifically focused on the European Development Cooperation and the EC evaluation approach. The course has been developed by experienced trainers and evaluators to respond to the increasing demand for evaluation expertise.
    TrainEval is composed of 4 modules of 4 days each. In addition to the modules, participants will have the opportunity to interact with guest speakers from relevant institutions in evening discussion meetings. The deadline for registration for the complete 2008 course is 31 December 2007. Individual modules can be booked until 6 weeks before the respective module. To register, use the registration form available for download here. We are looking forward to seeing you in Brussels! Please note: all tuition is provided in English. (Posted 26/11/07)

  • Advanced Participatory Monitoring & Evaluation, 28th January - 1st February 2008. There are limits to the more traditional monitoring and evaluation (M&E) methods based on cause-and-effect interpretations of social development. During this course we will explore how to develop a cost-effective monitoring and evaluation system: one that aims to generate sufficient, but not excessive, data and enough information to provide a development agency with a reliable understanding of the outputs, effects and impacts of capacity building processes. To book and for further information: training@intrac.org, www.intrac.org (14/11/07)

  • International Course on Participatory Planning, Monitoring & Evaluation, from 04-22  February 2008, Wageningen, The Netherlands. This course is organised by Wageningen International, part of Wageningen University and Research Centre. The course focuses on how to design and institutionalise participatory planning and M&E systems in projects, programmes and organisations for continuous learning and enhanced performance. Particular attention is paid to managing for impact and to the relationship between management information needs and responsibilities and the planning and M&E functions. For more info please visit our website:  http://www.cdic.wur.nl/UK/newsagenda/agenda/Participatory_planning_monitoring__evaluation__managing_and_learning_for_impact.htm
    or contact us: training.wi@wur.nl or cecile.kusters@wur.nl

December 2007
  • Parivartan announces a three-day training programme on ‘Gender Mainstreaming through Gender Responsive Budgeting’, 20th to 22nd December 2007, in Delhi. The programme will introduce the concept of gender responsive budgeting and the strategic framework adopted for putting it into practice. It aims to illustrate the background, approaches and methodologies of gender responsive budgeting as currently followed, and their effectiveness in the context of empowerment and the mainstreaming of gender concerns in development. Training outcomes: The training is specifically aimed at developing an appreciation of the need and rationale for integrating gender concerns into project plans and budgets. The expected outcomes are:
    Enhanced understanding of the formulation and implementation of gender-sensitive projects and budgets. Skills for developing and using gender criteria and indicators for evaluating and monitoring budgets at different levels. Programme fee: The fee for the programme is INR 6,000 per participant. Payments can be made by demand draft in favour of “Parivartan Samaj Sewa Samiti”, New Delhi. Further details: please visit www.parivartan.org.in, write to us at contact@parivartan.org.in or contact Amrat Singh on 91-11-40560734, 65492502.

  • Channel Research is pleased to announce a short training course on ‘EVALUATION OF HUMANITARIAN ACTION’, in collaboration with ALNAP, facilitated by Margie Buchanan-Smith and John Cosgrave. It will take place near Brussels from Wednesday 12 December, to Friday 14 December 2007 inclusive. The full price for the session is €1,400. The maximum number of participants is 20.
    The learning objectives of the course are: (a) greater clarity about the purpose and objectives of EHA, and about the principal challenges of doing EHA; (b) better understanding of evaluation criteria and of the most relevant frameworks against which humanitarian assistance should be evaluated; (c) a practical approach to planning, designing, implementing and following through on evaluations of humanitarian assistance. If you are interested in joining this course, please fill in the application form and send it back with your CV or a short biographical note, preferably before 1 November 2007, to: Ms Annina Mattsson  E-mail: mattsson@channelresearch.com
    Tel: +32 2 633 65 29.  Further information on Channel Research is available on the website: www.channelresearch.com, on the training page. (posted 18/10/07)

November 2007
  • UKES Annual Conference 2007  Great Expectations? Meeting the changing needs of stakeholders in evaluation  22-23 November 2007 The Queens Hotel, Leeds The 2007 UKES annual conference takes as its theme the challenges we face in effectively engaging with and meeting the evolving needs of a multiplicity of stakeholders.  Stakeholders can include commissioners, evaluators, co-evaluators, participants, beneficiaries or those who may be put at risk by evaluation processes and findings; in effect all those who have a  stake in, or interest in, the particular programme being evaluated.  These stakeholders differ in terms of the influence they can exert over evaluation design, capacity to engage with the evaluation, or ability to access and utilise findings.  Stakeholders may have contrasting views on what evaluation should achieve and how it should be carried out.  Priorities and perspectives are not fixed over time.  Evaluation is situated in a dynamic environment characterised by policy change, shifting institutional structures and new thinking from both within and outside the evaluation community. The conference streams explore changing stakeholder needs and expectations from a number of angles – from the core issue of effective engagement in the evaluation process, to considering the needs of specific stakeholder groups, the emerging priorities of stakeholders such as the sustainable development agenda and the challenge of involving multiple stakeholders.  (19/10/07)

  • A one-week training on Monitoring and Evaluation of Health Programs, Addis Continental Institute of Public Health, Ethiopia,
    October 21-25, 2007 and November 18-23, 2007. The training is designed to provide basic knowledge and skills in the monitoring and evaluation of health programs. It aims to build competence in monitoring and evaluation in order to improve the capacity of programs and services aimed at improving health status in Ethiopia. The workshop will be facilitated by experienced trainers in monitoring and evaluation who have been involved in facilitating a similar training offered for the Anglophone Africa region. Focus of the training: skills in monitoring and evaluation of health programs are crucial to the successful design and implementation of prevention and control programs. Monitoring and evaluation, and the sharing of its results, are increasingly emphasized by national and international policy makers and program managers. There is, however, a shortage of staff with the skills to conduct effective monitoring and evaluation. This training workshop focuses on building the capacity of professionals in Ethiopia by providing training in monitoring and evaluation tools and techniques. Interested persons should send completed applications and other supporting documents to: Prof. Yemane Berhane
    Addis Continental Institute of Public Health  Email: aciph@ethionet.et, yemaneberhane@ethionet.et  Addis Ababa, Ethiopia
    Deadline for application: October 16, 2007. Venue: Addis Continental Institute of Public Health, Addis Ababa. Training fee: 2,500 Birr (including course materials). Applicants should send a completed application form with their CV. The application form is
    available from the Addis Continental Institute of Public Health. Office address: Tibebu Building 2nd Floor, Kirkos Sub city, Kebele 03 #274, Gabon Street (halfway between Olympia on Bole Road and Meskel Flower Hotel towards Debreziet Road). For further information call Meskerem at +251-(0)913-038534/05, or email aciph@ethionet.et to obtain electronic application forms. (01/10/07)

  • Evaluation 2007:  Evaluation and Learning  The American Evaluation Association invites evaluators from around the world to attend its annual conference to be held Wednesday, November 7, through Saturday, November 10,
    2007 in Baltimore, Maryland. AEA's annual meeting is expected to bring together over 2500 evaluation practitioners, academics, and students, and represents a unique opportunity to gather with professional colleagues in a
supportive, invigorating atmosphere. The conference is organised into 38 Topical Strands that examine the field from the vantage point of a particular methodology, context, or issue of interest to the field, as well as the Presidential Strand highlighting this year's Presidential Theme of "Evaluation and Learning." Presentations may explore the conference theme or any aspect of the full breadth and depth of evaluation theory and practice.  For more information, please visit: http://www.eval.org/eval2007/

  • PCM Group training course on Monitoring & Evaluation (3 days @ €744, excl. 21% VAT), 12-14 November 2007 (in English). This course deals with techniques and approaches applicable both to monitoring and to mid-term and ex post evaluations. Evaluation analyses the whole cycle, comparing the initial plan with what is actually taking place or has taken place, and assessing at the same time the quality of the initial plan and, in the case of a mid-term evaluation, how the current plan could be adjusted and improved according to changes that may have occurred since the initial plan or logframe. The core content of the Monitoring & Evaluation course includes (1) a brief introduction to PCM, (2) design of the monitoring plan (in the form of a logframe), (3) reconstruction of the understanding of the design of the intervention through the 'famous' ex-ante assessment technique, (4) practice in formulating different indicators (Results, Purpose and Assumptions), and (5) design of the information flow in the monitoring system. The course includes some theory but aims to be very practical and to encourage communication and the sharing of experience between participants from different organisations; group work and presentations are thus a major pedagogical element of the course. (16/01/07)

  • 5-28 November 2007, Monitoring and Evaluation for the Public Sector and Non-Governmental Organisations, COTONOU, BENIN. Africa Sourcing is organising a one-month training course in monitoring and evaluation for public sector managers, consultancy firms and non-governmental organisations. Over these four weeks of knowledge transfer, participants will be introduced to the fundamentals of monitoring and evaluation, performance measurement techniques and the various evaluation methodologies, as well as several practical cases of evaluations carried out in Africa. Certificates attesting to the skills and knowledge acquired will be awarded to all participants. For information and registration please visit our site www.africasourcing.com or write to us at mdognon@africasourcing.com Tel: +610-797-0994  Fax: +610-797-0717

  • Monitoring and Evaluation in Development, Bangkok, Thailand, organised by IMA International. A 2-week intensive practical course to enhance the monitoring and evaluation skills and methods required in a demanding and rapidly changing world. The first week concentrates on monitoring tools and techniques; the second week focuses on the complete evaluation process. A comparative field visit consolidates the learning, and participants develop a personal and professional action plan to be implemented on return to their organisations. Dates: 19-30 November 2007. For further information, please visit http://www.imainternational.com or contact post@imainternational.com (16/12/06)

October 2007
  • Logic of Evaluation: a distance-learning course with Dr. Michael Scriven. This fall, Claremont Graduate University will be offering a one-time-only event to allow participants from across the globe a chance to learn the fundamentals of evaluation science from one of the founders and deep thinkers in the field. The Logic of Evaluation combines an overview of the discipline, the reasons for thinking that it is a discipline, and an outline of the basic concepts that work across all subdivisions of the field, i.e., personnel and product evaluation (and seven more) as well as program evaluation. The program is designed to be completed from a distance with the aid of electronic video and voice contact, an electronic whiteboard, and asynchronous access to the recordings of each class.
    Dr. Michael Scriven will be leading this seven-week, once-a-week seminar. The course is recommended for evaluators with an interest in the questions that underlie all of our practice. Cost: $200 (plus $5 registration fee). First virtual meeting: Saturday, October 27, at 10 am. Subsequent meeting dates: TBD (between Oct. 29 and Dec. 21, 2007). Technical requirements: high-speed internet access (DSL or cable); webcam and microphone. (Posted 1/10/07)

  • PCM Group training course on Monitoring & Evaluation (3 days @ €744, excl. 21% VAT), 8-10 October 2007 (delivered in French). This course deals with techniques and approaches applicable both to monitoring and to mid-term and ex post evaluations. Evaluation analyses the whole cycle, comparing the initial plan with what is actually taking place or has taken place, and assessing at the same time the quality of the initial plan and, in the case of a mid-term evaluation, how the current plan could be adjusted and improved according to changes that may have occurred since the initial plan or logframe. The core content of the Monitoring & Evaluation course includes (1) a brief introduction to PCM, (2) design of the monitoring plan (in the form of a logframe), (3) reconstruction of the understanding of the design of the intervention through the 'famous' ex-ante assessment technique, (4) practice in formulating different indicators (Results, Purpose and Assumptions), and (5) design of the information flow in the monitoring system. The course includes some theory but aims to be very practical and to encourage communication and the sharing of experience between participants from different organisations; group work and presentations are thus a major pedagogical element of the course. (16/01/07)

  • Symposium on Evaluation in the Knowledge Society, organized in cooperation between the European Evaluation Society and the University of Southern Denmark. Time: October 18-19, 2007. Place: University of Southern Denmark, Odense. Format: keynotes, paper sessions, panels, discussions, and informal interaction. Maximum number of participants: 175. More information will appear on the EES website in the near future; the official call for papers will be issued in December 2006. Theme: the theme of the conference is knowledge as it is produced, facilitated, discussed, and made politically relevant as part of the practice of evaluation in contemporary society. The symposium is in itself a bridge-building exercise between various fields of knowledge: the field of evaluation might benefit from renewed inspiration from academic perspectives, while academics interested in the role of knowledge may analyze evaluation as an interesting example of a knowledge-producing socio-political practice. With kind regards, Mirjam Vlaar, Secretariat EES, PO Box 1058, 3860 BB NIJKERK, THE NETHERLANDS. Phone +31 33 2473488 Fax +31 33 2460470 E-mail: ees@mos-net.nl Website: www.europeanevaluation.org (12/12/06)

  • Monitoring and Evaluation in Development, Cape Town, South Africa, organised by IMA International. A 2-week intensive practical course to enhance the monitoring and evaluation skills and methods required in a demanding and rapidly changing world. The first week concentrates on monitoring tools and techniques; the second week focuses on the complete evaluation process. A comparative field visit consolidates the learning, and participants develop a personal and professional action plan to be implemented on return to their organisations. Dates: 22 October - 2 November 2007. For further information, please visit http://www.imainternational.com or contact post@imainternational.com (16/12/06)

September 2007
  • 3rd Symposium Forum Media and Development: Measuring Change: Planning - Monitoring - Evaluation in Media Development September 27-28, 2007 Katholisch-Soziales-Institut (KSI) Bad Honnef, near Bonn (Germany)
    Like any other field of development cooperation, media support has to prove its relevance, effectiveness and impact. The key questions for media assistance are therefore how to effectively promote a media system, and an enabling environment, that fosters democracy and contributes to overall development goals, and how this impact can be substantiated. The third symposium of the Forum Medien und Entwicklung, entitled "Measuring Change: Planning, Monitoring, Evaluation in Media Development", will not only follow up these general questions but also focus on the current practice of monitoring and evaluation in the field. The symposium aims at sharing knowledge of methods for steering existing projects and at providing lessons learned for future initiatives. It is organised by the Catholic Media Council (CAMECO), a consultancy for media and communication, in cooperation with the "Forum Medien und Entwicklung" (Forum Media and Development). The programme and registration forms can be downloaded at http://www.cameco.org/mez/fome.html. Completed registration forms should be sent to CAMECO by August 15, 2007.

  • 11th Praxis Annual Commune on Participatory Development: Community-led Monitoring and Evaluation, 11-22 September, Hyderabad, India. The module on ‘Community-led Monitoring and Evaluation’ seeks to explore and discuss effective ways of enabling communities and primary stakeholders of any development process to have a decisive influence over the objectives, processes, policies and outcomes of the transformations they aspire to. The thrust of the discussions will be on the possibilities of addressing the adverse politics of development through appropriate institutional arrangements, systems and processes that accord primacy to an empowered role for marginalised communities. The discussion will be aided by presentations of relevant case studies and experiences, besides focused sessions on suitable participatory methods, e.g. social audits, citizen jury processes and large-scale PME systems. The more specific contents of the module will be discussed and agreed with participants prior to the beginning of the training course.

  • Managing for Impact (MfI) training workshop. A training workshop will be held in Lesotho, 17-27 September 2007, for service providers and project staff interested in developing their skills and capacity to implement an emerging approach to improving the impact of development interventions called ‘managing for impact’. The training is part of a regional learning and capacity building programme called Strengthening Management for Impact (SMIP) implemented by Wageningen International, CARMPoLEA (ISNAR and Haramaya University), Impact Alliance and Khanya-aicdd. The overall purpose of the workshop is to continue to develop and strengthen the MfI network within Eastern & Southern Africa to support pro-poor projects and programmes to effectively manage toward impact. For further information on SMIP and the Managing for Impact approach, visit the SMIP Electronic Resource and Information Learning Center (ERIL) - http://www.managing4impact.com/  or email Mine Pabari at mine.pabari@gmail.com (25/06/07)

  • Methodologies for the evaluation of peace-building programmes, advanced level, 4-7 September 2007. Channel Research is pleased to announce this course, which aims to improve the methodologies and tools used to carry out assessments of conflict situations and evaluations of peace-building programmes. It is an advanced course intended for those with experience in evaluations and an interest in, and general experience of, conflict situations. Participants in previous years have come from aid agencies (headquarters and field personnel), donor governments, consultancies and academia. The course is facilitated by Emery Brusset, Director of Channel Research, and Tony Vaux, conflict expert, and will take place near Brussels (La Converserie, www.converserie.be). The price of this course is 1,000 Euros and includes transport from the centre of Brussels to the venue, training, accommodation and VAT. Further information, the course description and the application form can be downloaded from our webpage: http://www.channelresearch.com/training.html  (12/07/07)


  • Melbourne, Australia has been honoured with the privilege of hosting the 2007 AES International Conference – Doing Evaluation Better – to be held 3-7 September 2007 at the Carlton Crest Hotel. The conference is hosted by the Australasian Evaluation Society (AES). Conference details: 2007 AES International Conference – Doing Evaluation Better. Venue: Carlton Crest Hotel, Melbourne, Australia. Dates: 3-7 September 2007. Expected attendance: approximately 250 delegates. Contact details: AES 2007 Conference Managers, 91-97 Islington St, Collingwood, Melbourne VIC 3066, AUSTRALIA. P: + 61 3 9417 0888 | F: + 61 3 9417 0899 | E: aes2007@meetingplanners.com.au  www.aes2007.com.au

August 2007
  • Announcing a Regional Workshop on MONITORING AND EVALUATION OF HIV/AIDS PROGRAMS, August 6-17, 2007, Pretoria, South Africa. USAID's Regional HIV/AIDS Program for Southern Africa (RHAP/SA) and the MEASURE Evaluation Project are pleased to announce an opportunity for training in monitoring and evaluation for professionals in the Anglophone Africa region. The regional workshop "Monitoring and Evaluation of HIV/AIDS Programs" will take place August 6-17, 2007 in Pretoria, South Africa. This two-week course is offered in collaboration with the School of Health Sciences and Public Health and Continuing Education at the University of Pretoria and will provide intensive training in the fundamental concepts and tools for monitoring and evaluating HIV/AIDS programs. The course is designed for national and sub-national level M&E professionals and their counterparts, assistants and advisors who are involved with the implementation of HIV/AIDS programs. A limited number of fellowships are available to citizens of USAID-assisted countries who are working in priority countries in Southern Africa. All applications must be received by July 7, 2007. For more information, contact the MEASURE Evaluation Training & Communications Officer: Email: measure_training@unc.edu or fax: (919) 966-2391, or visit MEASURE Evaluation's website at www.cpc.unc.edu/measure and click on "training". (02/07/07)

  • Two-Day Evaluation Workshop, Measuring Engagement, Measuring Empowerment, South Australia. A two-day evaluation workshop will be delivered in Clare, South Australia by Dr Kate Roberts and Dr Jeff Coutts on 15 and 16 August 2007. The workshop is based on research undertaken for the Cooperative Venture for Capacity Building (RIRDC) and has been delivered throughout Australia. Day One looks at measuring engagement through an evaluation of five models of extension; it is relevant to those in the community development and NRM sectors, local and state government organisations, catchment management authorities and other organisations who are reviewing and delivering extension and engagement activities. Day Two focuses on measuring empowerment and is particularly relevant to those working with groups, capacity building and empowerment. Contact: Justine Lacey at research@robertsevaluation.com.au (posted 12/06/07)

  • Claremont Graduate University's 5th Annual PROFESSIONAL DEVELOPMENT WORKSHOP SERIES EVALUATION AND APPLIED RESEARCH METHODS August 17-23, 2007. California, USA. Sessions include:
    Friday, August 17, 2007: Basics of Evaluation & Applied Research Methods (Stewart I. Donaldson & Christina A. Christie); Introduction to Applied Quantitative Analysis (Dale E. Berger); Introduction to Qualitative Research Methods (Michelle C. Bligh)
    Saturday, August 18, 2007: Quasi-Experimental Designs (William D. Crano); Introduction to Multilevel Modeling (Jodie Ullman); Designing Qualitative and Advanced Research Studies: Beyond the Basics (Sharon F. Rallis)
    Sunday, August 19, 2007: Practical Program Evaluation: A Program Theory Approach (Huey T. Chen & Stewart I. Donaldson); Using Appreciative Inquiry for Evaluation and Organizational Change (Hallie Preskill); Introduction to Structural Equation Modeling (Dale E. Berger); Using Qualitative Methods to Determine Causality (Michael Scriven)
    Monday, August 20, 2007: Needs Assessment (Michael Scriven); Considering Culture in Evaluation and Applied Research (Rodney K. Hopson); Evaluation on a Shoestring: Participatory Strategies for Stretching the Budget (Jean A. King); Cost-Effectiveness and Cost-Benefit Analysis (Brian Yates)
    Tuesday, August 21, 2007: Building Evaluation Capacity (Jean A. King); Sampling and Survey Methods (Gary T. Henry); Qualitative Data Analysis (Michelle C. Bligh)
    Wednesday, August 22, 2007: Using RCTs in Educational Research (Tiffany Berry & Rebecca M. Eddy); Logic Models for Program Evaluation and Planning (Thomas Chapel); How to Write Successful Grant Proposals (Allen M. Omoto); Policy Evaluation (Gary T. Henry)
    Thursday, August 23, 2007: Using Evaluation Theory to Inform Practice (Christina A. Christie); Evaluating Public Health Programs and Initiatives (Thomas Chapel); Consulting (Gail Barrington) (Posted 23/03/07)

July 2007
  • Contributions of Monitoring and Evaluation. Registration for the ReLAC Conference is now open! The Organizing Committee is pleased to announce the II ReLAC Conference, "Aportes del Seguimiento y la Evaluación a la Gobernabilidad y Democracia" (Contributions of Monitoring and Evaluation to Governance and Democracy), Bogotá, Colombia, 17-21 July 2007.
    The objectives of the Conference are: i) to foster a regional dialogue, with the participation of government, civil society and cooperation agencies, aimed at proposing better ways of approaching and conducting monitoring and evaluation so that they contribute more efficiently and effectively to democracy and governance in Latin America; ii) to identify strategies that will allow ReLAC and its constituent national networks to be more effective and efficient in their efforts to strengthen capacities and professionalise monitoring and evaluation; iii) to broaden and strengthen monitoring and evaluation networks in the region and their links with networks in other regions and with global networks. (02/07/07)

  • Two-Day Evaluation Workshop, Measuring Engagement, Measuring Empowerment, Melbourne. A two-day evaluation workshop will be delivered in Melbourne, Australia by Dr Kate Roberts and Dr Jeff Coutts on 25 and 26 July 2007. The workshop is based on research undertaken for the Cooperative Venture for Capacity Building (RIRDC) and has been delivered throughout Australia. Day One looks at measuring engagement through an evaluation of five models of extension; it is relevant to those in the community development and NRM sectors, local and state government organisations, catchment management authorities and other organisations who are reviewing and delivering extension and engagement activities. Day Two focuses on measuring empowerment and is particularly relevant to those working with groups, capacity building and empowerment. Contact: Justine Lacey at research@robertsevaluation.com.au (posted 12/06/07)

  • Mosaic.net International will be organizing two workshops on the theme of results-based management and participatory monitoring and evaluation. Results-based Management, Appreciative Inquiry and Open Space Technology Workshop, July 16-20, 2007, Ottawa, Canada. This five-day workshop grounds you in three topic areas: results-based management, appreciative inquiry and open space technology. The following themes will be part of the workshop: results-based management and its implications for your organization; building results-based logical frameworks; creating monitoring and evaluation systems that are results-based; moving away from problem-focused approaches to more asset-based approaches; the appreciative inquiry cycle; weaving appreciative approaches into results-based management; using appreciative inquiry in the workplace; and experiencing open space technology. Participatory Monitoring and Evaluation Workshop, July 23-28, 2007, Ottawa, Canada. This six-day PM&E workshop will show you how to: rethink your own monitoring and evaluation strategies and approaches; master new and innovative participatory PM&E tools for the workplace; facilitate PM&E processes for your project, programme or organization; develop monitoring and evaluation plans in a more participatory manner; integrate gender, ethnicity, class and sexuality issues and concerns into your PM&E work; and integrate qualitative and participatory methods into monitoring and evaluation. (Posted 23/03/07)

  • Multi-Stakeholder Monitoring and Evaluation Workshop, Brighton, UK, 9-13 July 2007, IMA International. IMA's long-standing course (ME01) offers an introduction to monitoring and evaluation, providing project managers with the tools they will require to carry out their work. Our new 1-week workshop builds on this foundation by exploring how the needs of multiple stakeholders can be accommodated in the context of more complex M&E systems. Drawing on the extensive international experience of our lead facilitators and working directly with issues brought by participants, we shall explore a series of inter-linked challenges: what drives stakeholder perspectives and how these may be taken into account; working with organisational cultures to secure optimal outcomes; and creating effective M&E collaborations with partner organisations. The workshop is relevant to those in all sectors of development, whether working for Government, NGOs or the private sector.

  • PCM Group training course on Monitoring & Evaluation (3 days @ €744, excl. 21% VAT), 9-11 July 2007 (in English). This course deals with techniques and approaches applicable both to monitoring and to mid-term and ex post evaluations. Evaluation analyses the whole cycle, comparing the initial plan with what is actually taking place or has taken place, and assessing at the same time the quality of the initial plan and, in the case of a mid-term evaluation, how the current plan could be adjusted and improved according to changes that may have occurred since the initial plan or logframe. The core content of the Monitoring & Evaluation course includes (1) a brief introduction to PCM, (2) design of the monitoring plan (in the form of a logframe), (3) reconstruction of the understanding of the design of the intervention through the 'famous' ex-ante assessment technique, (4) practice in formulating different indicators (Results, Purpose and Assumptions), and (5) design of the information flow in the monitoring system. The course includes some theory but aims to be very practical and to encourage communication and the sharing of experience between participants from different organisations; group work and presentations are thus a major pedagogical element of the course. (16/01/07)

  • July 2-25, 2007, Monitoring and Evaluation for Public Sector and Not-For-Profit Organisations, ALLENTOWN, PA, USA. This 4-week practical monitoring and evaluation workshop organized by Africa Sourcing is designed to provide participants with comprehensive evaluation tools. In four weeks, participants will be familiarized with evaluation fundamentals, performance measurement vehicles and evaluation methodologies, and most of all will have the opportunity to practice their evaluation skills using real cases. For more information, please go to www.africasourcing.com or email us at mdognon@africasourcing.com. TEL: +610-797-0994  Fax: +610-797-0717

  • Management Information Systems for M&E: University of East Anglia, Norwich, UK, July 2-13, 2007. This module provides professional managers with the opportunity to develop an IT-based Management Information System (MIS) in a two-week period. In Week One, the underlying structure for each participant's MIS is designed. Week Two takes the design further and, using available software packages (database, spreadsheet and World Wide Web software), develops the MIS software. Participants will take home a workable system which can be field-tested with live data. Participants should possess basic competency in commonly available software packages. Contact odg.train@uea.ac.uk (Posted 24/10/06)
  • Monitoring and Evaluating for Development Activities: University of East Anglia, Norwich, UK, July 16 - August 10, 2007. The annual ODG Monitoring and Evaluation course contains all the elements evaluated as most successful by the more than three hundred development professionals who have participated in the course since 1981. The continuing challenge is to develop systems that serve the needs of process and bottom-up approaches to development activities, using combinations of formal survey and participatory techniques. The 2007 course has also been designed to take full account of current demands that M&E be concerned with ‘institutional relational analysis’ alongside more conventional ‘activities analysis’. The overall objective of the course is to provide participants with the skills, tools and concepts they will need, in both their present and future work, to specify and implement M&E systems that support learning and decision-making across the whole range of organisational and sectoral settings. Contact odg.train@uea.ac.uk (Posted 24/10/06)

  • Monitoring and Evaluation in Development, Brighton, UK, organised by IMA International. A 2-week intensive practical course to enhance the monitoring and evaluation skills and methods required in a demanding and rapidly changing world. The first week concentrates on monitoring tools and techniques; the second week focuses on the complete evaluation process. A comparative field visit consolidates the learning, and participants develop a personal and professional action plan to be implemented on return to their organisations. Dates: 25th June - 6th July 2007. For further information, please visit http://www.imainternational.com or contact post@imainternational.com (16/12/06)
 
June 2007
  • The International Program for Development Evaluation Training (IPDET) is organizing its seventh annual offering, to be held in Ottawa, Canada from June 11 to July 6, 2007. IPDET consists of a two-week core of intensive, applied training at the graduate level for those new to development evaluation or with little formal training. The core program is followed by two weeks of 28 free-standing workshops that offer a more in-depth focus on specific topics. The workshops are taught by a dynamic group of leading-edge international faculty drawn from Southern and Northern institutions. Participants may enroll for all 4 weeks, or for the core program and/or one or two weeks of workshops. New workshops offered this year include "Using Surveillance, Monitoring and Evaluation to Improve HIV/AIDS Programming" and "Evaluation with Hidden and Marginal Populations". They join returning workshops such as "Building Results-Based Monitoring and Evaluation Systems", "Designing Impact Evaluations Under Constraints", "Evaluation for Post-Conflict Situations", "Evaluating Environmental and Social Sustainability", and many more. A key feature of IPDET is membership in an active, moderated listserv which keeps instructors and alumni in touch, facilitating shared problem solving and continuous learning. IPDET is a partnership of the Independent Evaluation Group of the World Bank and Carleton University. Go to www.ipdet.org to register and then open the application form. (26/01/07)

May 2007
  • The International Development Evaluation Association (IDEAS) and the Latin American Network of Monitoring, Evaluation and Systematization (ReLAC) are pleased to combine their efforts in the organization of a joint Conference focused on the concept and practice of evaluation and its implications for development. The Conference, titled “Development Evaluation: Facing Challenges for Addressing Learning, Ownership, Accountability and Impacts” will be held on May 1-5, 2007 in Bogotá, Colombia. The specific objectives of the Joint Conference are:
    * From a content point of view, to provide an opportunity for evaluators and partners of each Association to present practical evidence of, or research on, ways of enhancing learning, ownership, accountability and impacts through evaluation;
    * For IDEAS, to continue the reflection initiated in 2005 in Delhi and in its regional workshops, so as to advance thinking and practice on the use of evaluation for accountability and impacts, as well as the challenges in conducting country-led evaluations;
    * For ReLAC, to build upon its 2004 Conference and further its thinking on linking evaluation, democracy and governance;
    * As a joint activity, to promote awareness of the role of evaluation associations and networks in addressing evaluation as a means towards improving public action and in strengthening evaluation capacities at the national and regional levels.
    Programme details in English
    http://stone.undp.org/undpweb/eo/evalnet/eval-net/document/ReLAC-Programme_Biennale_ENG.doc
    For more information contact Ada Ocampo Programme Officer/Evaluation Office Member of the Organizing Committee ReLAC/IDEAS Joint Conference Tel: 1 212 824 6748 E-mail:  aocampo@unicef.org

  • Monitoring and Evaluation in Development, Nairobi, Kenya, organised by IMA International. A 2-week intensive practical course to enhance the monitoring and evaluation skills and methods required in a demanding and rapidly changing world. The first week concentrates on monitoring tools and techniques; the second week focuses on the complete evaluation process. A comparative field visit consolidates the learning, and participants develop a personal and professional action plan to be implemented on return to their organisations. Dates: 14-25 May 2007. For further information, please visit http://www.imainternational.com or contact post@imainternational.com (16/12/06)

April 2007
March 2007
  • The UKES London Regional Network invites you to our next exciting event: Models of Evaluation Practice: An introductory one-day workshop 10am – 4.30pm, Friday 30th March 2007, central London "With time for informal discussion, the workshop programme will draw on participants’ experiences of evaluation as well as exploring  the variety of approaches available to evaluators.  The day will cover the application of evaluation approaches within a range of environments and relevant social, political and ethical contexts. Using a range of ‘live’ case studies and participatory activities, additional topics to be covered include signposting evaluation through choices of approach and design; stakeholder and participant perspectives; and putting evaluation to good use".  Book now – places very limited! Please see http://www.evaluation.org.uk/Events/regional_networks.htm for more details; or email our administrators on ukes@profbriefings.co.uk  (posted 22/01/07)

  • PCM Group training course on Monitoring & Evaluation (3 days @ €744, excl. 21% VAT), 12-14 March 2007 (delivered in French). This course deals with techniques and approaches applicable both to monitoring and to mid-term and ex post evaluations. Evaluation analyses the whole cycle, comparing the initial plan with what is actually taking place or has taken place, and assessing at the same time the quality of the initial plan and, in the case of a mid-term evaluation, how the current plan could be adjusted and improved according to changes that may have occurred since the initial plan or logframe. The core content of the Monitoring & Evaluation course includes (1) a brief introduction to PCM, (2) design of the monitoring plan (in the form of a logframe), (3) reconstruction of the understanding of the design of the intervention through the 'famous' ex-ante assessment technique, (4) practice in formulating different indicators (Results, Purpose and Assumptions), and (5) design of the information flow in the monitoring system. The course includes some theory but aims to be very practical and to encourage communication and the sharing of experience between participants from different organisations; group work and presentations are thus a major pedagogical element of the course. (16/01/07)

  • 5-28 March 2007, Monitoring and Evaluation for the Public Sector and Non-Governmental Organisations, LOME, TOGO. Africa Sourcing is organising a one-month training course in monitoring and evaluation for public sector managers, consultancy firms and non-governmental organisations. Over these four weeks of knowledge transfer, participants will be introduced to the fundamentals of monitoring and evaluation, performance measurement techniques and the various evaluation methodologies, as well as several practical cases of evaluations carried out in Africa. Certificates attesting to the skills and knowledge acquired will be awarded to all participants. For information and registration please visit our site www.africasourcing.com or write to us at mdognon@africasourcing.com Tel: +610-797-0994  Fax: +610-797-0717

  • Monitoring and Evaluation in Development, Pretoria, South Africa, organised by IMA International. A 2-week intensive practical course to enhance the monitoring and evaluation skills and methods required in a demanding and rapidly changing world. The first week concentrates on monitoring tools and techniques; the second week focuses on the complete evaluation process. A comparative field visit consolidates the learning, and participants develop a personal and professional action plan to be implemented on return to their organisations. Dates: 12-23 March 2007. For further information, please visit http://www.imainternational.com or contact post@imainternational.com (24/10/06)

February 2007
  • Monitoring and Evaluation of National AIDS Programmes (Course 2). Duration: 19-23 February 2007; Location: Zagreb, Croatia. The Andrija Stampar School of Public Health, in partnership with the World Health Organization Regional Office for Europe (WHO/EURO) and the Deutsche Gesellschaft für Technische Zusammenarbeit (GTZ) BACKUP Initiative, is organizing the course "Monitoring and Evaluation of National AIDS Programmes". Course description: the aim of the module is to provide participants with practical guidance both on how to develop an overall, national monitoring and evaluation (M&E) system and on systems for specific programmes. The module is structured around five key themes: introductory information, measuring impact, measuring coverage, using M&E data, and setting up a national M&E system. (posted 02/02/07)

  • Dear colleagues: AREOL 25, action research and evaluation on line, is a free on-line course in action research offered twice a year, a public service by the Southern Cross Institute of Action Research at
    Southern Cross University.  The next course is areol 25, beginning in February 2007. As with earlier programs, the theme of areol 25 is the integration of effective change with rigorous research. In some respects, it is a combination of the principles of community and organisational change with those for change-oriented qualitative research, sometimes with use of quantitative research too.  The program does not attempt to cover all varieties of action research. Nor does it analyse the philosophy of action research in any depth. The main intention is to allow participants to understand some processes which combine action and research, and which can be used in practice. You can examine the areol materials on the web. You'll find an  index page at http://www.scu.edu.au/schools/gcm/ar/areol/areolind.html  if you would like any further information. Warm regards -- Bob Dick bdick@scu.edu.au  (16/01/07)

  • Program Evaluation for International Education Professionals. Registration deadline: January 19, 2007. Term: Winter 2007. Sessional dates: February 5 - March 24, 2007 (Winter). Note: the course will be offered again in Spring 2007. The objective of this course is to provide international education practitioners with introductory concepts and tools to make sound program decisions. This interactive on-line course, facilitated by a professional program evaluator, will introduce candidates to the underlying theory that guides program evaluation and provide them with practical assignments that lead to competency in conducting, commissioning, or using evaluations. The course presents various program evaluation options to illustrate the multitude of models and tools available to conduct an evaluation. The course utilizes the case-method approach to learning. Candidates will develop competencies in program evaluation by designing the preliminary stages of an actual evaluation. They will also have the opportunity to apply the knowledge and skills learned throughout the course to a program with which they are familiar.
    Keiko Kuji-Shikatani, Ed.D.  Cathexis Consulting 124 Merton St., Suite 502  Toronto ON M4S 2Z2 Tel: (416) 469-9954 ext.232 Fax: (416) 469-8487 keiko@cathexisconsulting.ca 

  • Monitoring and Evaluation in Development, Dubai, UAE, organised by IMA International. A 2-week intensive practical course to enhance the monitoring and evaluation skills and methods required in a demanding and rapidly changing world. The first week concentrates on monitoring tools and techniques; the second week focuses on the complete evaluation process. A comparative field visit consolidates the learning, and participants develop a personal and professional action plan to be implemented on return to their organisations. Dates: 5th-16th February 2007. For further information, please visit http://www.imainternational.com or contact post@imainternational.com (24/10/06)

  • USAID’s MEASURE Evaluation Project is pleased to announce a training opportunity for the Africa region. The Centre Africain d'Etudes Supérieures en Gestion (CESAG), based in Dakar, Senegal, is offering a regional workshop on Monitoring and Evaluation of HIV/AIDS Programs. This two-week course will take place February 5-16, 2007 and will be taught in French.  The workshop will offer intensive training that will cover the fundamental concepts and tools for monitoring and evaluating HIV/AIDS programs.  Please contact CESAG (Amani.koffi@cesag.sn) or the MEASURE Evaluation project (measure_project@jsi.com) to request application materials or with any questions regarding this training opportunity.  The deadline for applications is January 5, 2007.  Thank you,  Alec T. Moore, Project Administrator
    MEASURE Evaluation, John Snow, Inc. 1616 N. Fort Myer Dr., 11th Floor, Arlington, VA  22209, Phone: 703-528-7474
    Fax: 703-528-7480.

January 2007
  • Channel Research, in collaboration with ALNAP, is planning to organise a short training course on evaluation of humanitarian action. The course will be based on an updated version of the ALNAP training modules.  This course will be facilitated by Margie Buchanan-Smith and John Telford and will take place near Brussels at the end of January 2007. Further information will be available at the end of October.  For any questions, comments or expression of interest regarding this training, please feel free to contact Cécile Collin at Channel Research, email: collin@channelresearch.com, telephone +32 2 633 65 29. (Posted 11/10/06)

  • The 4th Conference of the African Evaluation Association (AfrEA) is to be held in Niamey, Niger (West Africa) in January 2007 on the theme "Evaluating Development, Developing Evaluation: A Promising Network for Africa's Future". The choice of the location of the Conference is a recognition of the strength and activity of the Nigerien Monitoring and Evaluation Network (ReNSE), to which I would also like to draw your attention. Please do have a look at www.pnud.ne/rense, where you'll also find the link to the Conference webpage (in French and in English). Thank you very much and best regards, Daniela Gregr, Associate Economist, UNDP Niger, Maison de l'Afrique, Niamey, Niger, daniela.gregr@undp.org (posted 21/05/06)
     
If you can't find the event you are looking for here, try out Lars Balzer's Evaluation Portal: Calendar. An international calendar for evaluation events (congresses, training events, methodological training, workshops, lectures, and everything useful in the field of evaluation). You can also register evaluation events for inclusion in the calendar.

2. WORK IN PROGRESS Return to Menu
The aim of this new section of MandE NEWS is to feature short news items about particular pieces of work being planned or still under development in the various organisations funding and/or using MandE NEWS. Normally, no publications will yet be available, but there will be contact names and email addresses for further information on these developments.
July 2007
  • The Secretary of State for International Development is establishing an independent committee to challenge and advise his department (DFID) on the overall strategy, approach and priorities being adopted in its evaluation work. The role of the committee will be to guarantee the independence of the evaluation function in DFID and the use of evaluation results to enhance the delivery and impact of UK development assistance. Comprising up to nine members (including the Chair), the Independent Advisory Committee for Development Impact (IACDI) will meet around three times a year (in London) and will examine DFID’s evaluation work related to all aspects of the UK aid programme. The IACDI will also review and assess the strategy and work plans for evaluation, ensure that the recommendations from evaluation work are followed up, and comment on the overall quality and implementation of the evaluation programme. The management and delivery of the evaluation function will remain the responsibility of the Evaluation Department in DFID. (02/07/07)

December 2006
  • (from Pelican email list) In September this year, an informal network of organisations concerned about the impact of research on the reality of poverty organised a workshop. The workshop's aim was to mobilise a wider group of organisations that work in the field of research communication to explore and analyse different approaches to monitoring and evaluating research communication. The workshop was supported by DFID and involved over forty representatives from UK and international NGOs. You can access the four-page summary report via the following link: http://www.healthlink.org.uk/PDFs/mande_summary.pdf  The following three questions guided the workshop:
    1. What can we do differently to monitor communication of research?
    2. Is it enough to produce research and communicate it?
    3. What is good practice around research communication?
    The workshop was informed by a 'scoping study' on M&E of research communication, as well as a number of case studies which concretised some of the challenges of research communication. The key findings of the scoping study are:
    1. The use of the logical framework often presents difficulties in capturing 'network' aspects of communication. Alternative frameworks are sometimes applied, but they become too complicated in the context of comprehensive multi-stakeholder interventions;
    2. Stakeholders are often identified in very broad terms, and researchers particularly do not know a lot about policy makers as an audience for research communication. Establishing relationships with these actors and intermediary groups could enhance the use of information;
    3. A range of approaches is being used for M&E of research communication, including a mix of quantitative and qualitative approaches. While quantitative approaches are most often used for monitoring, semi-structured interviews are the preferred approach in evaluations.
    Some of these case studies are available on the following web page, where you can also download the scoping study under 'more information': http://www.healthlink.org.uk/we-do/network_me2.html    (posted 16/12/06)

August 2006
  • BRIDGE Cutting Edge Pack on Gender, Indicators and Measurements of Change BRIDGE – gender and development – based at the Institute of Development Studies, UK, is currently producing a Cutting Edge Pack on gender, indicators and measurements of change. This Pack will provide an overview of the gender issues around measuring change and impact, formulating/using indicators and gender mainstreaming - with case studies and practical examples, and include a supporting resources collection of key documents and tools, and a list of relevant contacts. We are currently fielding information and case studies on experiences with monitoring and evaluation, impact assessments, examples of good practices, or bad practices with lessons learned, gender reviews, and monitoring gendered results (materials from southern based organisations and non-governmental organisations would be particularly welcomed). If you would like to contribute materials or your own experiences, please email j.demetriades@ids.ac.uk or send documents in paper copy to BRIDGE, Institute of Development Studies, University of Sussex, Brighton, BN1 9RE, United Kingdom Previous Cutting Edge Packs are available at: http://www.bridge.ids.ac.uk/reports_gend_CEP.html. See also  www.bridge.ids.ac.uk - BRIDGE publications and reports , and www.siyanda.org - search gender and development resources www.ids.ac.uk – The Institute of Development Studies, UK (Posted 10/08/06)

July 2006
  • Website dedicated to six joint evaluations of the Maastricht 3Cs is now available!
    "The newly launched 3Cs website offers you access to the six final reports from the 3Cs evaluation initiative and is a key electronic resource on coordination, complementarity and coherence for development. Next to featuring the final reports of the initiative of the EU Heads of Evaluation (see the 'About' section), the website brings you useful information and resources on the 3Cs and monitoring and evaluation. Furthermore, it features relevant news and events.  Besides the website, a limited number of information briefs and newsletters on the the 3C initiative will be produced. These will be disseminated both in hardcopy and electronic form, and will cover different information pertaining to the initiative, the 3Cs and the various evaluation studies."(Posted 19/07/06)

June 2006
  • DFID Project on Human Rights and Social Exclusion Indicators and Benchmarks. The objective of this piece of work is to improve DFID and partners’ knowledge and use of indicators and benchmarks to measure human rights and social exclusion. The contexts in which these indicators and benchmarks may be used include monitoring of (1) development partnership agreements, MoUs and other framework agreements between donors and partners, (2) Country Assistance Plans (CAPs), (3) Poverty Reduction Strategies (PRSs) and other national development strategies, (4) Poverty Reduction Budget Support (PRBS) and Sector Wide Approaches (SWAps), and (5) human rights and exclusion by civil society / NGOs – particularly in fragile states. The main outputs of the project include (1) a summary of existing sources and guidance on human rights and social exclusion benchmarks and indicators, (2) recommendations of good sources and types of data for measuring human rights and social exclusion at the national level, (3) a step-by-step methodology for developing human rights and social exclusion indicators and benchmarks with country examples of good practice, (4) a list of DFID country offices interested in further support in these areas, and (5) an executive summary. Project Leader: Dr Todd Landman, Reader, Department of Government, University of Essex; Member, Human Rights Centre, University of Essex; Managing Director, Rights Awareness Ltd. Research Assistant: Ms. Edzia Carvalho, Candidate for the MA in the Theory and Practice of Human Rights, University of Essex. Contact email: todd@essex.ac.uk; todd@rights-awareness.co.uk
    Website: http://privatewww.essex.ac.uk/~todd; http://www.rights-awareness.co.uk
    Phone: 44 (0) 1206-872129; 44 (0) 1206-298972  (Posted 26/06/06)

  • The African Monitor Concept. "2005 was the ‘Year for Africa’, prioritised by the Commission for Africa Report, the G8 Gleneagles Summit, and with the UN Special Summit and WTO Doha Round having a focus on development. These built on the international undertakings of the MDGs, and the commitments of the Organisation of African Unity (subsequently African Union) to sustainable development through the NEPAD initiative and Peer Review Mechanism. This African civil society voice can thus be seen as the too often missing ‘fourth piece of the jigsaw’ alongside existing stakeholders of donor governments and institutions; their African counterparts; and donor-based NGOs and civil society. The Most Revd Njongonkulu Ndungane, Anglican Archbishop of Cape Town, realised it would be vital to maintain this momentum, to ensure promises on all sides would be implemented swiftly and effectively, in ways that make a real difference to real people. He saw that Africa’s grassroots voices, currently marginalised and fragmented, could be harnessed to pursue these ends, and that faith communities, the most extensive civil society bodies on the continent, could provide the backbone of networks to bring these voices into the public arena. End-user accounts of experiences of programme delivery would help hold both donors and recipient governments to their word, and enable them to achieve their objectives on the ground. Further, he saw this would be a means of better engaging the priorities and perspectives of the targets of these policies in their formulation and delivery, which would also enhance their effectiveness and sustainability. Extensive consultations within Africa and beyond, among faith communities and wider civil society, NGOs, governments and international agencies, think tanks, academia, and the private sector have shown overwhelming support in principle, with the recognition that there is no existing pan-African network that can provide such a catalyst across the sub-Saharan region, and taking a perspective across aid, trade, development and financial flows. African Monitor therefore developed as a catalyst to bring together targeted grassroots monitoring of development performance in key sectors (among which health is a leading priority) against a broader background view of development from the African perspective, with an advocacy strategy geared towards ensuring the urgent and effective delivery of development commitments, led by high level independent African figures – the Togona. African Monitor aims to be a ‘constructive friend’ to all stakeholders, and in particular to help those who have made promises, to be able to deliver them well." (Posted 26/06/06)

  • "At Local Livelihoods we have developed a new software system for designing, managing, monitoring and evaluating projects based on PCM and Logframe.  Here is the link to the site http://www.locallivelihoods.com/PFDownload.htm from which you can download a demo copy of the software press Project Facilitator V1, the password is admin.  In the Help menu you will find a 100 page PCM Toolkit and a software User Guide.   We are just finalising the completion of the same software that will be online and interactive, i.e. any number of people in any locations in the world can work on the same logframe etc at the same time. If you are interested I can demonstrate it now.  regards,  Freer Spreckley, Local Livelihoods,   
    info@locallivelihoods.com
    (16/06/06)
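    For readers unfamiliar with the structure that logframe-based tools like this manage, the short sketch below shows one minimal way a logical framework could be represented in code. It is purely illustrative: the class and field names are hypothetical and are not taken from Project Facilitator or any particular PCM toolkit.

        # Illustrative sketch only - a hypothetical structure, not the Project Facilitator data model
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class LogframeRow:
            level: str            # e.g. "Goal", "Purpose", "Output", "Activity"
            narrative: str        # the intervention-logic statement for this level
            indicators: List[str] = field(default_factory=list)             # objectively verifiable indicators
            means_of_verification: List[str] = field(default_factory=list)  # data sources for the indicators
            assumptions: List[str] = field(default_factory=list)            # risks/assumptions at this level

        @dataclass
        class Logframe:
            project_title: str
            rows: List[LogframeRow] = field(default_factory=list)

        # Example: a single Purpose-level row for a hypothetical project
        lf = Logframe(project_title="Rural water supply project")
        lf.rows.append(LogframeRow(
            level="Purpose",
            narrative="Improved access to safe drinking water in the target district",
            indicators=["% of households within 500m of a functioning water point"],
            means_of_verification=["Annual household survey"],
            assumptions=["District authority maintains water points"],
        ))

    An online, multi-user tool of the kind described above would add shared storage and concurrent editing on top of a structure like this, so that several people can work on the same logframe at once.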

     
  • "Innovation Network, with the assistance of JEHT Foundation and The Atlantic Philanthropies, is in the midst of assembling resources for an Advocacy Evaluation project: http://www.innonet.org/advocacy (in progress).  The materials we are collecting will be posted as a resource for funders, evaluators, and practitioners.  The materials will be available free of charge." (14/06/06)

  • M&E Framework Good Practice Guide (1 Mb pdf) This guide assists AusAID staff, contractors and partners to prepare good quality monitoring and evaluation (M&E) frameworks—which are necessary to ensure accountability (‘to prove’) and to promote learning (‘to improve’). Supporting this guide are a Glossary of Key Terms (Appendix A); AusAID’s M&E Quality Frame (Appendix B); and illustrative formats that may add value for M&E frameworks (Appendix C). For AusAID staff, this guide can be used to ensure that M&E-related documents and processes are comprehensive (e.g. Terms of Reference, Activity Design Documents, appraisals etc.). For contractors and partners, this guide serves to define what AusAID means by ‘M&E framework’, and in particular, what constitutes good practice. (Exposure Draft, March 17, 2006)

  • Mapping of approaches towards M&E of Capacity and Capacity Development Draft ECDPM June  2006 " The document currently covers eighteen approaches in five different groups:
    A. Approaches that focus on M&E from a system perspective;
    B. Approaches which focus on changes in behaviour;
    C. Performance-based approaches;
    D. Approaches which focus on strategic planning;
    E. Rights-based / Empowerment-based approaches.
    The document is available on the Pelican website: http://www.dgroups.org/groups/pelican/docs/Mapping_M&E_capacity_080606.pdf  Please let me know if you have further suggestions and ideas to improve this draft document, or if you would like to suggest other approaches for inclusion. This version of the mapping has been disseminated during the final workshop of the study on Capacity, Performance and Change (15-17
    May). It will be further improved and used for the forum of the Learning Network on Capacity Development (LenCD) (3-5 Oct). Best wishes, Niels Keijzer (posted 09/06/06)

Search the Archives (630+ items) and www.mande.co.uk/docs/ directory (190+ items) Google Custom Search

3. NEW DOCUMENTS Return to Menu
Please note : The providers of some of the reports referred to below may make a charge for copies of those reports.
January 2008
  • 2008 Reader on Measuring and Reporting Results, by Jim Tanburn (English/French/Spanish) Jan 2008  "This Reader comes at a particular point in the history of development, and of the development of value chains and service markets in particular. The tax-paying public in donor countries are wondering what their donor agencies are achieving, and some people are proposing that the answer is “not much”. They can do this, because there is little that is published about results, which is both convincing and comparable. There are, of course, real challenges in measuring and comparing results. Those who have worked in the field for some time will already be familiar with them, and will be looking for fresh perspectives – rather than the usual agreement that “we ought to do more”. This Reader aims to do exactly that, arguing that debates about rigour in methodologies have distracted from the more important institutional and human barriers to measuring results. These barriers need to be addressed in creative ways, as results can be estimated in ways that are affordable and not technically demanding. This Reader will have succeeded, if it leads to greater measurement and reporting of results; comments on the text are particularly welcome. Ideally, there would be a crisp definition of the intended readership; in practice, however, the communities of practice are now fluid and overlapping. Certainly, it will be of interest to those developing value chains and service markets; it is also likely to be of interest to those engaged in broader reform of the business environment, and indeed in private sector development (PSD) more generally" (28/01/08)

    See also
    Impact Assessment: Syntheses and Guides on the same website, a list of 20 online documents

November 2007
  • Governance and Social Development Resource Centre: Monitoring and Evaluation Topic Guide. About the GSDRC: The Governance and Social Development Resource Centre (GSDRC) was established by the UK Department for International Development (DFID) in 2005 to provide access to high quality, timely information to support international development project and programme planning, policymaking, and other activities in the field. How to use this guide: This topic guide provides an overview of current knowledge in monitoring and evaluation of development activities. It includes short summaries of key texts. Each short summary links to an extended version in the appendix, along with information on how to access the original text. Both the short and extended summaries are cross-referenced in the guide. (16/11/07)

  • Making a difference: M&E of policy research. Author:  Ingie Hovland, Date:July 2007,  ODI Working Paper 281 "This paper aims to advance understanding on how to monitor and evaluate policy research, i.e. research that is undertaken in order to inform and influence public policy. Policy is defined very broadly to encompass both policy decisions and processes, including implementation. Conventional academic research is usually evaluated using two approaches: academic peer review, and number of citations in peer-reviewed publications. For policy research programmes, these evaluation tools have proven too limited. They are not well suited to capture some of the broader aims of policy research, such as policy impact, changes in behaviour, or building of relationships. In short, policy research programmes need new monitoring and evaluation (M&E) approaches in order to know whether they are making a difference, not only in the academic world but also in the world outside academia. The paper is written with research programmes and institutions in mind, rather than individual researchers. It presents examples and approaches on how to do M&E of policy research from the current experience of a range of research institutes, think tanks and funding bodies. The approaches have been divided into the following five key performance areas: (i) Strategy and direction; (ii) Management; (iii) Outputs; (iv) Uptake; and (v) Outcomes and impacts. Research programmes or institutes may wish to focus on only one of these areas, or may combine approaches across the areas to form a more comprehensive M&E plan." (12/11/07)

  • Not new, but newly available as a free downloadable document: Foundational Models for 21st Century Program Evaluation, by Daniel L. Stufflebeam, The Evaluation Center, Western Michigan University, Occasional Papers Series, December 1, 1999. "In moving to a new millennium, it is an opportune time for evaluators to critically appraise their program evaluation approaches and decide which ones are most worthy of continued application and further development. It is equally important to decide which approaches are best abandoned. In this spirit, this paper identifies and assesses 22 approaches often employed to evaluate programs. These approaches, in varying degrees, are unique and comprise most program evaluation efforts. Two of the approaches, reflecting the political realities of evaluation, are often used illegitimately to falsely characterize a program’s value and are labeled pseudoevaluations. The remaining 20 approaches are typically used legitimately to judge programs and are divided into questions/methods-oriented approaches, improvement/accountability approaches, and social agenda/advocacy approaches. The best program evaluation approaches appear to be Outcomes Monitoring / Value-Added Assessment, Case Study, Decision/Accountability, Consumer-Oriented, Client-Centered, Constructivist, and Utilization-Based, with the new Democratic Deliberative approach showing promise. The worst bets seem to be Politically Controlled, Public Relations, Accountability (especially payment by results), Clarification Hearings, and Program Theory-Based. The rest fall somewhere in the middle. All legitimate approaches ..." (01/11/07)

October 2007
  • Research Paper No. 2007/52: Monitoring and Evaluation Reform under Changing Aid Modalities: Seeking the Middle Ground in Aid-Dependent Low-Income Countries, by Nathalie Holvoet and Robrecht Renard, September 2007. Abstract: This paper grew out of our bewilderment with the insouciance with which some in the donor community seem ready to abandon accounting for the use of aid. If one listens to the rhetoric surrounding the new approach to aid, one gets the impression that most of the crucial accounting tasks must be swiftly abandoned by donors and left to recipient governments. This paper does not question the underlying rationale for shifting towards recipient-led priority setting and control over implementation of aid resources, but argues that donors cannot let themselves off the hook so easily with respect to the accountability part of the equation. We argue that in most low-income countries such trust in recipient systems may be dubbed as over-alignment, and that it is neither necessary nor useful. Our argument is however not that old style donor-managed monitoring and evaluation is the only or the best solution. For we are equally puzzled by the stubbornness with which some other donors stick to their old monitoring and evaluation (M&E) in ways that contradict the new insights in aid effectiveness and hamper the emergence of national M&E systems. Why are positions so polarized and why is hardly anyone arguing in favour of intermediate positions? This is what this paper sets out to do: we argue against a radical and rapid implementation of the new rhetoric in low-income countries, but also against a continuation of present accountability practices. Donors have a large and lasting responsibility in accounting for the use of aid funds, both towards the taxpayers in donor countries and towards the targeted beneficiaries in the at best pseudo-democratic and poorly governed low-income recipient countries. They should find new ways to remain firmly involved in M&E, ways that allow, at the same time, embryonic national M&E systems in low-income recipient countries to grow and flourish. (posted 26/10/07)


  • (from Pelican email list) Public Expenditure and Service Delivery Monitoring in Tanzania: Some international best practices and a discussion of present and planned Tanzanian initiatives, by Geir Sundet, Working Paper 04:7. This paper provides a review of the Tanzanian experience of public expenditure tracking and civil society initiatives to enable feedback from the users of public services. It also discusses a number of related experiences in other countries, including:
    *  Public Expenditure Tracking Studies (Uganda)
    *  Social Audits/public hearings (Philippines and India)
    *  Citizens Report Cards (India)
     The last section of the paper presents some of the key challenges for local feedback mechanisms in the Tanzanian context. Related to this, see also the Tanzania Governance Noticeboard, which collates and presents information that is useful for the strengthening of accountability, transparency and integrity in Tanzania, and the "Following the Money Sourcebook", which was prepared as a resource for CSOs for engaging local authorities in expenditure tracking. While adapted for the Tanzanian legal and institutional situation, it may also be of interest in other contexts. (posted 15/10/07)


  • ONTRAC: the September issue, on Rethinking M&E. In this issue: Our theme in this issue is ‘Rethinking M&E’. Brian Pratt analyses the policy shift towards managerial values in monitoring and evaluation; Anne Garbutt describes how nascent CSOs in Oman welcomed the logframe’s structure and clarity; numbers are contrasted with stories in Nomvula Dlamini’s M&E experience from South Africa; Alix Tiernan calls for a paradigm shift to build learning into the programme cycle of NGOs; and Katie Wright-Revolledo sets out the steps for linking quantitative data and qualitative methods in QUIP. Also, read about INTRAC’s latest book, ‘Rethinking M&E - Challenges and Prospects in the Changing Aid Environment’. (03/10/07)

September 2007
  •  ILAC Briefs Issue 13 - Horizontal evaluation: Stimulating social learning among peers, Graham Thiele, André Devaux, Claudio Velasco and Kurt Manrique. "Horizontal evaluation is a flexible evaluation method that combines self-assessment and external review by peers. We have developed and applied this method for use within an Andean regional network that develops new methodologies for research and development (R&D). The involvement of peers neutralizes the lopsided power relations that prevail in traditional external evaluations, creating a more favourable atmosphere for learning and improvement. The central element of a horizontal evaluation is a workshop that brings together a group of ‘local participants’ who are developing a new R&D methodology and a group of ‘visitors’ or ‘peers’ who are also interested in the methodology. The workshop combines presentations about the methodology with field visits, small group work and plenary discussions. It elicits and compares the perceptions of the two groups concerning the strengths and weaknesses of the methodology; it provides practical suggestions for improvement, which may often be put to use immediately; it promotes social learning among the different groups involved; and it stimulates further experimentation with and development of the methodology in other settings." (12/09/07)

August 2007
  •  Three items sent in by  Keith Mackay, World Bank:
    • Governments in a number of developing countries are devoting considerable  efforts to strengthen their monitoring and evaluation (M&E) systems and  capacities. They are doing this to improve their performance -- by  establishing evidence-based policy-making and budget decision-making,  evidence-based management, and evidence-based accountability. This  publication is a synthesis of IEG's considerable experience on this topic.
        http://www.worldbank.org/ieg/ecd/better_government.html
    • A video presentation (a seminar) on the United States government’s method for rating the performance of all programs -- the Program Assessment Rating Tool (PART). These ratings rely heavily on M&E findings, including the quality/reliability of the M&E evidence available:  http://www.worldbank.org/ieg/ecd/part.html
    • A video presentation of a half-day workshop on How to Increase the  Utilization of Evaluations, presented by Michael Bamberger:  http://www.worldbank.org/ieg/ecd/utilization_evaluation.html (02/08/07)

July 2007
  • MAKING A DIFFERENCE: M&E OF POLICY RESEARCH  Ingie Hovland, July 2007 ODI Working Paper 281  This paper aims to advance understanding on how to monitor and evaluate policy research, i.e. research that is undertaken in order to inform and influence public policy. Policy is defined very broadly to encompass both policy decisions and processes, including implementation. The paper is written with research programmes and institutions in mind, rather than individual researchers. It presents examples and approaches on how to do M&E of policy research from the current experience of a range of research institutes, think tanks and funding bodies. The approaches have been divided into the following five key performance areas: 
    Performance Area I – Evaluating strategy and direction: Logframes; Social Network Analysis; Impact Pathways; Modular Matrices
    Performance Area II – Evaluating management: ‘Fit for Purpose’ Reviews; ‘Lighter Touch’ Quality Audits; Horizontal Evaluation; Appreciative Inquiry
    Performance Area III – Evaluating outputs: Evaluating academic articles and research reports; Evaluating policy and briefing papers; Evaluating websites; Evaluating networks; After Action Reviews
    Performance Area IV – Evaluating uptake: Impact Logs; New Areas for Citation Analysis; User Surveys
    Performance Area V – Evaluating outcomes and impacts: Outcome Mapping; RAPID Outcome Assessment; Most Significant Change; Innovation Histories; Episode Studies (26/07/07)

June 2007
  • Ghana: District-based poverty profiling, mapping and pro-poor planning as a monitoring and evaluation tool, by Bruno B. Dery and Audrey Dorway. Acting in collaboration with the Ghanaian Ministry of Local Government, Rural Development and the Environment (MLGRDE), the National Development Planning Commission (NDPC) and the Social Investment Fund (SIF), the German Technical Cooperation Agency (GTZ) embarked on a nationwide project to compile poverty profiles and maps for all the country’s 110 districts. The aim was to help make the pro-poor targeting of development initiatives more effective. The participatory methodology for poverty profiling and mapping was first piloted in two districts and later implemented in 16 districts. In 2004, assistance was given to the remaining 94 District Assemblies in preparing poverty profiles, maps and pro-poor programmes. (20/06/07)

  • MONITORING AND EVALUATING INFORMATION AND COMMUNICATION FOR DEVELOPMENT (ICD) PROGRAMMES: GUIDELINES, by Mary Myers, with the support of Nicola Woods and Sina Odugbemi of the ICD team, DFID 2005. These guidelines were written for DFID staff in need of advice on monitoring and evaluating Information and Communication for Development (ICD) programmes. The guidelines introduce a range of approaches useful at various stages of a development programme. The guidelines are aimed at programmes involving: face-to-face communication or information activities such as counselling or extension visits; community-level communications such as theatre, role-playing, workshops, posters and other print materials; TV, radio, film and video; Internet and email communications programmes; and telecommunications-based projects.

May 2007
  • BEST PRACTICES FOR FUNDING AND EVALUATING THINK TANKS AND POLICY RESEARCH
    The William and Flora Hewlett Foundation ....commissioned a study ... that documents and analyzes the existing pre-grant assessment criteria, methods of grant monitoring and evaluation, and effective funding mechanisms for think tanks in developing and transitional countries. The purpose of the study is to examine the funding mechanisms and evaluation criteria currently employed by public and private donors in order to determine the best practices for each area. Specifically,...:
    1) selection criteria for choosing think tanks to support;
    2) evaluation criteria for determining whether funded groups are "successful" (i.e. metrics that would measure impact and help public and private donors determine whether funding should be renewed); and 
    3) funding mechanisms that would encourage the long-term financial sustainability of institutions that receive support.

March 2007
  • A diagnosis of Colombia's national M&E system, SINERGIA is available at  http://www.worldbank.org/ieg/ecd/sinergia.html  Colombia's national system for monitoring and evaluation, SINERGIA, is one of the strongest in Latin America. This rapid diagnosis identifies the strengths of the system and the main challenges still facing it. A number of options for further strengthening the system are identified with the objective of ensuring it becomes fully institutionalized. (Posted 23/03/07) 

  • Impact Measurement and Accountability in Emergencies: The Good Enough Guide "What difference are we making? How do we know? The Good Enough Guide helps busy field workers to address these questions. It offers a set of basic guidelines on how to be accountable to local people and measure programme impact in emergency situations. Its 'good enough' approach emphasises simple and practical solutions and encourages the user to choose tools that are safe, quick, and easy to implement. This pocket guide presents some tried and tested methods for putting impact measurement and accountability into practice throughout the life of a project. It is aimed at humanitarian practitioners, project officers and managers with some experience in the field, and draws on the work of field staff, NGOs, and inter-agency initiatives, including Sphere, ALNAP, HAP International, and People In Aid. The Good Enough Guide was developed by the Emergency Capacity Building Project (ECB). The ECB is a collaborative effort by CARE International, Catholic Relief Services, the International Rescue Committee, Mercy Corps, Oxfam GB, Save the Children, and World Vision International." (Posted 23/03/07)

  • Harvard Family Research Project is pleased to announce our newest issue of "The Evaluation Exchange" on evaluating advocacy and policy change. WHAT'S INSIDE Advocacy that influences or informs public policy has the potential to achieve large-scale results for individuals, families, and communities. Consequently, there is much interest in understanding how to make advocacy and policy change efforts more effective. While previously relegated as "too hard to measure," advocacy evaluation has become a burgeoning field. This 32-page issue of "The Evaluation Exchange" helps to build this new field by defining the developments that are shaping it and showing how enterprising evaluators, nonprofits, and funders are tackling the advocacy evaluation challenge. FIND THE ISSUE ONLINE Download a copy or read the HTML version at: http://www.gse.harvard.edu/hfrp/eval/issue34/  (Posted 23/03/07)


January 2007
  • Scoping study: Monitoring and Evaluation of Research Communications, by Catherine Butcher and Gil Yaron, August 2006, 28 pages. This scoping study on the monitoring and evaluation (M&E) of research communications was carried out over a period of 15 days to: • Provide a broad overview of the key issues in the monitoring and evaluation of research communications. • Draw out differences, if and where they exist, between M&E of research communications compared with M&E generally. • Identify characteristics of good practice in the M&E of research communications and highlight implications for those involved. In reviewing the literature the study addresses three questions: • What approaches have been used in monitoring and evaluation of research communications? • What methods and tools are currently being used to monitor and evaluate research communications? • Is there anything special about M&E of research communications? (posted 12/01/07)

  • The 2006 Global Accountability Report measures and compares the accountability of transnational organisations in the intergovernmental (IGOs), non-governmental (INGOs) and corporate sectors (TNCs) on the basis of four dimensions of accountability: transparency, participation, evaluation, and complaint and response. You may access this report through the One World Trust website at: http://www.oneworldtrust.org/?display=index_2006.  (posted 10/01/07)

  • PREVAL's electronic newsletter No. 7 is a Special Number on Evaluation of Capacity Building of Rural Groups and Organisations, including links and selected literature. "As is known, anti-rural-poverty strategies place emphasis on the promotion of poor people's capacities for action, dialogue on policy and strengthening of networks and partnerships. These approaches emphasize not only the acquisition of knowledge and information but also creativity, leadership and other individual and group conditions necessary for impact sustainability. ... For this, it is necessary to build on the most recent thinking on evaluation criteria for measuring capacity building of rural organisations, an issue that is discussed in this Special Number." See the following link: http://www.preval.org/boletin/index.php?boletin=84 (posted 10/01/07)

December 2006
  • Issue 33 December 2006 Making Accountability Count , IDS Policy Briefing "Accountability is now a buzzword in contemporary development debates. It is central to development policy, whether government accountability (as a central component of good governance), corporate accountability (promoted by a swathe of standards and codes), or civil society accountability (claimed by people and organisations from the bottom up). Yet with so many competing ideas, interpretations and practices, it is sometimes unclear how improved accountability is directly relevant to the lives of poor and marginalised people. In order to build accountable institutions that respond to claims by citizens, it is crucial to understand how accountability matters, for whom, and under what conditions it operates. This Policy Briefing looks at who benefits from improved accountability and focuses on how people claim accountability in practice" (posted 21/12/06)

  • Guide: MONITORING, EVALUATION AND LEARNING FOR FRAGILE STATES AND PEACEBUILDING PROGRAMS: Practical Tools for Improving Program Performance and Results. Rolf Sartorius, President | Christopher Carver, Program Manager, SOCIAL IMPACT, 2006. Developed with support from USAID’s Office of Transition Initiatives and USAID’s Capable Partners Program. Section headings: Part I: Introduction, Part II: Illustrative Indicators for FSP, Part III: ME&L Tools for FSP Programs, Additional ME&L Tools. 24 tools listed in total. 156 pages. "ME&L in fragile states and for peacebuilding is a new technical area in the field of program evaluation. But there has been a great deal of recent innovation to develop practical approaches and to capture learning from programs so that they have greater potential to improve peoples’ lives. This guide consolidates a number of ME&L approaches that have been newly developed or contextualized for fragile states and peacebuilding programs. A range of qualitative, quantitative and participatory approaches are included, as well as tools for strengthening ME&L systems at the project or organization level. The approaches have come from bilateral and multilateral donors, local and international NGOs, consultants, and university groups from around the globe who were consulted in putting this guide together. Experience suggests that when organizations begin to adapt and use these approaches more systematically they become more successful in improving peoples’ lives in fragile states and conflict-prone areas." Email a request for a copy of the Guide to info@socialimpact.com (posted 16/12/06)

  • Praxis Paper No. 12 Learning from Capacity Building Practice: Adapting the ‘Most Significant Change’ (MSC) Approach to Evaluate Capacity Building Provision by CABUNGO in Malawi By Rebecca Wrigley December 2006 "This paper provides a reflection on a pilot experience of using the ‘Most Significant Change’ (MSC) methodology to evaluate the capacity building services of CABUNGO, a local capacity building support provider in Malawi. MSC is a story-based, qualitative and participatory approach to monitoring and evaluation (M&E). INTRAC and CABUNGO worked collaboratively to adapt and implement the MSC approach to
    capture the complex and often intangible change resulting from capacity building, as well as to enhance CABUNGO’s learning and performance. Overall, it is felt that MSC did provide an effective approach to evaluating capacity.... Participants in the evaluation process felt that using a story-based approach was very useful in helping CABUNGO to understand the impact that it is having on the organisational capacity of its clients and how its services could be improved. The key advantages of using MSC were its ability to capture and consolidate the different perspectives of stakeholders, to aid understanding and conceptualisation of complex change, and to enhance organisational learning. The potential constraints of using MSC as an approach to evaluating capacity building lay in meeting the needs of externally driven evaluation processes and dealing with subjectivity and bias". (12/12/06)

October 2006
  • Systems Concepts in Evaluation: An Expert Anthology. 2006. Editors: Bob Williams and Iraj Imam. Summary: This volume was supported by a grant from the W.K. Kellogg Foundation, which is increasingly using systems thinking approaches in its work. It is being published as the first volume in a new series sponsored by the American Evaluation Association. The aim of the series is to make high quality work in evaluation available at a modest price. The Monograph Series was conceived of and will mainly consist of relatively brief single-author works, but will deviate from that model when the occasion arises. As it happened, an unusual opportunity made it possible to inaugurate the series with this very timely and well-staffed anthology. (Posted 17/10/06)

August 2006
  • NEW INTRAC PUBLICATION! Mapping the Terrain: Exploring Participatory Monitoring and Evaluation of Roma Programming in an Enlarged European Union by Zosa De Sas Kropiwnicki and Fran Deans, Occasional Paper 47 (44pp, £8.95, ISBN 1-905240-02-3). (Posted 23/08/06)

  • May 5-6, 2005, Cape Town, South Africa. RECONCILING ALIGNMENT AND PERFORMANCE IN BUDGET-SUPPORT PROGRAMMES: WHAT NEXT? David Booth, Karin Christiansen and Paolo de Renzio (Overseas Development Institute, London). The World Bank. Contact: d.booth@odi.org.uk; k.christiansen@odi.org.uk; p.derenzio@odi.org.uk. Abstract: This paper reviews the current state of the debate about policy alignment and performance assessment, focusing particularly on the troublesome relationship between budget-support Performance Assessment Frameworks and PRS annual progress reviews. It argues that further advances in alignment will be achieved only if more explicit attention is given to the recipient side of the relationship and the reasons why Annual Progress Reports do not have the qualities donors expect. Until this is done, there is a case for simplifying PAFs but not for making them more results-based. PAFs should build country ownership by including only the sorts of conditionalities that are likely to work, and combining these with a multi-level selectivity and complementary actions to address missing preconditions for aid effectiveness. (Posted 12/08/06)

June 2006
  • 'Perceptions and Practice: an Anthology of Impact Assessment Experiences' One reader (Peter Ballantyne) says "It takes a kind of storytelling approach to the whole troublesome issue of impact assessment... As the authors say, it's a "collection of experiences — recounted through interviews and transcribed into stories — in assessing the impact of information-related development projects. It does not provide sanitised or ‘averaged-out’ accounts, but puts them down as told, warts-and-all. It makes no attempt to push particular theories, promote certain practices or provide easy solutions. It simply lets the storytellers — 61 of them, from right across the development spectrum — tell their stories. It lets their opinions, approaches and experiences come through in the stories. It then casts an eye over some of the issues emerging from the stories, not by way of prescription but rather as guidance. And it leaves you, the reader, to interpret the stories, to decide what you think they reveal about the main issues in impact assessment, about the purpose of impact assessment, about when and how it should be done, who should do it and, not least, why it is the subject of so many interpretations and so much debate." Some very rich stories and anecdotes and insights, in a very approachable format. There aren't any specific KM named studies, but lots of relevant stuff". Contents:
    Study 1 Catalysing Access to Information and Communication Technologies in Africa (CATIA) programme
    Study 2 Reflect Information and Communication Technologies project, India
    Study 3 National Agricultural Marketing and Development Corporation (NAMDEVCO) information products, Trinidad and Tobago
    Study 4 Footsteps magazine, Kenya and Rwanda
    Study 5 Electronic Delivery of Agricultural Information (EDAI) telecentres project, Uganda
    Study 6 Wan Smolbag Theatre (WSB) radio project, Vanuatu
    Study 7 Aqua Outreach Programme (AOP) extension materials project, Thailand
    Study 8 Agricultural research organisations’ capacity development project, Latin America
    Study 9 Regional Human Rights Education Resource Team (RRRT), Pacific
    Study 10 Selective Dissemination of Information (SDI) service, Africa
    Study 11 Effectiveness Initiative (EI) project, Israel, Kenya, Peru and The Philippines  
    Copies can be ordered online or by post from SMI (Distribution Services) Ltd, PO Box 119, Stevenage, Hertfordshire SG1 4TP, UK, Tel: +44 1438 748111, Fax: +44 1438 748844, E-mail: CTA@earthprint.co.uk, Website: www.earthprint.com (Posted 27/06/06). For free copies, if you are in ACP countries, follow the instructions here

  • Civil Society Legitimacy and Accountability: Issues and Challenges, by Jagadananda (CIVICUS: World Alliance for Citizen Participation, and Centre for Youth and Social Development) and L. David Brown (Hauser Center for Nonprofit Organizations, Harvard University), DRAFT, March 2005. "The present paper attempts to unravel the evolving complexities of civil society legitimacy and accountability and to analyze existing systems and practices for responding to legitimacy and accountability challenges. It provides the base for a series of dialogues and consultations about these issues with CSOs and their stakeholders in the future. It suggests steps for developing systems to enhance the legitimacy and accountability of civil society organizations and multi-organization domains. The paper is organized into five sections beyond this introduction. The next section looks at key issues of civil society legitimacy and accountability. It suggests six reasons why the legitimacy and accountability of civil society organizations has come under challenge in the last few years. The third section provides concepts and a framework for understanding these issues and possible ways to address them. It describes our definitions of legitimacy and accountability, their interaction in the context of civil society missions and strategies, and the sources of standards of legitimacy and accountability for civil society. The fourth section suggests ways to build organizational accountability systems that can enhance the legitimacy and accountability of particular civil society actors. Such systems can be used to catalyze organizational learning and performance management as well as increased accountability and legitimacy. The fifth section focuses on building the legitimacy and accountability of multi-organization domains. Such domains include campaign alliances, sectors of similar organizations, and problem domains that may involve diverse actors. The sixth section describes some ongoing dilemmas that we expect will challenge and energize future debates. The last section provides a brief conclusion. Overall, this report aims to contribute to the world discourse and debate about civil society legitimacy and accountability. We also hope it will help to catalyze action at many levels to resolve some of the questions this debate poses" (Posted 27/06/06)

  • Organized observation systems in the aid chain, incl. monitoring. RDRS Bangladesh has been supporting federations of poor people for over 15 years. 260 federations, one in each of the rural Union Council areas in a poverty belt of the northwest, with a combined membership of 130,000 households, are engaged in a broad variety of institution building, social development and business activity. A report, “A Shelter for the Poor: The long-term viability of NGO-supported local associations” (2Mb), by Aldo Benini, analyzes the history and performance of these grassroots organizations, using uniquely detailed statistics as well as case study material. Since the federations fill out a compact region, outcome models can assess the relative influence of past performance, current structure as well as of the local environments. Much of the data is from the monitoring system that the supporting NGO has been operating since early on in the federation history. A special chapter, “Mutual Observation”, details the monitoring challenges, and tries to give equal weight to the perceptions that federation members form and communicate about RDRS. Aldo Benini thinks that the literature on two-way observation systems, particularly from network situations with numerous semi-autonomous grassroots organizations, is sparse or perhaps non-existent. He is grateful for feedback and suggestions. The report also describes the competing rationalities that impact on the policies and practice of Federation support and their consequences for monitoring the outcomes. (Posted 27/06/06)

     
  • Monitoring and Evaluation (M&E) for Development in Peace-Precarious Situations, by Catherine Elkins, Senior M&E Expert, RTI International, International Development Group, Research Triangle Park, NC 27709, USA. Prepared for 'The North-South Divide and International Studies', 47th Annual ISA Convention, March 22-25, 2006, Town & Country Resort and Convention Center, San Diego, CA 96815, USA. Panel: Monitoring & Evaluating Conflict Interventions: Theory vs Practice. Abstract: Monitoring and evaluation (M&E) supports evidence-based decision-making in program management through rigorous approaches to collecting and using quality data on program performance, results, and impact. The application of appropriate analytical tools in order to assess the efficiency and effectiveness of interventions in well-defined contexts over time contributes to our knowledge of the kinds of interventions that work best, and under which conditions. This paper focuses on the value of utilizing M&E information systems to improve both program impact and our understanding of how best to assist peaceful development in situations prone to violent conflict. Project M&E examples illustrate M&E strategies and tactics in peace-precarious situations, framing discussion of the utility of key M&E practices and approaches where stability and security are lacking. The final section suggests initial criteria for enhancing effective and cost-effective M&E that contributes more meaningfully to the success of development interventions in peace-precarious situations; the most critical of these is building flexible M&E systems that can respond appropriately to continue providing useful information under extreme uncertainty. (Posted 26/06/06)

  • The 4th issue of the online Journal of MultiDisciplinary Evaluation is now available at the Evaluation Center's website at http://evaluation.wmich.edu/jmde/  This includes reviews of the recent contents of 6 other evaluation journals, most of which are not available online. There are four other issues of the Journal of MultiDisciplinary Evaluation which are also available online, on the same website. The Journal of MultiDisciplinary Evaluation is a peer-reviewed journal published in association with the Interdisciplinary Doctoral Program in Evaluation, The Evaluation Center, Western Michigan University. (19/06/06)

  • Final Report of the Evaluation Gap Working Group: "When Will We Ever Learn? Improving Lives through Impact Evaluation" (May 2006) The Evaluation Gap Working Group was convened to understand the reasons for the lack of good impact evaluation and the possible ways to make significant progress toward solving the problem. After a year of research and deliberation and further consultation among a broad set of interested parties, the group reached a recommendation for what the international community must do to close the "evaluation gap": take individual action to reinforce existing efforts, and make a major advance through a collective endeavor. Read the working group's report or policy brief to learn more. (Posted 15/06/06)

  •  Monitoring and Evaluation Toolkit: HIV/AIDS, Tuberculosis and Malaria. Produced by World Bank, U.S. Department of Health and Human Services, U.S. Centers for Disease Control and Prevention, United Nations Children's Fund, Joint United Nations Programme on HIV/AIDS, U.S. Agency for International Development, U.S. Department of State, Global Fund to Fight AIDS, Tuberculosis and Malaria. June 2004. Summary: This information package aims to provide those working at the country level on monitoring and evaluation systems linked to expanded HIV/AIDS, TB, and malaria programs with access to key resources and standard guidelines. The target audience includes national disease program managers and project leaders, donor agencies, technical and implementing agencies, and NGOs. Available in English,  Spanish,  French:   and Russian (Posted 06/06/06)

  • Accountability for Empowerment: Dilemmas Facing Non-Governmental Organizations, by PATRICK KILBY, Australian National University, Australia. Summary: "The accountability of NGOs, particularly their ‘downward’ accountability to their beneficiaries, affects NGO effectiveness in the process of empowerment for the poor and marginalized in developing countries. While debate about the accountability of NGOs and various pressures they face is well traveled, much less consideration is given to the broad values of the NGO and how they may affect their approach to downward accountability. This paper looks at evidence from a number of case studies of NGO programs with poor women in India, on the role of accountability in empowerment outcomes, and the role NGO values play in these outcomes." Published in World Development, Volume 34, Issue 6, Pages 933-1152 (June 2006). (05/06/06)


4. BOOKS NOTED Return to Menu
October 2007  
  • The Importance of Culture in Evaluation: A Practical Guide For Evaluators (2007). Evaluations are significantly influenced by the cultures of participants, as well as that of the evaluator. This report provides insights to help evaluators better understand the influence of different cultures and assess their own work and how they work with others, with the goal of creating more useful evaluations for all stakeholders. Digital and hard copy available. (03/10/07)

  • INTRAC has published a new M&E book - Rethinking M&E - Challenges and Prospects in the Changing Global Aid Environment. The book both analyses practitioner issues and situates them within wider aid trends. It's now available to pre-order at a 20% discount at £14. 'Rethinking M&E' is based on INTRAC's Sixth Evaluation Conference and regional M&E workshops in Ghana, India, Sweden and Peru, and includes perspectives from NGOs and CSOs, donor ministries, activists, think-tanks and foundations. Emphasising Southern perspectives and covering a rich variety of experiences, it stresses the important role of M&E in challenging many of our assumptions about poverty alleviation. Contents page: 1. Where we came from and how we got to this point 2. Africa and the M&E of Power Relations 3. Asia 4. Central Asia, the Caucasus and Eastern Europe: From Statist to Activist 5. Latin America
    6. Europe: Between solidarity and fund management 7. M&E Tensions, Challenges and Future Opportunities 8. Towards a Conclusion or a New Start? (03/10/07)

June 2007
  • Negotiated Learning: Collaborative Monitoring for Forest Resource Management. Irene Guijt, Editor. Forthcoming February 2007. "A welcome shift from monitoring as mere data collection that fixates on standard indicators. The authors show that monitoring for adaptive management depends on conscious social learning and negotiated decisions on what processes of change are critical...this accessible analysis will inspire many practitioners throughout the world." --Michel Pimbert, International Institute for Environment and Development.  "Tackles one of the biggest challenges in learning: how do we use our experience effectively to improve what we do? Monitoring is neither easy nor well understood...This book does not shy away from the difficulties." --Fred Carden, International Development Research Centre  (20/06/07)

January 2007
  • The publication of "Systems Concepts in Evaluation: An Expert Anthology" by the American Evaluation Association addresses in a comprehensive way the theory and practice of using systems concepts in evaluation settings. The 14 chapters (four written by UK authors) cover a wide range of systems concepts and methods. Most chapters are case study based and describe the use of systems concepts in real-life evaluations. There is also a superb introduction by Gerald Midgley (editor of the four-volume opus "Systems Thinking"), which describes the key developments
    in systems concepts and methods over the past 50 years, and explores the implications for evaluation of each of those developments. This chapter in particular will, I think, help answer the questions "what is a systems approach, what is distinctive about it, and what is its relevance to evaluation?" It is currently available as a free download from the Kellogg Foundation's website, and will be published soon by EdgePress in softback and hardback editions for $US18 and $US36 respectively. Although the download is free, please consider purchasing the book - $US18 is a bargain for over 200 pages of great material from some of the best in the business. The quick way to download the publication is here:
    http://www.wkkf.org/default.aspx?tabid=101&CID=281&CatID=281&ItemID=5000253&NID=20&LanguageID=0
    However, if for whatever reason you have difficulty with this monster URL, then the simple route is this: go to http://www.wkkf.org/ and select "Knowledgebase", then "Publications and Resources", then "Toolkits". Info provided by Bob Williams to the EvalChat email list (Posted 29/01/07)
  • Rights, Resources and the Politics of Accountability. Edited by Peter Newell and Joanna Wheeler
    Claiming Citizenship: Rights, Participation and Accountability, Zed Books, Aug 2006, 256 pages. Description: In the context of much controversy, this book looks at a range of exciting and imaginative ways in which poor and marginalized groups from around the world claim their rights and demand accountability for the realisation of those rights. Groups mobilizing around the right to water, housing or for fair working conditions often find themselves aligned against powerful state and corporate interests. Experiences from the global North and South are combined to generate key insights into who mobilizes and how and when this makes a difference to the lives of the poor.
    Author Bio: Peter Newell is Senior Research Fellow at the Centre for the Study of Globalisation and Regionalisation, University of Warwick. Joanna Wheeler is the research manager for the Development Research Centre on Citizenship, Participation, and Accountability.

December 2006
  • Evaluation Roots: Tracing Theorists' Views and Influences, by M.C. Alkin. SAGE Publications, 15 April 2004. Paperback, 424 pages. Language: English. ISBN-10: 0761928944. Synopsis: "
    Evaluation Roots examines current evaluation theories and traces their evolution with the point of view that theories build upon theories and, therefore, evaluation theories are related to each other. Initially, all evaluation was derived from social science research methodology and accountability concerns. The way in which these evaluation 'roots' grew to form a tree helps to provide a better understanding of evaluation theory. Thus, the book uses an evaluation theory tree as its central metaphor. The authors posit that evaluation theories can be classified by the extent to which they focus on methods, uses, or valuing; these three approaches form the major limbs of the tree. In addition to the authors' overview, which analyzes the evaluation theory tree and connections among theories, the book contains essays by most of the leading evaluation theorists. In these pieces, the evaluators comment on their own development and give their views of their placement upon the tree." (posted 24/12/06)

  • Reinventing Accountability: Making Democracy Work for Human Development. Anne Marie Goetz and Rob Jenkins - 2006. ISBN 1 40390 624 6, 264 pages, £54. "Democracy's recent proliferation has given millions a political voice, while revealing the difficulties of holding elected governments accountable. This book analyzes the worldwide wave of experimentation with new means of holding powerful actors - public and private, national and transnational - accountable. It traces the multiple deprivations faced by poor people in developing countries back to failures in conventional accountability institutions. The authors argue that a 'new accountability agenda' is in the making and consider whether the reinvention of accountability can make democracy work for the poor." (Posted 21/12/06)


5. EDITORIAL



6. FAVOURITE QUOTES

Friend to Marx: "Life is difficult!" Marx to Friend: "Compared to what?"

Should Groucho Marx be nominated as the patron saint of evaluation?

Or, Albert Einstein?
"Not everything that counts can be counted, and not everything that can be counted counts."

Nominated by Heather Budge-Reid, France
"Freedom of information is a fundamental human right...and is the touchstone of all other freedoms to which the United Nations is consecrated",
UN General Assembly, Official Record, First Session, December 1946, Part 2, page 29.
"Knowledge has become the central 'factor of production' in an advanced , developed economy. Knowledge has actually become the primary industry that supplies the economy the essential and central resources of production ",
"The Age of Discontinuity", Peter Drucker, Heinemann, London, 1969, p168
"...freedom of the press is limited to those who own one.",
A.J. Liebling, an American journalist
"Where information has become property, the demand for access is a claim for the redistribution of wealth ",
"The Microinvaders", Ian Reinecke , Penguin, 1982, p229