
Enhancement of the Evaluation Function in UNHCR
EC/49/SC/CRP.8

14 January 1999

1. In August 1998, UNHCR contracted a private firm and a consultant to conduct a review entitled Enhancement of Evaluation Capacity in UNHCR. The purpose of the consultancy was to assess past and present UNHCR evaluation strategies, as administered and implemented both at central (Headquarters) and decentralized (field) levels. A small UNHCR Advisory Committee met twice during the process to advise the consultant and to react to drafts of his final report. The two-month consultancy concluded in early November 1998, resulting in a final report, highlights of which are contained in the attached Executive Summary.

2. The report notes that present systems and procedures in UNHCR are not well geared to encourage learning from experience. Underlying the main findings were three fundamental "lessons learned", focusing on weaknesses and inadequacies in the evaluation process in UNHCR:

  • The establishment of an effective evaluation function requires adequate staff and financial resources;
  • The establishment of an effective dissemination and feedback system is as important as the relevance and quality of the evaluations themselves;
  • The enhancement of the evaluation function cannot be achieved through piecemeal improvements of selected aspects. It requires a comprehensive approach that recognizes the interdependence of planning, monitoring and evaluation systems.

3. The report produced conclusions and recommendations on evaluation strategy, methodology, structure, staffing, training, and financial resource requirements, emphasizing that "without top management commitment, enhancement of the evaluation function will remain difficult to achieve". The following were among the key recommendations made:

  • Recommendation 1: amalgamation of the Central Evaluation Unit with the Centre for Documentation and Research, in order to enhance learning and policy formulation in UNHCR;
  • Recommendation 2: the creation of four additional professional posts and two support posts in the Central Evaluation Unit;
  • Recommendation 3: appointment of Evaluation Coordination Officers, on a pilot basis, to the Director's office in three Regional Bureaux: Central East and West Africa (CEWA); Central Asia, South West Asia, North Africa and the Middle East (CASWANAME); and the Bureau for Europe (BE);
  • Recommendations 4 and 5: provision of adequate financial resources to support the above recommendations;
  • Recommendations 7, 8 and 9: development of a long-term evaluation strategy to guide the overall evaluation function of UNHCR, diversification of evaluation expertise, and preparation of an Evaluation Manual. Other recommendations focused on the need to add a field evaluation component to the current inspection function, an improved computerized retrieval system, and a dissemination and feedback strategy. Finally, it was recommended that the report be shared with member countries.

4. The final report appeared just as an internal review of UNHCR's Headquarters structure was about to start, and its content was therefore a welcome contribution to the restructuring exercise. At the same time, follow-up on most of the consultant's recommendations depended heavily on the new structure, in particular on the organizational placement of the central evaluation function.

5. The first recommendation, to link the central evaluation function closely with the policy analysis function, has been endorsed by the High Commissioner. In the new organizational structure, the Department of Operations is directed by the Assistant High Commissioner, who, in addition to the Regional Operations and the Division of Operational Support, also directly supervises a new unit called "Evaluation and Policy Analysis". Implementation of this recommendation goes beyond the mere placement of the unit. The independence of the evaluation function is preserved through the unit's position high in the organizational hierarchy and its direct supervision by the Assistant High Commissioner, while policy analysis and formulation will be enhanced through the closer ties with the practical realities of the field that evaluation provides. The policy analysis function will continue to define policy guidelines in collaboration with evaluation specialists and help to ensure that policy guidelines and related recommendations are implemented. In other words, the new unit will strongly emphasize the complementarity of evaluation, policy analysis and feedback services.

6. Recommendations two to five deal with the provision of adequate resources, both human and financial. While these are recognized as crucial to significantly enhancing UNHCR's evaluation capacity, they must be viewed in the context of an organization that has entered a period of significantly diminishing resources. The creation of additional evaluation posts and the provision of adequate financial resources to support the function will be most difficult in view of the recent drastic post and budget reductions in the field and at Headquarters. The organization will have to seek additional resources with the concurrence and assistance of the member countries.

7. The other recommendations dealing with evaluation are not constrained by the lack of resources and are fully supported by UNHCR. They will become an intrinsic part of the work programme of the new Evaluation and Policy Analysis Unit. The recommendation on inspection is currently being tested during field inspection missions through the introduction of an inspection matrix.

Executive Summary of the report on Enhancement of Evaluation Capacity in UNHCR (November 1998)

Introduction

1. This review of the evaluation function in UNHCR is composed of two parts. The first examines the characteristics of the current condition; the second suggests a number of key steps to enhance it.

2. Present systems and procedures in the Agency are not well geared to encourage learning from experience. A number of dimensions are at play here, ranging from various insufficiencies observed in the project planning and monitoring process, to a lack of resources devoted to evaluation, to an embryonic dissemination and feedback system that fails to nourish learning effectively.

3. Current project and programme planning neither facilitates nor anticipates evaluation. The OMS change management team has identified a number of shortcomings in planning that directly affect the ease with which subsequent evaluation can be conducted. Of particular importance is the need to introduce a Logical Framework Analysis approach to planning, to emphasise results rather than outputs, to introduce participatory planning with partners, and to establish a solutions strategy to planning that would do away with the artificiality of geographic borders and annual planning horizons. At the same time, current project and programme planning documents do not include a budget for subsequent evaluation, fail to identify an evaluation plan for the project or programme in question, and do not call for a clear indication of lessons learned from previous experience. The OMS change management team recognises these and other problems and is attempting to address them.

4. The current practice of monitoring similarly inhibits the evaluation function. Periodic reporting is primarily designed to accompany and support the submission of financial statistics. Former guidelines that provided a useful framework for self evaluation reporting in the 1980s were superseded in 1990 by a new set of instructions for the completion of Project Monitoring Reports (PMRs) in a revised Chapter 4 of the UNHCR Manual. Not only were these new guidelines inferior to the previous ones from the point of view of evaluative feedback, but training to complete the PMRs was never fully provided. As a result, current monitoring reports are not designed to serve as a basis for subsequent evaluation. They are, by and large, not analytical in nature and fail to assess progress against objectives. The OMS change management team is currently working towards an enhancement of the monitoring system in the Agency.

5. While evaluation would benefit greatly from improvements in planning and monitoring systems, it would not be wise to postpone enhancement of the evaluation function until a transformation of those systems has been achieved. The current condition of evaluation is such that corrective action is needed now.

6. This study has looked at the Agency's three main evaluative instruments: central evaluation, self evaluation by the Bureaux, and the inspection function.

Central Evaluation

7. The central evaluation function in UNHCR started in 1973, when the first evaluation post was established. The Central Evaluation Unit became part of the Inspection and Evaluation Service (IES) in 1995. Over the years, numerous evaluation studies have been produced. These studies were generally considered useful documents, but mostly so when the topics coincided with the interests of the Senior Management Committee and the quality of the study was high. For some time now, concerns have been raised, both internally and by member countries, about the impact of central evaluation on Agency-wide learning and the degree of commitment top management displays towards the overall function. Our review of the function tends to support this sense of concern.

8. At present there is only one evaluation officer position, that of the Co-ordinator. Two previous positions were absorbed by, respectively, the Inspection Unit and the newly established investigation function. A single position is not sufficient for an Agency with some 5,400 staff worldwide, more than 120 field offices and a total operating budget of some US $1 billion, working under most trying conditions in collaboration with countless partners. In contrast, WFP carries out central level evaluations with six officers, and UNDP has four. To compensate for this shortage, the Evaluation Co-ordinator uses UNHCR staff awaiting permanent assignments. Such staff, however, are not necessarily evaluation specialists, whatever their other qualities. In this report we suggest the creation of four additional central evaluation positions, three at the P4 level and one at the P3 level, making five in total including the position of the Evaluation Co-ordinator.

9. The total annual central evaluation budget is also very modest. The estimated total budget for central evaluation in 1997 (US $600,000) amounted to 0.05% of the total Agency budget (US $1.2 billion) that year. In contrast, WFP devoted about four times as much (0.19% of its total budget) to evaluation in 1997. In this report we suggest a substantially increased total central evaluation budget, roughly estimated at US $1.5 million annually, which would cover the above-mentioned staff, an administrative allocation of US $0.25 million, and US $0.5 million for outsourcing.
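As a rough arithmetic check of these figures (a sketch only; the staff-cost remainder in the second line is inferred by subtraction and is not stated explicitly in the report):

\[
\frac{\text{US \$0.6 million}}{\text{US \$1{,}200 million}} = 0.05\%,
\qquad
\frac{0.19\%}{0.05\%} \approx 3.8 \approx 4 \text{ times}
\]

\[
\text{US \$1.5M} - \text{US \$0.25M (administration)} - \text{US \$0.5M (outsourcing)} = \text{US \$0.75M (staff, inferred)}
\]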

10. This shortage of staff and financial resources tends to dictate the type of evaluation studies that can reasonably be undertaken, in terms of subject area, scope and design. It also makes it difficult to do justice to the more ambitious topics that seem ripe for evaluative study, such as UNHCR's impact on the environment, its relationships with partners, capacity building and re-integration, among others.

11. Aside from resource constraints, the central evaluation function could benefit greatly from certain improvements in methodology. It is recognised that the availability of sufficient resources might facilitate the introduction of a more systematic methodological approach. Improvements are required in the annual and long-term planning of the evaluation programme: a more long-term perspective is needed in establishing an evaluation programme in order to test Agency policies and strategies against realities in the field. The type of evaluation expertise required to conduct evaluations also needs to be revisited. A predominant reliance on former or current UNHCR staff to conduct evaluations should be tempered by the new insights that could be obtained from the private sector, as well as by the introduction of more joint evaluations with interested member countries and partners. Finally, the way in which individual evaluations are planned and conducted needs to be improved. The actual conduct of an evaluation should be subject to a more disciplined methodological approach, including a standardisation of reporting, that could enhance the quality of findings and the feedback of lessons learned.

12. Furthermore, significant improvements in the dissemination and feedback functions could be envisaged, improvements that could foster learning more effectively than has been the case in the past. Of particular importance would be a dissemination process that makes findings available to staff on demand, with an automated system playing a predominant role. A much more powerful feedback system could also be envisaged, whereby strong linkages are established between evaluation on the one hand, and top management, programme management, programme staff, Agency guidelines and training on the other. Suggestions are provided in this report concerning a more viable dissemination and feedback system.

Self Evaluation by Bureaux

13. Evaluations carried out by Bureaux of their own activities (the so-called self evaluations) are very infrequent. A number of causes can be found for this situation. In part, they reside in the earlier-mentioned shortcomings in the planning and monitoring systems, and in the fact that an apparently useful self evaluation process was replaced in the late 1980s by Project Monitoring Reports that are primarily descriptive and non-analytical in nature. More fundamental, however, is the absence of a meaningful self evaluation culture in the Bureaux. Improvements in this culture will no doubt result from improvements in planning and monitoring, but in the short term more will be needed.

14. We suggest in this report that an evaluation co-ordinator be appointed, on a pilot basis, to the office of the Director in each of the three largest Bureaux, occupying a staff function with clear terms of reference. The tasks would essentially involve the design and management of a self evaluation programme for the Bureau concerned, in a way that would encourage and support evaluation efforts by the field offices. This implies co-ordinating, in collaboration with the field offices, the formulation of annual self evaluation plans, the introduction of the right methodology, the management of the actual studies themselves, the implementation of a dissemination and feedback system, and the required follow-up activities. The methodological, dissemination and feedback principles identified for central evaluation will also be applicable to self evaluation. The introduction of operational adjustments as a result of findings would require the establishment of strong linkages between self evaluation and project/programme planning, new policies and implementation strategies, programme management, manuals/guidelines and training. Of key importance will be the link with the central evaluation function, in order to co-ordinate Agency-wide evaluation planning, integrate a common dissemination and feedback strategy, introduce consistent methodological approaches, and engage in joint evaluation efforts where required.

15. Since current project planning in the Bureaux does not call for a budgetary allocation to perform an evaluation of operations at the appropriate moment (the design and timing of which would be a joint decision between the Bureau's evaluation co-ordinator and the field office), it will be necessary to set aside an annual self evaluation budget. It is estimated that an allocation of US $200,000 per year for each of the three Bureaux selected for this pilot project would be sufficient. With the revised planning system envisaged for the year 2000, an evaluation budget could be included in the planning document against the specific evaluative activities envisaged.

Inspection

16. Inspection within the IES, very much an evaluative activity, commenced in 1995 and has become, in a few short years, a feedback mechanism frequently used by management. It provides, essentially, a management audit; close to half of all UNHCR field offices will have been inspected by the end of 1998. Its mandate, however, suggests that a focus on management audit does not fully satisfy the original intent of the service. The mandate includes a "review of UNHCR impact in given countries and regions", implying a more evaluative approach than a management audit, one that would measure achievements against objectives.

17. Furthermore, there is a need for a more formally expressed day-to-day relationship between the Inspection Unit and UNHCR's internal auditors, who report to the Office of Internal Oversight Services (OIOS) in New York. While the overall relationship is covered by a Memorandum of Understanding, the day-to-day relationship is not. Given the potential for overlap, the scope for complementarity inherent in the two operations, the need for consultation on annual programmes and strategies, and the need to keep each other regularly informed of results, it might be useful to subject this relationship to a more formal process than exists now.

Evaluation Structure

18. It will be necessary to create an enabling structure that allows evaluation to take place efficiently and effectively in the Agency. Of importance is the right organizational placement for the evaluation function. For central evaluation, this would mean the amalgamation of the Centre for Documentation and Research (CDR) with the central evaluation function. CDR provides a complementary policy research function and, with improvements to the current information technology system, could play a useful role in the dissemination and feedback process that forms an integral part of evaluation. For self evaluation, the right organizational placement would mean, as mentioned earlier, the appointment of an evaluation co-ordinator in the Bureau Director's office. At the same time, the creation of an enabling structure would require significantly more staff and financial resources. These and other suggestions have been captured in the recommendations that appear below. A small number of key lessons have been extracted and appear at the end of this executive summary.

Recommendations

19. Consideration should be given to the amalgamation of the Central Evaluation Unit with the Centre for Documentation and Research in order to enhance learning and policy formulation in the Agency.

20. Four additional evaluation positions need to be created in the Central Evaluation Unit, three at the P4 level and one at the P3 level (making it five in total), accompanied by the creation of at least two support positions, one secretarial, the other clerical.

21. An Evaluation Co-ordination Officer should be appointed on a pilot basis, in a staff capacity, to the Director's office in each of the following Bureaux: (i) Central East and West Africa; (ii) Europe (including Yugoslavia); and (iii) Central Asia, South West Asia, North Africa and the Middle East.

22. An annual Central Evaluation budget, to be used exclusively for outsourcing, should be established at the US $0.5 million level.

23. An annual allocation for outsourcing and administration of US $200,000 should be made available to each one of the three Bureaux selected for the pilot phase of the self evaluation function.

24. The Inspection Service should move beyond its current focus on management audit and introduce the concept of performance measurement, which reviews achievements against objectives; it should also subject the relationship between inspection and internal audit to a more formal process than exists now.

25. A long term evaluation strategy should be drafted that would guide the overall evaluation function of the Agency and form the basis for an annual evaluation plan incorporating both central evaluation as well as self evaluation.

26. Sources of evaluation expertise should become more diversified: (i) a carefully screened roster of private sector evaluation consultants should be established as a basis for increased use of private sector expertise; (ii) the scope for more joint evaluations with member governments and donor organisations should be actively explored; and (iii) participatory evaluations with partners should be undertaken.

27. An Evaluation Manual should be prepared that provides standards for project and programme evaluation in order to guide the overall evaluation function in the Agency. The Manual should also outline the key linkages that should be established and maintained between evaluation and other systems and procedures in the Agency.

28. A study should be carried out to design and cost a structured, automated evaluation retrieval system that allows staff user-friendly desktop access to evaluation findings and lessons learned.

29. A comprehensive dissemination and feedback strategy should be drafted for management approval, using principles and practices outlined in this report.

30. The findings of this report, including its recommendations, should be submitted to member countries with a view to obtaining their concurrence with the suggested approach, and ascertaining availability of additional resources that may be required.

Main Lessons Learned

31. Among the various lessons that could be derived from this study, three essential ones have been retained. They are the following:

32. The establishment of an effective evaluation function, capable of producing meaningful results, requires adequate staff and financial resources. Without such resources many compromises need to be made that seriously affect the scope and design of the studies, as well as the quality of feedback.

33. In order to create a true learning culture in the Agency, it is important to realise that the establishment of an effective dissemination and feedback system is as important as the relevance and quality of the evaluations themselves.

34. The enhancement of the evaluation function cannot be achieved through piecemeal improvements of selected aspects. It requires a comprehensive approach that recognises the interdependence of planning, monitoring and evaluation systems, as well as the intimate connection that exists among the various steps within the evaluation cycle itself.