Monitoring and Evaluation Policy


May 14th, 2023

1. Introduction

The overall aim of this policy is to establish common structures and standards across Skills House that govern the application of effective monitoring and evaluation (M&E) systems, with a view to maximizing the benefits of Skills House interventions.

More specifically, this policy aims to: 

  • Demonstrate Skills House's commitment to monitoring and evaluating its work and to using the results to drive performance and impact 
  • Set out minimum requirements, principles to be respected, as well as roles and responsibilities 
  • Provide an overview of and basic introduction to M&E at Skills House, with additional tools referenced to provide further guidance and information.

2. Definitions of Monitoring and Evaluation

2.1. Monitoring

Monitoring is the continuous collection and analysis of information used by management and partners to determine progress on the implementation of activities, achievement of objectives and use of resources. Monitoring can happen at several levels, including local, regional and global, as well as at project and programme level.

2.2. Evaluation

Evaluations are formal Skills House activities that provide evidence of the achievement of results and institutional performance. Evaluation is a periodic and systematic assessment, as impartial as possible, of the relevance, effectiveness, efficiency, impact and sustainability of an activity in the context of stated objectives. Evaluations can focus on different Skills House activities, including programmes, projects, policies and organizational units. Evaluations should provide credible, reliable and useful information, enabling timely incorporation of findings, recommendations and lessons learned into relevant decision-making processes.

3. The Difference Between Monitoring and Evaluation

The term ‘monitoring’ is often used in conjunction with the term ‘evaluation’. In fact, information collected through monitoring is an important source of data used in evaluation. While monitoring tells us what is happening, evaluation provides more detailed information, such as why and how things are happening. In other words, monitoring tells us whether an activity is on track to achieve its intended objectives, while evaluation tells us whether the activity as a whole is the right one to be pursuing.

4. The Purpose of Monitoring and Evaluation

  • Learning and Improvement: M&E activities help us to understand why, and the extent to which, intended and unintended results are achieved, and what their impact on stakeholders has been. M&E is therefore an important agent of change through the provision of useful feedback and a commitment to act on that feedback, thereby driving organizational learning.
  • Accountability: M&E plays a crucial role in accountability. Skills House is answerable to its Members, partners, donors and users on whether its policies, programmes and projects are having the intended results. Skills House also needs to demonstrate that resources are used efficiently and effectively. The M&E process, together with the required documentation that accompanies it, holds Skills House staff and contracted implementing partners responsible for their performance.
  • Evidence-Based Management: The results of M&E activities are an important input to the decision-making process within Skills House and affect a range of management processes, including risk and performance management and decisions to change, expand or contract programmes.

5. Criteria and Guiding Principles

5.1. Monitoring

  • (S)pecific – The information captured measures what it is supposed to measure. In other words, the data collected clearly and directly relates to the achievement of an objective and not to any other objective. If the information collected is specific, it can tell us whether the change we seek to create is happening or not.
  • (M)easurable – Before starting monitoring, staff must make sure that the information required can be practically collected using measurable indicators.
  • (A)ttributable – Any changes measured must be attributable to the intervention.
  • (R)elevant – Monitoring results must contribute to selected priorities, i.e. they must fit with the Skills House Global Programme, and where possible Skills House global results indicators must be included in monitoring.
  • (T)ime-bound – Monitoring is not open-ended but allows change to be tracked at the desired frequency for a set period.

5.2. Evaluation

  • Relevance – To what extent is the policy, programme, project or organizational unit contributing to the strategic direction of Skills House and/or its Members and partners? Is it appropriate in the context of its environment?
  • Effectiveness – To what extent is the policy, programme, project, or organizational unit meeting its objectives and performing well?
  • Efficiency – To what extent is the policy, programme, project or organizational unit using its resources cost-effectively? Does the quality and quantity of results achieved justify the resources invested? Are there more cost-effective methods of achieving the same result?
  • Impact – What are the positive, negative, primary, secondary and long-term effects of an intervention, whether direct or indirect, intended or unintended? In other words, what difference has the activity made?
  • Sustainability – To what extent are the benefits of the policy, programme, project or organizational unit likely to continue after the intervention ends?

6. Monitoring and Evaluation Principles

  • Results-Oriented Accountability: M&E must focus on the extent to which the work of Skills House contributes to its policy, programme and overall objectives. A results-oriented accountability regime recognizes that there are a number of ways to obtain results, and it gives managers the flexibility to use their insights and creativity to obtain the results desired. Similarly, a results-oriented system supports a management and governance system that provides guidance to managers and requires information from them about performance and learning. Controls for accountability for inputs are left primarily to internal audit.
  • Improving Planning and Delivery: M&E activities must provide useful findings and recommendations. Those whose work is under review should see M&E as an asset aimed at improving results and thereby strengthening the organization.
  • Quality Control: M&E involves the systematic integration of a wide assortment of knowledge and information related to a set of questions posed. By gathering and analyzing this information and making judgements based on it, Skills House staff and their stakeholders make important decisions related to the quality of their work at the policy, programme, project and organizational level.
  • Supporting an Evaluation Culture: M&E is most effective when it forms part of an organization’s culture – a way of thinking and a way of acting. Concretely, M&E is seen as an important part of the responsibilities of all Skills House staff. As such, Skills House’s incentive systems need to support learning about and appropriately using M&E. All staff should see the M&E process as a tool that can help them improve their work and their results.
  • Working in Partnership: M&E often involves multiple stakeholders. Those affected by the outcome of M&E work have a right to be involved in the process, and stakeholders should be actively involved in all aspects of the evaluation process. Such involvement will make evaluations better understood, promote contributions and acceptance, and increase the likelihood that the results will be used.
  • Transparency: The transparency of the M&E process is an important aspect of ensuring that M&E information is used extensively by managers, the Director General and Council. Clear communication with stakeholders concerning the purpose of the monitoring and/or evaluation work, the key questions, the intended uses of the results, and the standards for design, data collection and analysis will maximize the transparency of the M&E process.
  • Access: Skills House makes M&E results publicly accessible. All final reports, as well as management responses where available, are uploaded on the Skills House website. Findings and lessons learned will be disseminated as appropriate and in accordance with Skills House’s aspiration to be seen as a leader in M&E and in the spirit of collaboration. Finally, the Director General will present a report summarizing the M&E results of the term at each World Conservation Congress.
  • Ethics: M&E shall be conducted with due regard for the welfare, beliefs and customs of those involved or affected, and shall avoid conflicts of interest. Ethical M&E requires that management and/or commissioners of M&E work remain open to the findings and do not allow vested interests to interfere. It also involves ensuring that Skills House carefully considers whether a monitoring and/or evaluation process is the appropriate tool to address the questions and issues raised about any policy, programme, project or organizational unit, or whether some other process, such as an audit or performance appraisal, is more suitable.
  • Credibility: Skills House is committed to ensuring that M&E is carried out to a high standard, in line with accepted practice in the professional field, and based on reliable data and observations. The use of these standards by Skills House managers is reviewed on a regular basis, and progress towards improving the quality of Skills House’s evaluations is reported annually. Improving the quality of evaluations in Skills House is a critical aspect of the credibility of its evaluation work.

If you have any questions related to this Policy, please contact us at [email protected]