End-of-Term Evaluation




Introduction


The purpose of this report is to detail the findings of the End-of-Term Evaluation exploring the relevance, efficiency, effectiveness, coordination, and implementation of the five-year Barbados NSP for the Prevention and Control of HIV 2008-2013. The objectives for this evaluation were developed by CARPHA in partnership with national stakeholders during a planning mission conducted in May 2014. They are as follows:

  1. Determine the relevance of the programmatic response to the NSP;

  2. Assess the extent to which the Strategic Objectives of the NSP were achieved;

  3. Determine whether the programme objectives of the national response were achieved; and,

  4. Assess the extent to which programme resources were utilised to achieve programme objectives.

It is intended that the findings of this evaluation will be used to inform the planning and implementation of activities under a new five-year National Strategic Plan for HIV 2014-2018. The new NSP was in the process of being finalised at the time this report was written.

The Barbados NSP 2008-2013 was developed using an RBM&E framework and has a clearly articulated Goal and six Strategic Objectives.

The Goal of the 2008-2013 NSP was the ‘Mitigation of the social and economic impact of HIV and AIDS on the population thereby reducing new cases (incidence) and ensuring the sustainable development of our nation’. This Goal is in keeping with Goal 3 of the National Strategic Development Plan of Barbados 2005-2025.

The 2008-2013 NSP included the following five Priority Programme Areas for Action:



  1. Prevention and Control of HIV transmission;

  2. Diagnosis, Treatment and Care of PLHIV;

  3. Support for PLHIV;

  4. Programme Management and Institutional Performance; and,

  5. Surveillance, Monitoring and Evaluation, and Research.

The NSP also highlights three cross-cutting themes:

  1. Gender power relations and dynamics and HIV/AIDS;

  2. Human resource management; and,

  3. Human rights policy and legislation.

The Priority Areas for Action are supported by the following six Strategic Objectives in the NSP:

  • Strategic Objective 1: To increase awareness and knowledge on the transmission and prevention of STIs/HIV.

  • Strategic Objective 2: To effect positive behaviour change to prevent and reduce the spread of HIV/STIs.

  • Strategic Objective 3: To strengthen treatment, care and support services for PLHIV, OVC, and vulnerable and high-risk groups.

  • Strategic Objective 4: To boost the educational and economic opportunities of PLHIV and of the most at risk.

  • Strategic Objective 5: To build capacity, strengthen institutional and management structures across private sector, civil society and government to deliver effective and sustainable programmes.

  • Strategic Objective 6: To strengthen institutional structures that will enable successful scale up and execution of monitoring and evaluation of programmes to allow for evidence-based decision-making.

In turn, each Strategic Objective is aligned with specific results and activities. In assessing achievements against the Strategic Objectives, the EET reviewed the current status of each Strategic Objective and its associated activities before summarising performance against the stated objective, based on the empirical evidence available and the testimony of informants.

Background


This evaluation was undertaken by CARPHA following a request from the NHAC and was conducted in partnership with key stakeholders involved in the Barbados HIV and AIDS national response, including the NHAC, the MOH, the Barbados Family Planning Association (BFPA), partner Government Ministries and Civil Society Organisation (CSO) representatives. During the NSP period, the NHAC was the lead coordinating body for the multi-sectoral response.

This evaluation was specifically designed to incorporate the participation of in-country stakeholders in the form of a ‘Local Evaluation Team’ or LET. Through participation in the design, implementation, review and dissemination processes of the evaluation, the LET members enhanced their appreciation for evaluation, strengthened their evaluation capacity and assisted in focusing the evaluation on areas of particular interest for the national response.

Additionally, an ‘External Evaluation Team’ or EET was deployed to bring an independent point of view and relevant expertise to the evaluation. The EET was made up of regional HIV experts and evaluation specialists, who worked to draw out conclusions and articulate lessons learned for the evaluation. In the process, EET members strengthened their skills in undertaking evaluations of national programmes, thus further expanding the pool of experienced evaluators throughout the region.

This was the sixth ‘learning by doing’ evaluation undertaken by CARPHA (and formerly the Caribbean Health Research Council) under the existing PANCAP/GFATM Agreement and within the context of a national response. These evaluations are intended to complement the theory-based training courses implemented by CARPHA, as well as to respond to a demand throughout the region for more exposure to evaluation.

The LET for this evaluation included:


  • Ms. Nicole Drakes - Assistant Director, NHAC;

  • Ms. Alexis Nurse - BCC Specialist, NHAC;

  • Dr. Anton Best - Senior Medical Officer Health (Communicable Diseases), MOH;

  • Dr. Dale Babb - Project Director, MOH;

  • Ms. Chisa Cumberbatch - Health Planner, MOH;

  • Ms. Shawna Crichlow - Data Analyst, MOH;

  • Ms. Madge Dalrymple - HIV Coordinator, Ministry of Transport;

  • Mr. Teddy Leon - Senior Programme Officer, CHAA;

  • Ms. Patsy Grannum - Director, Movement against Discrimination Coalition (MOVADAC);

  • Ms. Laura-Lee Foster - Research Assistant, National Council for Substance Abuse (NCSA); and,

  • Ms. Nia Salankey - Safety and Health Officer, Labour Department, Ministry of Labour.

The EET included regional experts and CARPHA staff.

The Regional Partners/Counterparts on the EET included:



  • Dr. Ayanna Sebro - Director HIV & AIDS Coordinating Unit, Ministry of Health, Trinidad & Tobago;

  • Mrs. Sannia Sutherland - Director of Prevention, National HIV Programme, Ministry of Health, Jamaica; and,

  • Ms. Audrey Brown - Focus Group Consultant and Behaviour Change Communication Expert.

The CARPHA staff on the EET were:

  • Mr. Erin Blake - M&E Specialist;

  • Ms. Patricia Smith-Cummings - M&E Officer; and,

  • Mr. Garth Watson - M&E Officer.

The initial work to develop the Evaluation Protocol and Matrix for this evaluation was undertaken by the CARPHA M&E Unit along with members of the Barbados LET in May 2014. The mission by the EET took place October 27-31, 2014. The final report was compiled from the notes and presentations of the EET by the CARPHA M&E Unit members in November and December 2014.

Methodology


This evaluation was undertaken using a ‘learning by doing’ methodology grounded in a Utilization-Focused approach (Patton, 2008) and an RBM&E framework (Kusek and Rist, 2004). This methodology was chosen in order to produce practical recommendations that can be incorporated into the implementation of the next NSP for HIV (2014-2018) in Barbados.

Utilization-Focused Evaluation


Utilization-Focused Evaluation has been described as one of the most promising approaches for evaluations in the 21st century (Stufflebeam, 2001). As the name implies, Utilization-Focused Evaluation emphasizes the utility of evaluation findings and recommendations for an evaluation’s intended audience.

‘Utilization-focused evaluation begins with the premise that evaluations should be judged by their utility and actual use; therefore, evaluators should facilitate the evaluation process and design any evaluation with careful consideration for how everything that is done, from beginning to end, will affect use.’ (Patton, 2008, p.37)



Thus, Utilization-Focused Evaluation concentrates on evaluation questions that are of interest to stakeholders and promotes the use of findings through stakeholder involvement throughout the evaluation process (Patton, 2008). Utilization-Focused Evaluation does not depend on or advocate any particular evaluation content, model, method, theory, or even use. Rather, it is a process of helping primary intended users select the most appropriate content, model, methods, theory and uses for their particular situation. The strength of the Utilization-Focused Evaluation approach therefore lies in the process by which stakeholders are engaged to determine the purpose of the evaluation, the procedures employed during the evaluation, and the subsequent findings of the evaluation.

Results-Based Monitoring and Evaluation


RBM&E can be broken into two parts: Results-Based Monitoring and Results-Based Evaluation (Morra Imas and Rist, 2009). Results-Based Monitoring is the continuous and routine process of collecting and analysing data to assess an intervention’s implementation against pre-determined indicators and targets. Progress is measured by comparing the current situation with the desired results identified for key performance indicators. Results-Based Evaluation takes a broader perspective on an intervention and seeks to determine its relevance, efficiency, effectiveness, impact, and sustainability. The intention of Results-Based Evaluation is to provide information that is credible and useful, and that facilitates the development of lessons learned which can be used to tailor an intervention’s design and implementation. Results-Based Evaluation also seeks to account for any confounding factors that could have influenced an intervention’s outcomes.
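To make the monitoring side of this concrete, the short sketch below shows one simple way progress against a key performance indicator can be expressed: as the share of the distance from baseline to target achieved to date. It is purely illustrative; the indicator names, baselines, current values, and targets are hypothetical and are not drawn from the Barbados NSP or its M&E Framework.

    # Illustrative sketch only: hypothetical indicators and figures, not NSP data.
    def progress(baseline, current, target):
        """Percentage of the distance from baseline to target achieved so far."""
        if target == baseline:
            return 100.0
        return round(100 * (current - baseline) / (target - baseline), 1)

    # Hypothetical key performance indicators: (baseline, current value, target)
    indicators = {
        "% of adults tested for HIV in the last 12 months": (20, 32, 50),
        "% of PLHIV on treatment who remain in care": (60, 74, 90),
    }

    for name, (baseline, current, target) in indicators.items():
        print(f"{name}: {progress(baseline, current, target)}% of target distance achieved")

In practice, a national M&E unit would report such comparisons routinely for each indicator in the M&E Framework, flagging those falling short of target for management attention.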

Appropriateness of Approach


These approaches were appropriate for this evaluation for the following reasons:

  1. The NSP being evaluated was developed using an RBM&E framework, and is linked to an M&E Framework that incorporates clearly stated objectives and indicators against which results and progress can be measured;

  2. The purpose of the NSP is to guide the national response to achieving tangible results for target populations;

  3. This evaluation was designed expressly to draw out lessons learned that can inform implementation of activities under a new NSP;

  4. RBM&E has growing currency in the Caribbean and is viewed favourably by health professionals and decision makers;

  5. The recommendations from this evaluation will need to be practical and actionable in the Barbados context; and,

  6. These approaches lend themselves well to CARPHA’s capacity building mandate and ‘learn by doing’ approach for evaluations.


Learning by Doing Approach


The ‘learning by doing’ aspect of this evaluation reflects CARPHA’s desire to enhance the capacity for evaluation throughout the region. The approach seeks to build the capacity of local personnel and regional counterparts to conduct evaluations of health interventions by involving them in ‘real life’ evaluations. It builds on training sessions and other capacity building efforts implemented by CARPHA as part of a broader strategy of M&E capacity building and health system strengthening in the region. Project personnel participate in the evaluation planning, execution, and dissemination through appointment to the LET, while regional peers participate as part of the EET. The involvement of regional peers lends greater credibility to the evaluation by providing an independent perspective on project activities, while their exposure to evaluation activities further develops their own capacity.

Data Collection Methods


The answers to the evaluation questions were developed using a mix of quantitative and qualitative methods, combining primary and secondary data collection. Primary data collection took the form of key informant interviews and focus group discussions with key populations. Secondary data collection consisted of document and data reviews of reports, indicator data, and research.

Key informant interviews involve interviewing persons who have insight into or knowledge of a specific issue. In this case, the key informants were people with in-depth knowledge of different aspects of the HIV and AIDS response in Barbados and of the associated key populations. This data collection method is time efficient and inexpensive, and places a relatively low burden on respondents. To cover all the interviews in a timely manner and to lessen the burden on the key informants, the EET split into smaller teams to conduct most of the interviews, sharing questions ahead of time to ensure all pertinent questions were asked of each key informant. In a few cases, when the key informant had a broad understanding of the national response, the entire EET was present for the interview. Interviews were conducted with one key informant at a time or in small groups of people holding similar positions, in rooms that afforded privacy. It should be kept in mind that a weakness of the key informant data collection process is its susceptibility to interviewer and question bias. Additionally, the views of key informants are not generalizable.

Focus group discussions are interviews of a small group of persons conducted by a skilled moderator. For the purposes of this evaluation, focus groups were conducted with groups representing the key populations identified, on the basis of available data, as especially vulnerable to HIV infection in Barbados. The aim of these focus groups was to gain an in-depth understanding of how key populations perceive the national response and the associated services. This data collection method is also relatively low cost and flexible, and is a useful tool for gathering a broad range of information on a specific topic. However, the findings from focus groups are not generalizable and cannot be said to represent the ‘private’ views of participants, especially on sensitive topics. Focus groups also rely heavily on the skills of the moderator and are vulnerable to moderator and question bias.

Document review is the process of systematically analysing documents to build a body of evidence. This method of data collection was employed in this evaluation because it is time efficient, inexpensive and can yield insight into the thinking behind a project at different junctures. This data collection method is considered free from interviewer bias, although writer bias may be present. The method is also limited in that documents are static and are not always well aligned to specific evaluation questions.

In order to offset the shortcomings of each of these data collection methods, the evaluation team triangulated the findings from the different data sources. The evaluation team was also chosen purposefully to ensure representation of professionals from differing backgrounds, equipped with varying perspectives on national responses.




