RFP FOR CONSULTANCY: Humanitarian Programme Plan (HPP) Education and Protection Project Irish Aid-Puntland

Company profile

World Vision

World Vision is an international Christian Relief and Development organization, whose goal is to achieve long-lasting benefits in the quality of life for vulnerable children and their families, displaced persons and communities. www.wvi.org 


Job Details
Employer: World Vision
Job Title: RFP FOR CONSULTANCY: Humanitarian Programme Plan (HPP) Education and Protection Project Irish Aid-Puntland
Job Type: Full-Time
Location: Puntland
Category: Other
Description:

1.0 Introduction

1.1 Evaluation Summary

Project name (s)

Humanitarian Programme Plan (HPP)-Somalia, Irish Aid

Project Goal

Building the Resilience of Women and Children through Enhanced Education and Protection

Project Outcomes

-Increased awareness and action to identify, assess, and mitigate child protection risks

-Girls and boys participate and learn in a quality educational and disaster responsive environment

-Parents and community have more positive attitudes towards the value of education and participate in education management

-Schools provide a girl friendly and gender supportive learning environment

-Increased awareness and action on GBV prevention, mitigation and response among women, men, girls and boys

-Improved access to quality GBV response services.

Project participants

The project targeted 13,025 beneficiaries, including 2,865 men, 4,298 women and 5,862 children aged 5-17 (2,162 boys and 3,700 girls), from 8 villages in Eyl and Godobjiraan Districts.

Project locations

Eyl and Godobjiraan Districts, Puntland Somalia

Project duration

January 1, 2017-December 31, 2018

Evaluation type

End of Project Evaluation

Evaluation purpose

The purpose of this evaluation is to document and inform the stakeholders (donors, partners and beneficiaries) of the project’s relevance, effectiveness, sustainability and potential impact in relation to the project outcomes, as well as the lessons learned. The end of project evaluation shall draw largely on the baseline evaluation so that progress and the contribution of the project can be measured.

Primary methodology

The evaluation study will adopt a mix of quantitative and qualitative techniques as summarised below:

Quantitative

§ Household surveys

§ School based literacy and numeracy assessments

Qualitative

§ Focus group discussion (FGD)

§ Key Informant Interview (KII)

§ Document Reviews

§ Case studies

Evaluation timelines

TBA

2. Description of Projects Being Evaluated

World Vision Somalia (WVS) has worked with the children of Somalia, their families and communities since 1992 through a variety of emergency and rehabilitative programming, addressing the emergency needs of communities while also addressing some of the underlying causes of vulnerability in those same communities. Over the last 25 years, the program has grown to cover more than 20 districts spread across the main operational regions of Puntland, Somaliland, South-Central and Jubaland, with a liaison office in Mogadishu. The strategic aim of the Somalia program is to develop and implement high quality projects that address emergency, rehabilitation and developmental needs in a demand-driven and responsive manner. The overall goal of all World Vision programs is to save lives and contribute towards the economic development of the Somali people, while contributing to child well-being outcomes.

The Humanitarian Programme Plan for Somalia (2017-2018) was designed to fit into WVS strategic objectives, with a focus on building the resilience of women and children through enhanced education and protection. WVS has been implementing the HPP project with funding from Irish Aid in Eyl and Godobjiraan districts of Puntland. The project started in January 2017 and will end on 31st December 2018. The project has been implemented in partnership between WVS, the Puntland Ministry of Education and Higher Education (MoEHE) and the Ministry of Women Development and Family Affairs (MoWDAFA). The project focuses on increasing awareness of child protection and gender based violence referral mechanisms, improving the school learning environment for boys and girls, and raising awareness among community members so that they hold more positive attitudes towards the value of education for children and participate in education management, among others.

In 2017, a baseline study was conducted, including quantitative household surveys, literacy and numeracy assessments for children, focus group discussions with project beneficiaries, and key informant interviews with the relevant local government authorities and community leaders, among others. It is expected that a similar methodological approach will be adopted at the endline evaluation in order to measure progress over the life of the project that can be attributed to the project.

This Terms of Reference (ToR) has been prepared to hire the services of a highly competent external Consultant to evaluate the project performance during the implementation period. The evaluation results will help the key stakeholders to measure the level of project success with reference to the DAC criteria. Particular attention will be paid to assessing the outcomes by gender and grade of children.

2.1 Project Goal

The general objective of the project is to improve access to, and the quality of, basic education for children in Eyl and Godobjiraan Districts, while improving awareness and support mechanisms for protection issues, particularly gender based violence. The following are the key project indicators that will be measured at the endline evaluation. (Please note that a final list of indicators to be measured, and their definitions, will be shared with the consultant(s) at a later stage of the selection process.)

Description

Key result indicators

Outcome 1: Increased awareness and action to identify, assess, and mitigate child protection risks

Proportion of children who know of the presence of services and mechanisms to receive and respond to reports of abuse, neglect, exploitation or violence against children (disaggregated by gender and age)

Number of child protection cases referred to available services (disaggregated by gender)

Percentage of parents/caregivers (male/female) who demonstrate improved knowledge, attitudes and practices on child protection

Output 1.1: Increased awareness raising of child protection in emergencies

Number of awareness raising sessions conducted by child protection committees

Output 1.2: Community Protection Committees established and capacity built

Number of child protection committees who have received training and developed a work plan.

Outcome 2: Girls and Boys participate and learn in a quality educational and disaster responsive environment

Percent (%) increase in the number of girls and boys enrolled in school (disaggregated by age and gender)

Percent (%) of children who are functionally literate (able to read and understand local material and answer 2-3 comprehension questions correctly) (disaggregated by age and gender)

Proportion of girls and boys able to express themselves with confidence and participate actively in discussion (disaggregated by age and gender).

Percent of dropouts (disaggregated by school year and gender)

Percent of targeted teachers with improved skills to plan for and respond to outbreaks of conflict (disaggregated by gender).

Output 2.1: Improved facilities in 8 primary schools and trialing of one mobile education unit

Number of primary school classrooms rehabilitated

Number of students attending mobile education unit (disaggregated by gender and age)

Output 2.2 Schools are equipped with essential learning support material

Percentage of schools who have received required teaching and learning materials

Output 2.3 Capacity of 50 teachers enhanced through training on academic curriculum, GBV and Disaster Prevention and Mitigation Skills

Number of teachers trained on academic curriculum

Percentage of teachers who have completed Education in Emergencies (EiE) training

Number of teachers trained on gender based violence prevention and mitigation (disaggregated by gender)

Output 2.4: Students participate in school decision making and management

Number of student associations formed and trained

Number of meetings and type of issue raised with school leadership by student associations

Outcome 3: Parents and community have more positive attitudes towards the value of education and participate in education management

Attitudes of parents towards the importance of education for girls/ boys (disaggregated by gender)

Number and description of actions taken by community leaders to promote primary education for girls and boys

Number and description of actions taken by parents teachers associations to promote primary education for girls and boys

Output 3.1: Awareness on the importance of education for girls and boys

Number of enrolment campaigns conducted,

Number of parents participating in enrolment campaigns

Output 3.2: Communities and Parents participate in education management

Number of PTAs established, trained, and operational (holding quarterly meetings)

Number of Community Education Committees trained

Outcome 4: Schools provide a girl friendly and gender supportive learning environment

Level of confidence of girls in Menstrual Hygiene Management (MHM) in schools

Level of self-reported perception of improved well-being among girls at school

Output 4.1: Schools are equipped with Girl Friendly WASH facilities and kits

Percentage of rehabilitated schools with WASH facilities

Number of girls supported with sanitary kits

Output 4.2: Formation of Child (Girl & Boy) Clubs

Number of child clubs with active learning programmes for boys and girls

Output 4.3: Schools provide gender and life skills training for girls and boys (on WASH, GBV, early marriage, value of education)

Number of schools providing awareness sessions on each theme (WASH, GBV, early marriage, educational value)

Outcome 5: Increased awareness and action on GBV prevention, mitigation and response among women, men, girls and boys

Number of community members (men, women, girls and boys) who can cite at least three key messages from GBV campaigns

Percentage of women and men with improved knowledge, attitudes and practices towards mitigating/reporting actions on GBV

(disaggregated by gender)

Output 5.1: Increased community awareness on GBV through campaigns/ informational sessions

Number of awareness raising sessions conducted by GBV Committees

Output 5.2: GBV Committees formed and capacity built

Number of GBV committees established and functional (implementing actions as per work plan)

Outcome 6: Improved access to quality GBV response services

Number of health services (Health Centre, Health Units) that provide quality GBV response services

Percentage of GBV survivors who report satisfaction with quality of GBV response services

Percentage of GBV survivors referred to culturally appropriate multi-sector and specialist GBV services

Output 6.1: Increased clinical facilities for GBV cases

Number of Health Centres/Units established/upgraded to provide GBV services

Output 6.2: Health care providers have increased capacity to support GBV survivors and women at risk and make appropriate referrals

Number of health care providers trained on Clinical Management of Rape (CMR) Protocols, Clinical Care for Sexual Assault Survivors

Increased linkages with government structures and community committees formed for sustainability

Percentage of monitoring visits conducted by MoE/MoWDAFA

Number of trainings for government officials on service delivery monitoring

Number of joint initiatives implemented between government (MoE/MoWDAFA) and community committees

3. Evaluation Target Audiences

The end of project evaluation is intended to benefit multiple stakeholders that have been involved directly or indirectly in the project implementation process. In particular, the following key stakeholders will be involved in the evaluation process:

  • Community groups and committees that have been established including gender based violence, child protection, community education committees and school clubs among others
  • Local and international organisations that are operating within Eyl and Godobjiraan Districts
  • Local administrative offices involved in the project implementation
  • Ministry of Education and Higher Education (MoEHE)
  • Ministry of Women Development and Family Affairs (MoWDAFA)
  • Religious leaders
  • World Vision Somalia
  • Support Office (WV Ireland).

4. Evaluation type

This is an end of project evaluation that is aimed at assessing the progress made by the project towards achieving the project goal of building resilience of women and children through enhanced education and protection. The assessment of the project impact will focus on the contribution made by the project from inception.

5. Evaluation Purpose and Objectives

The main purpose of this evaluation is to assess the impact, appropriateness, relevance, effectiveness and sustainability, as well as the complementarity, of the HPP Somalia (2017-2018). The end of project evaluation will also help draw out key lessons learnt and best practices for the project stakeholders. In particular, the project evaluation will be shaped by the following specific key evaluation questions:

Evaluation objectives

Key questions

Effectiveness

Measures the extent to which the objectives of the project are achieved or are expected to be achieved. In particular,

o To what extent did the project meet the objectives as defined in the indicators in the log frame?

o What information came out of the accountability team, and how did it make the response more effective?

o What were the intended and unintended positive and negative effects at the household level?

o To what extent has the project enabled people to better mitigate the impact of future crises?

Management Effectiveness/Efficiency

The degree to which organizational structure provided sufficient, effective and timely support for the project implementation.

o How adequate were the available resources qualitatively and quantitatively?

o Were all the project resources utilized optimally?

o What alternative low-cost approaches could have been used to achieve similar results?

o How could the efficiency of the project be improved without compromising outputs?

o How timely was the implementation of the project activities?

o How adequate were the reporting and monitoring systems of the project?

o Have the project outputs been achieved at a reasonable cost?

Appropriateness/relevance

Community involvement and participation in the design process, goal setting, planning and implementation. In particular, the relevancy and appropriateness of project design to the needs of the community will be assessed.

o What did the communities articulate as their priority needs, and how did the project align with them?

o What did secondary sources say the issues were, and how did the project align with them?

Connectedness/Sustainability

This will focus on the need to ensure that project activities are carried out in a way that takes the longer term and interconnectedness into account. In particular,

o Are there sustainability plans, structures and skills in place to ensure there is sustainability of project benefits?

o How adequate are they?

o How prepared are the community and local partners to continue with the project outcomes?

o Are the community members knowledgeable about and supportive of the project?

o Is there evidence of community contribution and ownership of the different project interventions?

Coordination

o Have there been any collaborations and networking with the different stakeholders?

o How strong are the relationships with government, other agencies and CBOs, and how can they be improved (in terms of partnering, collaboration, networking and donor relations)?

Cross-cutting issues

To what extent were relevant international standards used in the planning, implementation and monitoring of the intervention (e.g., international humanitarian and human rights law; the Red Cross/ NGO Code of Conduct and developing standards - e.g., Sphere)?

Was gender equality taken into consideration in all relevant areas? Did the project interventions conform to gender equality standards and policy?

6. Evaluation Methodology

The evaluation methodology will be designed in alignment with World Vision’s Learning through Evaluation with Accountability and Planning (LEAP) guidelines and principles to ensure the quality of evidence. The data collection process will apply both quantitative and qualitative methods. A detailed evaluation methodology will be designed by the external Consultant in consultation with the WVS Quality Assurance team, the Project Manager and the WV Ireland technical team, who will validate the sampling strategy and procedures. The detailed design of the methodology must include the following:

§ The evaluation design

§ Sampling for qualitative and quantitative surveys

§ Data collection instruments, protocols and procedures

§ Procedures for analysing quantitative and qualitative data

§ Data presentation/dissemination methods.

§ Report writing and sharing etc.

The key data collection methods will include the following, among others:

§ Document reviews including the project proposal, monthly, quarterly and annual outcome monitoring reports and project review reports.

§ Focus Group Discussions (FGD) involving primary project participants and

§ Key Informant Interviews with MoEHE and MoWDAFA officials, WVS staff, partners, other NGOs and leaders.

§ Reflection and feedback sessions with staff and partners.

The Consultant will be expected to employ mobile data collection and Geographical Information System (GIS) tools in the evaluation process, from data collection through analysis to presentation of results.

7. Evaluation Deliverables

The Consultant will be expected to deliver the following outputs:

§ An inception report detailing the approach and methodology to be used, sample size calculations, a detailed execution plan and the data collection tools.

§ Draft report submitted to WVS within an agreed timeline between the WVS and the Consultant (soft-copy)

§ A presentation of the key findings and recommendations to WVS and other stakeholders in Garowe (optional, depending on whether the Consultant chooses to remain in country during report write-up; however, Consultants able to complete this deliverable will be preferred. In either case, an online presentation to WVS will be required).

§ All indicators must be presented overall and disaggregated by sex and age, where appropriate.

§ The collected raw data, after analysis, complete with variable labels and codes, and the final evaluation tools, submitted to WVS and WV Ireland alongside the final report.

§ Final report (soft copy) and 3 hard copies submitted to the WVS Quality Assurance team and the Education and Protection Project Manager. The Consultant should note that the Final Evaluation Report shall follow the structure below, customized from the UNDP (2009) Handbook on Planning, Monitoring and Evaluating for Development Results.
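As an illustration of the sample size calculations expected in the inception report, a minimal sketch using Cochran's formula with a finite population correction is shown below. The 95% confidence level, 5% margin of error and 50% assumed proportion are illustrative assumptions only; they are not values specified in this ToR, and the Consultant would propose their own parameters.

```python
import math

def cochran_sample_size(p=0.5, z=1.96, e=0.05):
    # Cochran's formula for estimating a proportion:
    # n0 = z^2 * p * (1 - p) / e^2
    # Defaults (95% confidence, +/-5% margin, p = 0.5) are illustrative.
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

def fpc_adjust(n0, population):
    # Finite population correction for a bounded target population
    return math.ceil(n0 / (1 + (n0 - 1) / population))

n0 = cochran_sample_size()   # 385 at 95% confidence, +/-5% margin
n = fpc_adjust(n0, 13025)    # corrected for the 13,025 project participants
```

In practice the Consultant would also apply a design effect and a non-response margin appropriate to the sampling strategy agreed with WVS.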

7.1 Evaluation Report Structure

Title and Opening pages (front matter)—should provide the following basic information:

i. Name of the project evaluated

ii. Time frame of the evaluation and date of the report

iii. Project location (districts and country)

iv. World Vision logo

v. Acknowledgments

Table of Contents-including boxes, figures, tables, and annexes with page references.

List of acronyms and abbreviations

Executive Summary

A stand-alone section of two to three pages that should:

§ Briefly describe the project that was evaluated.

§ Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses

§ Describe key aspects of the evaluation approach and methods.

§ Summarize the principal findings, conclusions, and recommendations.

§ Include a summary table displaying the figures (scores) for each indicator at baseline and at end line, with elaboration on the statistical significance of differences.

Introduction

§ Explain why the evaluation was conducted (the purpose), why the intervention is being evaluated at this point in time, and why it addressed the questions it did.

§ Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why and how they are expected to use the evaluation results.

§ Identify the project that was evaluated

§ Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the evaluation and satisfy the information needs of the report’s intended users.

Description of the Intervention

Provide the basis for report users to understand the logic and assess the merits of the evaluation methodology and understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. The description should:

§ Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address.

§ Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy.

§ Link the intervention to WVS national strategy and child well-being targets

§ Identify any significant changes (plans, strategies, logical frameworks) that have occurred over time and explain the implications of those changes for the evaluation

§ Identify and describe the key partners involved in the implementation and their roles.

§ Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component.

§ Indicate the total resources, including human resources and budgets.

§ Describe the context of the social, political, economic, and institutional factors, and the geographical landscape within which the intervention operates and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes.

§ Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).

Evaluation Scope and Objectives

Provide a clear explanation of the evaluation’s scope, primary objectives and main questions.

§ Evaluation scope-define the parameters of the evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs or outcomes were and were not assessed.

§ Evaluation objectives-spell out the types of decisions evaluation users will make, the issues they will need to consider in making those decisions, and what the evaluation will need to achieve to contribute to those decisions.

§ Evaluation criteria-define the evaluation criteria or performance standards used. The report should explain the rationale for selecting the particular criteria used in the evaluation.

§ Evaluation questions-evaluation questions define the information that the evaluation will generate. The report should detail the main evaluation questions addressed by the evaluation and explain how the answers to these questions address the information needs of users.

Evaluation Approach and Methods

The evaluation report should describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description should help the report users judge the merits of the methods used in the evaluation and the credibility of the findings, conclusions and recommendations. The description on methodology should include discussion of each of the following:

§ Data sources-sources of information (documents reviewed and stakeholders), the rationale for their selection and how the information obtained addressed the evaluation questions.

§ Sample and sampling frame-the sample size and characteristics; the sample selection criteria, the process for selecting the sample (e.g., random, purposive); and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results.

§ Data collection procedures and instruments-methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source and evidence of their reliability and validity.

§ Performance standards-the standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales).

§ Stakeholder engagement-stakeholders’ engagement in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results.

§ Background information on evaluators-the composition of the evaluation team, the background and skills of team members and the appropriateness of the technical skill mix, gender balance and geographical representation for the evaluation.

§ Major limitations of the methodology-major limitations of the methodology should be identified and openly discussed as to their implications for the evaluation, as well as the steps taken to mitigate them.

§ Data analysis-procedures used to analyse the data collected to answer the evaluation questions. It should detail the various steps and stages of analysis that were carried out, including the steps to confirm the accuracy of data and results. Data from the endline evaluation should be systematically compared with data from the baseline evaluation, and the differences between baseline and endline values should be tested for statistical significance and discussed. The report should also discuss the appropriateness of the analysis to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings may be interpreted and conclusions drawn.
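The baseline-to-endline significance testing described above can be sketched with a pooled two-proportion z-test, a common choice for comparing indicator proportions between two survey rounds. The figures below are hypothetical and are not drawn from the project data.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    # Pooled two-proportion z-test for a baseline vs endline indicator.
    # x1/n1: successes/sample size at baseline; x2/n2: at endline.
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical indicator: 40% of 380 children functionally literate at
# baseline vs 55% of 380 at endline
z, p = two_proportion_ztest(152, 380, 209, 380)
significant = p < 0.05  # True for these illustrative figures
```

A complete analysis would also account for the survey's design effect and report confidence intervals alongside the p-values.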

Findings and Conclusions

Present the evaluation findings based on the analysis and conclusions drawn from the findings.

Findings-presented as statements of fact that are based on analysis of the data. The evaluation findings should be structured around the key evaluation questions and project indicators so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results should be explained, as well as factors affecting the achievement of intended results. The assumptions or risks in the project design that subsequently affected implementation should also be discussed.

Conclusions-this section should be comprehensive and balanced, and highlight the strengths, weaknesses and outcomes of the intervention. The conclusion section should be well substantiated by the evidence and logically connected to the evaluation findings. It should respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision-making.

Recommendations-the Consultant should provide practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations should be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. This should address sustainability of the initiative and comment on the adequacy of the project exit strategy.

Lessons Learned

The report should include a discussion of lessons learned from the evaluation, that is, new knowledge gained from the particular circumstances (intervention, context, outcomes, even the evaluation methods) that is applicable to a similar context. Lessons should be concise and based on specific evidence presented in the report.

Report Annexes

Annexes shall include the following to provide the report user with supplemental background and methodological details that enhance the credibility of the report:

§ ToR for the evaluation

§ Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.) as appropriate

§ List of individuals or groups interviewed or consulted and sites visited

§ List of supporting documents reviewed

§ Project results map or results framework

§ Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators.

8. Time frame

The overall evaluation process is expected to take 30 days, including preparation, data collection, analysis and reporting. The Consultant should be able to undertake some of the tasks concurrently to fit within the planned time frame, without compromising the quality expected. The assignment is expected to commence on 18th November and end by 17th December 2018.

9. Authority and Responsibility

WVS will establish an evaluation team to oversee all the related tasks. The DME Manager will be responsible for the overall coordination of all the evaluation tasks with the Consultant. In addition, the Project Manager, Regional Operations Manager and Quality Assurance & Strategy Manager will provide all the necessary technical and operational support required throughout the evaluation process.

Support from WV Somalia and WV Ireland

World Vision will be responsible for the following:

§ Recruit the external Consultant and finalize the consultancy agreement

§ Share all necessary documents to the Consultant to finalize the evaluation methodology and data collection tools

§ Provide input for evaluation study methodology, data collection tools and report.

§ Ensure that input from WVS and WV-Ireland is circulated and shared with external Consultant

§ Flight expenses for the Consultant to Puntland (where necessary)

§ Vehicle hire to support the evaluation exercise

§ Food and accommodation for the Consultant in Puntland

§ Working space for the Consultant while in Puntland

§ Recruitment and payment of enumerators

§ Stationery for data collection

§ Overall accountability of the evaluation process

§ Guidance and coordination throughout all the phases of evaluation, keeping communication with external Consultant throughout all phases

§ Provide support to the evaluation technical lead (external Consultant) for the evaluation field visits processes such as orientation and training of enumerators, FGDs and KIIs

§ Closely follow up the data collection process, ensuring quality control, daily debriefing, meeting the timelines set for interview completion;

§ Inform the evaluation audience of their involvement in the study and help set specific dates for the evaluation field schedule.

§ Provide smartphones and ODK aggregate server for data collection where required.

The Consultant will be responsible for the following:

§ Review all relevant documents for the evaluation study

§ Develop the evaluation study design, including the survey methodology and data collection tools (questionnaire, focus group guides, interview protocol, data entry templates, etc.), as appropriate, together with a field manual for training, in consultation with the evaluation team and reflecting WVS and WV-Ireland feedback on the methodology. These should be based heavily on the tools used at baseline in order to allow appropriate comparisons over the life of the project

§ Design the ODK forms, data entry templates, procedures and systems, and train data entry clerks in the use of the templates

§ Develop the field work schedule in consultation with the evaluation team

§ Train the data collectors during the field visits phase and finalize the evaluation schedule

§ Supervise the data collection process, provide advice and ensure the quality of the data

§ Conduct key informant interviews (KIIs) with key project staff

§ Analyze the data and write the report. At least two drafts are expected to be provided to WVS and WV-Ireland, with feedback addressed in each round before submission of the final report

§ Provide the complete dataset, labelled in English (variables and values), in both SPSS and Microsoft Excel file formats

§ Provide final versions of the data collection tools

§ Provide daily field briefings to the DME and Project Manager on progress and any challenges from the field

10. Limitations

Time and security may be major limitations for assessment processes in fragile and volatile contexts such as Somalia, which often makes it challenging to adhere strictly to a set agenda. In addition, in Somalia households spend much of the afternoon in prayers, making it difficult for enumerators to administer many questionnaires per day while still completing the assessment on time. To address this, WVS will allocate extra overflow days for field data collection and will also work closely with the security department to ensure that the evaluation field processes are conducted at the most appropriate times and under secure conditions. The Consultant should therefore be able to demonstrate some flexibility when required.

11. Documents

The key documents to be reviewed for the evaluation study are as follows:

§ Project document (needs assessment, proposal, log frame)

§ Baseline Report

§ Monthly, quarterly, semi-annual and annual reports, and outcome monitoring reports

§ Training reports

§ Success stories

§ Any district level secondary data and other relevant documents and reports.

12. Qualifications of the Consultant

The evaluation exercise will be undertaken by an external Consultant who will work in close collaboration with the WVS Quality Assurance team, the Project Manager and the WV-Ireland technical team. We are therefore looking for a Consultant/team with the following skills and qualifications:

§ The team leader must possess a postgraduate degree in Social Sciences, Education, Protection or a related discipline

§ Strong and documented experience in conducting participatory qualitative assessments related to education, protection and livelihoods

§ Demonstrated experience in leading at least three similar project evaluation studies, such as surveys and group interviews

§ At least 10 years' experience in conducting baselines and evaluations for complex projects, such as education and protection, infrastructure development, livelihoods, health, and water, sanitation and hygiene projects implemented by non-governmental and private sector actors

§ A solid understanding of remote learning and the use of mobile technology in data collection

§ Demonstrated experience in leading teams and training local staff in quantitative and qualitative data collection tools, including data entry templates

§ Demonstrated experience in designing survey methodology and data collection tools, and in processing and analyzing data

§ Ability to interact with the host government, partners and/or others as requested by WVS

§ Strong organizational, analytical, reporting and presentation skills; attention to detail; ability to meet deadlines; and proficiency in SPSS or other statistical packages, Microsoft Office and qualitative data analysis software/tools

§ Previous experience in a fragile country with a tight security context preferred

§ Capacity to use mobile data collection and GIS tools for data collection and analysis of survey results

§ Excellent verbal and written communication in English required

§ Knowledge of the Somali language an added advantage

HOW TO APPLY:

Application Process and Requirements

Qualified and interested parties are asked to submit the following:

§ Letter of interest in submission of a proposal

§ A detailed technical proposal clearly demonstrating a thorough understanding of this ToR, including but not limited to the following:

o Consultant/Company Profile

o Description of the Methodology and Sample Size Determination

o Demonstrated previous experience in similar assignments and qualifications outlined in this ToR (with submission of at least two most recent reports)

o Proposed data management plan (collection, processing and analysis).

o Proposed timeframe detailing activities and a work plan.

o Team composition and level of effort of each proposed team member (include CVs of each team member).

§ A financial proposal with a detailed breakdown of costs for the study quoted in United States dollars.

All applications should be sent electronically to somo_supplychain@wvi.org with attachments in PDF and the subject line: Technical and Financial Proposal for End of Project Evaluation-HPP Education and Protection-Somalia (2017-2018). NB: The application deadline is COB (1700hrs) Tuesday, 27th November 2018.

 
Publish date: 2018-11-22 00:35:52