
Qualitative and Quantitative Research Techniques for Humanitarian Needs Assessment

Resources and publications

Qualitative and Quantitative Research Techniques for Humanitarian Needs Assessment. An Introductory Brief. ACAPS (2012).


Published in: 2012 Pages: 14

Publisher: ACAPS

Author: ACAPS


The collection, collation, analysis, and synthesis of qualitative and quantitative information, gathered and analysed using appropriate sources, tools, and methods, is the cornerstone of rapid needs assessment. It allows decision makers to plan a timely, appropriate, and coordinated emergency response.

Bibliographic information

ACAPS (2012). Qualitative and Quantitative Research Techniques for Humanitarian Needs Assessment. An Introductory Brief. ACAPS


Qualitative and Quantitative Research Techniques for Humanitarian Needs Assessment. An Introductory Brief.

Format: PDF · Size: 0.35 MB · Download


Humanitarian Needs Assessment – The Good Enough Guide


ACAPS has published Humanitarian Needs Assessment – The Good Enough Guide, to help humanitarian staff design and implement needs assessments in emergencies.

What assistance do disaster-affected communities need? This book guides humanitarian field staff in answering this vital question during the early days and weeks after a disaster, when timely and competent assessment is crucial for informed decision making. Developed by the Emergency Capacity Building Project (ECB) and the Assessment Capacities Project (ACAPS), the Good Enough Guide is especially aimed at national project managers and their teams. Essential reading for field staff carrying out assessments, the Good Enough Guide is also for humanitarian policy makers and researchers.

Needs assessment is essential for programme planning, monitoring and evaluation. In an emergency response, however, a quick and simple approach may be the only practical possibility – in other words, needs assessment has to be ‘good enough’. ‘Good enough’ does not mean second best. Following the same principles as previous Good Enough Guides published by the ECB, good enough here means choosing a simple solution rather than a complicated one.

This guide does not explain every activity that you will need to carry out for your assessment, but it will take you step by step through the assessment process, offering a number of useful tools and resources. The steps and tools are most directly useful for initial and rapid assessments in the first weeks of an emergency, but the principles and practices described apply at any stage in the response.

The Good Enough Guide was developed through wide-ranging consultations which began in November 2012. Input was through workshops and field tests and by face-to-face, e-mail, and phone discussions. Comments and feedback were received from over 150 individuals and organisations, all of which strengthened the content of the guide significantly.


Humanitarian Programme Cycle


Needs assessments and analysis

A coordinated approach to needs assessment and analysis in an emergency, and to the prioritisation of the needs of affected people, lays the foundation for a coherent and efficient humanitarian response. It helps improve the quality, comparability, and evidence base of that response. Needs assessments and analysis are carried out in partnership by humanitarian actors and document the scope of a particular crisis. More importantly, coordinated assessments illustrate the needs of affected populations and inform strategic response planning and system-wide monitoring.

For protracted crises, the depth and volume of information needed for an effective response increase as the response evolves. This often translates into a requirement for in-depth cluster/sector, thematic or agency-specific assessments to inform planning and operations, which in turn necessitates a harmonized assessment approach with joint needs analysis.

Needs assessments and analysis follow the principle of humanitarian accountability and can enhance the quality of inter-agency collaboration. They can also improve donor funding levels and relationships with governments, local NGOs, and disaster-affected populations. Humanitarian Country Teams (HCTs) benefit from using coordinated assessments when responding to a disaster. 

Evidence base

Needs assessments and analysis provide the evidence base for strategic planning, as well as the baseline information upon which situation and response monitoring systems rely. They should therefore form a continuous process throughout the Humanitarian Programme Cycle (HPC).

Coordinated assessments are carried out in partnership with all humanitarian actors in order to assess the humanitarian situation and to identify the needs of the affected population. Local and national authorities, civil society and affected communities are encouraged to participate in this process, the output of which is a Humanitarian Needs Overview (HNO).

Key outputs: Humanitarian Needs Overview and Humanitarian Dashboard

HNOs should be produced twice a year to support the Humanitarian Country Team (HCT) in developing a shared understanding of the impact and evolution of a crisis and to inform response planning. This document presents a comprehensive analysis of the overall situation and associated needs. It is structured along the analytical framework developed for the Multi-Cluster/Sector Initial Rapid Assessment (MIRA).

The HNO builds and expands upon the needs analysis chapter of the former Consolidated Appeal Process (CAP) document, consisting of a discrete step in the implementation of the programme cycle. Its development is a shared responsibility among all humanitarian actors, requiring strong collaboration between programme and information management staff as well as support from the OCHA country office and the inter-cluster coordination mechanism.

In addition to the HNO, each country will continue to produce a humanitarian dashboard to present data on needs, response monitoring, and gaps per crisis in an easily digestible format, based on the information presented in the humanitarian needs overview.

For more information visit the JIAF web page .

Guidance and Templates

Below you will find a collection of key guidance and templates supporting the production of Needs Assessment-related output.

Humanitarian Needs Overview

This output is designed to support the Humanitarian Country Team in developing a shared understanding of the impact and evolution of a crisis. The humanitarian needs overview helps inform strategic response planning. Most importantly, it works to ensure that credible evidence and a joint analysis of needs underpin an effective and prioritised humanitarian response.

The HNO annotated template is available in several languages under the HPC Facilitation Package.

All HNOs are available here.

IASC Operational Guidance on Coordinated Assessments in Humanitarian Crises

Identifying priority needs of affected populations is the first step towards ensuring an effective and speedy humanitarian response. The Operational Guidance promotes a shared vision of how to plan and carry out coordinated assessments. Outputs from coordinated assessments support humanitarian decision-making.

qualitative and quantitative research techniques for humanitarian needs assessment

MIRA Guidance

The MIRA Guidance outlines an approach to undertaking a joint multi-sectoral assessment in the earliest days of a crisis or change in the context. It guides subsequent in-depth sectoral assessments and provides decision-makers with timely, adequate, sufficiently accurate and reliable information to collectively identify strategic priorities. More information can be found here.

qualitative and quantitative research techniques for humanitarian needs assessment

Data Collection Tools

Kobo Toolbox

Kobo Toolbox was created to be a free and accessible data collection tool for organizations in the humanitarian, development, environmental protection, peacebuilding, and human rights sectors. It is the main tool used by humanitarian organizations for primary data collection. Kobo allows users to quickly build questionnaires using a number of quantitative and qualitative question types and to collect data offline or online on mobile phones or web browsers. More information can be found here.
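Kobo questionnaires are typically authored in the XLSForm spreadsheet format: a "survey" sheet lists one question per row (type, name, label), and a "choices" sheet defines the options for select questions. The sketch below builds a minimal survey of this shape in Python; the question names and labels are purely illustrative, not taken from any real form.

```python
import csv
import io

# A minimal XLSForm-style "survey" sheet: each row is one question.
# Question names and labels here are illustrative, not from a real form.
survey = [
    {"type": "text",              "name": "respondent",   "label": "Respondent name"},
    {"type": "integer",           "name": "hh_size",      "label": "How many people live in this household?"},
    {"type": "select_one yes_no", "name": "water_access", "label": "Do you have access to safe drinking water?"},
    {"type": "text",              "name": "main_concern", "label": "What is your main concern right now?"},
]

# The "choices" sheet defines the options for select_one/select_multiple questions.
choices = [
    {"list_name": "yes_no", "name": "yes", "label": "Yes"},
    {"list_name": "yes_no", "name": "no",  "label": "No"},
]

def sheet_to_csv(rows):
    """Serialize one sheet as CSV text (XLSForm sheets are simple tables)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

survey_csv = sheet_to_csv(survey)
print(survey_csv.splitlines()[0])  # header row: type,name,label
```

Note how the "integer" and "select_one" rows are quantitative question types, while the open "text" rows capture qualitative answers; one form can mix both.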

The Data Entry and Exploration Platform (DEEP)

DEEP is used for Secondary Data Review (SDR). It offers a suite of tools and collaborative workflows that meet the needs for compiling, storing and structuring data and qualitative information. More information can be found here.

Joint Intersectoral Analysis Framework (JIAF)

JIAF provides humanitarian actors with a common analytical framework and system to gather, structure, and synthesize information regarding the  needs of populations in crisis.

The JIAF offers a methodological approach and a structured sensemaking process to support regular joint needs analysis through:

  • Supporting the collation, analysis and storage of data;
  • Providing a way to organize what data to collect and how to analyze it;
  • Guiding a joint analysis process involving multiple stakeholders;
  • Serving as a driver for collaboration between humanitarian actors.

The purpose is to help inform the decision-making process when planning a coordinated response through a people-centred and inclusive joint intersectoral analysis system that is both comprehensive and methodologically rigorous.

The findings of the JIAF are presented in the Humanitarian Needs Overview (HNO), which in turn helps inform the Humanitarian Response Plan (HRP). For more information about the JIAF please visit this website.

Assessment Registry

A humanitarian needs assessment registry functions as a centralized database specifically designed to track and store information pertaining to needs assessments carried out in response to crises.  

To learn more about the Assessment registry tool used by OCHA, click here .


Section 15. Qualitative Methods to Assess Community Issues

What are qualitative methods of assessment?

Why use qualitative methods of assessment? When would you use qualitative methods of assessment? How do you use qualitative methods of assessment?

Using qualitative assessment methods rather than purely data-based information is crucial to understanding many community issues and needs. Numbers work well to show comparisons, progress, and statistics of community efforts, but they cannot express motives, opinions, feelings, or relationships. This section discusses how to use qualitative assessment methods and when to implement them in community planning.

Qualitative methods of assessment are ways of gathering information that yield results that can’t easily be measured by or translated into numbers. They are often used when you need the subtleties behind the numbers – the feelings, small actions, or pieces of community history that affect the current situation. They acknowledge the fact that experience is subjective – that it is filtered through the perceptions and world views of the people undergoing it – and that it’s important to understand those perceptions and world views.

There are two major scientific ways of gathering information: quantitative methods and qualitative methods. Quantitative methods are those that express their results in numbers. They tend to answer questions like “How many?” or “How much?” or “How often?” When they’re used to compare things – the results of community programs, the effects of an economic development effort, or attitudes about a community issue – they do it by subjecting all of the things or people they’re comparing to exactly the same tests or to the same questions whose answers can be translated into numbers. That way, they can compare apples to apples – everything or everyone is measured by the same standard. Quantitative measures are often demanded by policy makers; they are considered trustworthy because their results can be measured against one another, and because they leave less room for bias.
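Because every respondent answers exactly the same closed questions, quantitative results reduce to directly comparable numbers. A minimal Python sketch of that tallying, using made-up survey responses:

```python
from collections import Counter
from statistics import mean

# Hypothetical responses to the same two closed questions,
# asked identically of every respondent.
responses = [
    {"hh_size": 4, "water_access": "yes"},
    {"hh_size": 6, "water_access": "no"},
    {"hh_size": 3, "water_access": "yes"},
    {"hh_size": 5, "water_access": "yes"},
]

# Identical questions make the answers directly comparable,
# so simple aggregates are meaningful.
avg_hh_size = mean(r["hh_size"] for r in responses)       # 4.5
access = Counter(r["water_access"] for r in responses)    # yes: 3, no: 1
pct_with_access = 100 * access["yes"] / len(responses)    # 75.0

print(f"Average household size: {avg_hh_size}")
print(f"Safe water access: {pct_with_access:.0f}%")
```

The open-ended "Why?" behind each answer is exactly what this arithmetic cannot capture, which is where the qualitative methods below come in.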

Qualitative methods don’t yield numerical results in themselves. They may involve asking people for “essay” answers about often-complex issues, or observing interactions in complex situations. When you ask a lot of people for their reactions to or explanations of a community issue, you’re likely to get a lot of different answers. When you observe a complex situation, you may see a number of different aspects of it, and a number of ways in which it could be interpreted. You’re not only not comparing apples to apples, you may be comparing apples to bulldozers or waterfalls. As a result, researchers and policymakers sometimes see qualitative methods as less accurate and less legitimate than quantitative ones. That can be true, but, as we’ll see, if qualitative methods are used with care, they can also yield reliable information.

Qualitative and quantitative methods are, in fact, complementary. Each has strengths and weaknesses that the other doesn’t, and together, they can present a clearer picture of the situation than either would alone. Often, the most accurate information is obtained when several varieties of each method are used. That’s not always possible, but when it is, it can yield the best results.

There are a number of qualitative methods that can be used in assessment of issues or community needs. We’ll list the major ones here, and look at them in more detail later in the section.

They include:

  • Individual interviews. These may be structured interviews, where the questions are determined beforehand, or unstructured conversations that are allowed to range wherever the interviewee wants to go in relation to the general topic. Even in structured interviews, there may be room for both interviewers and interviewees to pursue topics that don’t relate directly to answering the original questions. The difference, however, is that in a structured interview, all those questions are formally asked, and the interviewer does her best to make sure they’re answered.
  • Group interviews. These are similar to individual interviews, but involve two or more interviewees at a time, rather than one. (Sometimes, these are unexpected – the interviewee’s mother and sister are present, and insist on being part of the conversation.) Group interviews have some advantages, in that interviewees can act as a check on one another (“I remember that happening in a different way…”), and stimulate one another’s thinking. At the same time, the interviewer has to be somewhat of a facilitator, making sure that no one person dominates, and that everyone gets a reasonable chance to speak.
A special case of group interviewing is a focus group. This is a group of about 6-10 people, led by a trained facilitator, assembled to answer a specific question or questions. An effort is sometimes made to make sure that group members don’t know one another, so that social pressures won’t influence them. If trained facilitators are available, focus groups can be a good way to get accurate information about an issue.
  • Observation. Here, someone actually goes and looks at a place or event, watches situations or interactions, or takes part in the life of the community or a population while recording what he finds as a result.
  • Community or other large meetings. These meetings allow a range of people a chance to express their opinions and react to others’. They can draw on a large pool of opinions and knowledge at one time, and uncover disagreements or differences that can then be discussed.
  • Interpretation of records, transcripts, etc. This can range from qualitative analysis of quantitative data (like the assumption of the researcher in the introduction to this section that people who are doing well won’t be interested in an adult education program), to using quantitative data as a jumping-off point for qualitative assessment, to case studies (detailed examinations of individual cases). The last are not always useful in assessing community issues or needs, but they can be very effective in convincing policymakers or funders of the importance of those issues and needs.

Many types of qualitative information are turned into numerical results, although not always accurately. The transformation may miss important details, or the information may simply be too complex to fit easily into numerical constraints, unless you can create a computer model or similar number-based framework that has the capacity to take in an enormous amount of variety. There are many software programs – NVivo and Atlas.ti are fairly well-known, but there are many others, including some freeware – that are intended expressly for analyzing qualitative data.
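What such packages automate is "coding": tagging passages with themes so their frequency can be tallied. The sketch below shows the idea with a crude keyword-based pass in Python; the codebook and excerpts are invented for illustration, and real qualitative coding relies on human judgment rather than string matching.

```python
from collections import Counter

# Illustrative codebook: theme -> trigger keywords. Real coding is done
# by human judgment; keyword matching is only a crude first pass.
codebook = {
    "water":      ["water", "well", "borehole"],
    "shelter":    ["tent", "roof", "shelter"],
    "livelihood": ["job", "income", "work"],
}

# Hypothetical interview excerpts.
excerpts = [
    "The borehole broke last month and we walk far for water.",
    "Our roof leaks whenever it rains.",
    "There is no work here since the market closed.",
    "We need clean water more than anything else.",
]

def code_excerpt(text, codebook):
    """Return the set of themes whose keywords appear in the text."""
    text = text.lower()
    return {theme for theme, words in codebook.items()
            if any(w in text for w in words)}

# Tally how often each theme appears across all excerpts.
theme_counts = Counter()
for e in excerpts:
    theme_counts.update(code_excerpt(e, codebook))

print(theme_counts.most_common())  # "water" is the most frequent theme
```

The tally turns open-ended answers into numbers, but the nuance of each excerpt (why the market closed, how far "far" is) survives only in the original text, which is the point made above.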

Since qualitative methods give you results that are not always easy to compare, or even to check for accuracy, people who want hard and fast evidence often see them as suspect. In fact, both quantitative and qualitative measures are important and necessary, depending on the situation. When you’re assessing community issues, as we’ve discussed, you’ll often get closest to the complete picture by using both. The problem is convincing those who need to be convinced – policymakers, funders, etc. – that your qualitative measures are reliable.

There is a debate in the research community about how to judge qualitative methods. Some say they should be evaluated by the same standards as quantitative methods. Others maintain that, because they are intrinsically different from quantitative methods, qualitative methods need a set of standards that take into account their philosophical base and the kind of information they yield.

The British government, for instance, has developed a framework for demonstrating qualitative reliability, which includes a set of 18 questions that a qualitative assessment or study should be subjected to (see Tool #1).

Guidelines that can help you argue for the reliability of your qualitative assessment include:

  • Report accurately and completely . Whether you’re interviewing, observing, or engaging in some other technique, you should faithfully record such details as the time and place of your activity, who was involved, what the situation was, etc. In that way, you can see similarities and differences, and make comparisons where they’re appropriate. The recording of interviews, observations, and other information should be as accurate and complete as possible (e.g., word-for-word for interviews).
  • Frame the right questions, and direct them appropriately . Occasionally, it works to go fishing for information, i.e., to start without any idea of what you want to find out. In most instances, however, you should know what the important questions are, and where you need to look for answers. The clearer you can be – and the clearer it is that the questions you’re asking will lead to real understanding and effective action – the more credibility your inquiry will have.
  • Use qualitative methods specifically to gain information you can’t easily get from quantitative methods . You can quantify how many members of a specific minority live in a particular neighborhood. It’s much harder to quantify a clear understanding of how well they get along with their neighbors, and why.
  • Use the method(s) that can best help you answer the questions you’re asking . If you want to know the state of vacant lots in a city, you’re less likely to determine it by asking people than you are by going and looking at the lots themselves. On the other hand, you usually can gain more information about people’s opinions through talking to them than you can from observation.
  • Sort out your own and others’ subjective feelings and comments from objective reality, and try to make sure that your findings are objective . It’s easy to get caught up in the passion of interviewees’ opinions, or in your own response to particular conditions. If you want your findings to be reliable, you have to screen out as much of the subjective as possible from what you find and record. (One way to approach this issue is to have more than one person record and analyze each interview or observation, and then to check on how well they agree, both in their recording of the data and in their interpretation.)
Something that’s objective – an observation, statement, opinion, research finding, etc. – is based on reality as it actually is. Scientists, for instance, aim to be objective, and to understand the way things really are, rather than the way the scientists or others want them to be, or think they might be. A subjective observation, statement, opinion, or research finding, on the other hand, is based on the thoughts and assumptions of the person issuing it. A researcher may be so appalled by the conditions in neighborhoods where violence is rampant that she may begin to feel that violence is in fact the only rational response, and slant her research in that direction. Especially in community assessment, objectivity is vitally important. Objectivity in looking at the community will help you understand how to most effectively address issues, maximize and use assets, and solve problems. Understanding your own subjective reactions – to difficult conditions, to particular individuals, to cultural practices – will help you to screen them out, thereby increasing the reliability of your findings.
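The multiple-recorder check suggested in the last guideline above can be made concrete with an agreement statistic. Cohen's kappa is one standard measure of how well two raters agree beyond what chance alone would produce; the sketch below is a minimal Python version, and the theme codes assigned by the two hypothetical reviewers are invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance.

    1.0 means perfect agreement; 0 means no better than chance.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: share of items both raters labeled identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each rater labeled items at random, using
    # their own observed label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical theme codes two reviewers assigned to the same 8 excerpts.
rater_a = ["cost", "cost", "transport", "childcare", "cost", "transport", "cost", "childcare"]
rater_b = ["cost", "cost", "transport", "cost", "cost", "transport", "cost", "transport"]
print(round(cohens_kappa(rater_a, rater_b), 2))  # moderate agreement, ~0.58
```

A low kappa is a signal to revisit the coding scheme or the interpretations, not necessarily to discard the data.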

The basic reason to use qualitative methods is that there are some kinds of questions and some dimensions of community assessment that can be better addressed by them than by quantitative methods. The methods you use should be determined by the questions you’re asking. Since it may be hard to convince policymakers and others that qualitative methods are useful, however, why bother to use them at all? Some of the major reasons:

  • They answer some questions that quantitative measures can’t. Quantitative methods may tell you how many people do a certain thing, but they’re unlikely to tell you how or why they do it. Qualitative methods can better answer the how and why questions, and also provide other information in the process.
  • They connect directly with the population and the community with which you’re concerned. In assessment, the best sources of information are those closest to what’s being assessed: they experience it more than anyone else. Qualitative methods generally go directly to those sources with more complex questions than quantitative methods.
  • They can get at certain underlying realities of the situation. Once again, quantitative methods often don’t answer “why?” questions, while qualitative methods can tell you about the history of the community or issue, who the significant supporters and opponents of various ideas are, whom people in the community listen to, etc. In an assessment situation, these can be crucial pieces of information.
  • They can involve the population of interest, or the community at large, in helping to assess the issues and needs of the community. This participation fosters a sense of ownership and support for the efforts.
  • They often allow for a deeper examination of the situation or the community than quantitative methods do. Quantitative methods, although helpful, tend to put people or events into specific categories and ask for yes-no or multiple-choice answers, often eliminating complexity. Qualitative methods allow for following promising directions (“Why do you say that?”), and can lead to the discovery of important information that quantitative results wouldn’t have touched on.
  • They allow for the human factor. While the information obtained through qualitative methods is often subjective, it is also often identified as such, and can be analyzed accordingly.

Clearly, there are times when quantitative research will give you the information you need. So when do you use qualitative methods? It depends to a great extent on the question you’re asking. (The first four situations below are based on a USAID guide to using rapid appraisal methods, Performance Monitoring and Evaluation Tips .)

  • When what you need is qualitative, descriptive information . Particularly in an assessment situation, what you’re often looking for is descriptive or analytical information that has little to do with quantitative measures. The type of information provided by qualitative methods is often exactly what you’re looking for in community assessment to decide on next steps.
  • When you’re trying to understand the reasons and motivations for people’s behavior, or how they operate in particular situations . Why don’t people take advantage of human service programs for which they’re eligible? What are the differences in the ways people of different cultural backgrounds respond to services? These are the kinds of questions you’re likely to want to answer in a community assessment, and they often can’t be answered quantitatively.
  • When you’re analyzing quantitative data. As mentioned above, much quantitative data can be analyzed using qualitative methods.
An odd set of numbers – a community that’s decidedly low-income, but where a vast majority of people own their own homes, for instance – might be the springboard for a qualitative examination of why this is so. A number of reasons are possible:
  • The community is largely elderly, and people are living in long-since-paid-for houses they bought 40 or more years ago, when their income was higher and housing was less expensive.
  • One or more local banks have made it a priority to help people buy houses, and provide low-interest mortgages and other subsidy programs to further that goal.
  • While they may be low-income, the members of the community nonetheless scrimp on everything else in order to put away money for a house. This is often the case among immigrants from certain cultures, where people are willing to live very simply for many years in order to save for property and education.
  • The community has been “written off” because of its substandard housing, dangerous streets, and lack of services, and houses as a result are ridiculously cheap.
  • A combination of factors, some of which may not be listed here.
By and large, quantitative methods won’t easily tell you the reasons for this unusual situation, but qualitative methods will.
  • When you’re trying to develop suggestions and recommendations . Again, this is often the primary purpose of community assessment. How should you design a program or initiative to accomplish a major community goal or deal with an issue? What will people respond to? Qualitative data may give the best information here, or may be used in addition to quantitative information to provide a complete picture on which to base your strategy.
  • When you want to involve the community in assessment as directly as possible . Involving community members directly leads to ownership and support of initiatives, and is also likely to generate the best and most effective solutions. Qualitative assessment methods, for the most part, collect information directly from community members themselves, and allow them to fill in the details as much as they can. By and large, being interviewed is more likely to leave someone feeling like part of the process than filling out a survey.
  • When you’re doing community-based participatory research (i.e., involving the community directly in planning and implementing assessment). Community-based participatory research often relies greatly on qualitative assessment methods.
  • When quantitative data are unavailable or unobtainable.
  • When you don’t have the capacity to use quantitative methods . You may not have the proper training, the software or hardware that will make quantitative assessment useful for you, or the time to use quantitative methods properly.

Now that you’re convinced of the importance of using qualitative methods of assessment, how are you going to do it? There’s seldom one right way to do anything, but we’ll offer some steps to take in using qualitative methods, including some guidelines for doing interviews and observations, the two most common methods. (Most of these guidelines hold equally for using quantitative methods as well.)

Start by deciding what it is you want to know.

You may remember that this is also one of the guidelines for qualitative reliability. It may seem elementary, but it doesn’t happen anywhere near as often as it ought to. The importance of deciding what you want to know is that it determines the character of your assessment – what kinds of questions you ask, whom you ask them of, how you’ll go about it, etc. Without that minimal amount of structure, you’re likely to wind up with a confused and unorganized mass of information.

There are many ways to approach a community assessment, and, consequently, many questions you might choose to start your assessment with. You might even use more than one, but it’s important to be clear about exactly what you’re looking for.

Some possibilities:

  • What is the most serious issue – either general or specific – the community faces (i.e., what should we turn our attention to)?
  • What services are most needed in the community? Who most needs them?
  • Are people taking advantage of services that currently exist?
  • What are the community’s significant assets? How can they be strengthened?
  • Are there forces working against the good of the community that should be opposed? (You probably wouldn’t be asking this question unless you thought there were, and had some idea who or what they might be.)
  • Who ought to be involved in a prospective coalition or initiative?

Choose the method best suited to finding the information you’re looking for.

If you want to learn about people’s public behavior, you would probably use direct observation. Observing mothers and children in a clinic waiting room, for example, might give you information about the mothers’ anxiety levels or child-rearing practices.

If you want to know people’s opinions or how they feel about issues, some type of interview would be appropriate.

Once you’ve chosen the right method, it’s important to carry it out properly. Be aware of what you can do with the resources you have. You can’t conduct thousands of interviews in a large city, for instance, without considerable money. If you’re a cash-strapped nonprofit, you might look for a grant to fund your interviews, or you might confine your assessment to one neighborhood. Perhaps you’d mobilize volunteers to conduct interviews, or interview groups rather than individuals. It’s better to do a limited community assessment well than a large one badly. In choosing your method, be aware also that, in some cases, quantitative methods may be more appropriate and more likely to tell you what you want to know.

Choose the people who will gather the information, and, if necessary, train them.

With qualitative methods, where contact is often personal, the question of who carries them out can be very important. Academics or others who are perceived by community members as “the other,” whether because of their behavior, their speech, or simply because they’re outsiders, may find it hard to gather accurate and complete information from a population that’s very conscious of class or cultural differences. Often, it makes more sense to train members of the population, or others who are known and trusted by – or at least familiar to, in their behavior, dress, and speech – those who are being asked to contribute their opinions and observations.

Data collectors should be fluent in the language and culture of those they are interviewing. If you’re assessing commercial activity in a Hispanic neighborhood, you’ll miss most of what’s really happening unless you understand both the Spanish language and the normal ways in which Hispanic (or Dominican or Mexican or Puerto Rican) customers and merchants relate to one another.

If you recruit members of the community or of a specific population to do qualitative information gathering – because they relate to the population better, because they speak the language, because you’re engaged in a participatory effort, or simply because you think they’ll be good at it – you should provide them with training to make sure that the results they come up with are reliable. Depending on what kinds of methods they’ll be using, some of the elements of a training might be:

  • What to record and how : It may not be obvious how important it is to record the time, place, details, and circumstances of an interview, observation, focus group, or larger meeting. It may also be necessary, depending on a trainee’s experience, to learn to use a recorder or video camera, and/or to learn how to take efficient notes without losing the thread of the conversation or missing important points in an observation.
  • Interview techniques , as well as exactly what purpose an interview serves, and how it fits into the larger assessment picture. The more clearly an interviewer understands not just what to do and how, but why she’s doing it, the better she’s likely to be at drawing out the information she’s seeking.
  • Observation techniques : As with an interview, an observation will be far more useful if the observer understands not just what to do and how to do it, but exactly why he’s doing it, and how it will be used.
  • Training in other methods : Focus groups, for instance, require specific skills and techniques.
  • Training in how to think of themselves as researchers : Like those engaged in community-based participatory research , information gatherers should understand how researchers operate. Objectivity, attention to detail, curiosity, and the continuous processing of information in order to generate the next question or observation are all part of the investigative mindset, which they should be encouraged to develop.

Determine from whom or from where you need to gather information.

It may be that you want to hear from all sectors of the community, but some issues or circumstances demand more specific informants. Some possible interview subjects may be public officials, members of a specific population or cultural group, people from a particular geographic area, or people with certain characteristics (parents of young children, individuals with disabilities, males 18-24, people with high blood pressure).

Knowing whom you need to ask extends to any method in which you talk directly to people – focus groups, large community meetings, etc. Focus groups used by marketers are chosen extremely carefully, for example, with age, gender, income, place of residence, and even such factors as favored leisure activities considered.

Observation may or may not involve people. If it does, the question may not be whom you want to observe, but rather what activity or situation you want to observe. If it’s general – what kinds of street activity take place in various neighborhoods, how people use a public park – it’s not necessary to focus on a particular population, but rather on the place. If it’s more specific – back to commercial activity in that Hispanic neighborhood – you’ll need to be in the right place at the right time.

Gather the information.

Now it’s time for you or the people you’ve chosen to go out and collect the qualitative information you need.

As mentioned above, interviews can be structured or unstructured. In a strictly structured interview, the same questions in the same order are asked of everyone, with relatively little room for wandering off the specific topic. Semi-structured interviews may also be based on a list of specific questions, but – while trying to make sure that the interviewee answers all of them – the interviewer may pursue interesting avenues, or encourage the interviewee to talk about other related issues. An unstructured interview is likely to be more relaxed – more like a conversation than a formal interview.

There are advantages and disadvantages to each approach. A structured interview may make the interviewee focus in on the questions and the interview process, take it more seriously, and thus provide excellent information. Because everyone is interviewed in the same way, a structured interview may be – or at least may look – reliable. It may also make an interviewee nervous, emphasize the differences between him and the interviewer, and lead to incomplete or less-than-truthful answers.

A semi- or unstructured interview may allow the interviewee to be more relaxed, and thus more forthcoming. It also leaves room for pursuing a topic that’s not directly related to the formal list of questions, but that might be important or even crucial. At the same time, because it can be far-ranging, a semi- or unstructured interview – particularly one that doesn’t start with a list of questions – is, or appears, less reliable than a structured one. It also, in the hands of an inexperienced or indecisive interviewer, may allow an interviewee to get sidetracked and never get back to the original questions.

What kind of interview you use depends on the nature of the information you’re looking for, the needs of the people you’re interviewing (e.g., whether comfort is more important than structure), and your own comfort. The author has conducted all three types of interviews, and has found that semi-structured interviews – having clear questions and goals for the interview, but conducting it in an informal way, with room for pursuing tangents and some simple friendly conversation – are generally productive. The following guidelines for interviewing reflect that view.

  • Ask the interviewee to choose the space . You might give him a range of suggestions – his home or workplace, the office of a human service agency, a neutral space, such as a café or a park – and go with his choice. The more comfortable he is, the better and more informative the interview is likely to be.
  • Choose your clothes for the comfort of the interviewee . In general, your clothes and hers should be similar: if she’s in jeans and a t-shirt, you shouldn’t be in a suit; if you’re interviewing a business executive at her office, you should be wearing a suit. Clothes send powerful messages, and the message you should be sending here is “We’re from the same planet; you can talk to me.”
  • Talk beforehand with the interviewee if you’re planning to record or photograph the interview . Get permission before you show up with equipment. It’s common courtesy, and it’s less likely to start the interview off awkwardly.
If the results of the interview are likely to be published, even if the interviewee will be anonymous, you might want to get a signed “informed consent” form, indicating that the interviewee understands the purpose of the interview, and gives permission for the material to be published or used in other ways.
  • Record carefully the time, place, circumstances, and details of the interview . This includes a description of the location (the neighborhood as well as the space, if you’re interviewing a community member), other people present, any distractions (kids, pets, TV), other factors influencing the interview or the situation. Include a general description of the interviewee (married Hispanic woman, age 25, three children aged 6, 4, and 1).
  • Think out and frame your questions carefully, and ask directly for the information you’re seeking . Memorize your basic questions (not necessarily word-for-word, but know what they are), so that you refer to notes as little as possible. Make your questions clear and unambiguous, so interviewees don’t have to guess at what you’re asking.
  • Ask open-ended questions . These are questions that require an "essay" answer, rather than a yes-no response. For example, instead of asking "Did you enjoy being in the program?" you might ask "What was participating in the program like?" Try to give people the chance to answer as fully and thoughtfully as possible.
  • Probe . Ask follow-up questions to get at what people are really saying, or to keep them talking about a topic. ("Why did you like it when the teacher asked your opinion?") Don't be afraid to pursue what may seem to be a sidetrack. Sometimes the best or most important information lies off the beaten path.
Some interviewees can manage one-word answers to nearly any question. They might answer "What was participating in the program like?" with “Good.” Don’t be afraid to probe these answers. “What does that mean?” or “How was it good?” might get you a flood of information. If it gets you another one-word answer, keep probing, unless you sense that the person is getting angry or frustrated. Then it’s probably time to move on to the next question, and hope that there’ll be an opportunity to return to this one for a fuller explanation. But be aware that some people are simply quieter – or less reflective – than others. You may never get much more than one-word answers from them.
  • Don't cut people off too quickly . Their stories, or what you can read between the lines, may give you information as important as what they tell you directly.
At the same time, be aware when they’ve strayed too far from the topic. There’s a Mark Twain story that consists of the voice of a man telling an anecdote about a three-legged dog. Every other word reminds him of something else – another story – and he gets continually sidetracked, never finishing the story of the dog, or any of the others, either. Beware the Curse of the Three-Legged Dog: gently but firmly direct people back to the topic if they get too far afield.
  • Confirm what you're told by checking with others to the extent that you can . Remember that you're getting people's perceptions, which aren't always the same as objective reality. In Rashomon, a film by the great Japanese director Akira Kurosawa, an incident is described from the perspectives of four participants, each of whom sees it totally differently. In fact, the phenomenon of Rashomon lurks everywhere; get everyone's side of the story.
Group interviews are both similar to and different from individual ones. The basic guidelines – being clear what you’re asking, open-ended questions, probing, etc. – still hold, but the group brings its own dynamic to the situation. The interview becomes more of a group discussion, and the interviewer’s concerns must extend to making sure that everyone gets heard, reining in individuals who dominate the discussion, and keeping the focus on ideas and information, rather than personalities.

As with other methods, group interviews have advantages and disadvantages. The former include using the energy of the group to generate more information than might otherwise be forthcoming. Members may stimulate one another to come up with more and more useful material, as their thinking is prodded by the memories and conclusions of others. They can also act as a check on the accuracy of the information provided. In addition, the presence of other, often familiar, interviewees may help to break down shyness or nervousness, and create a relaxed atmosphere in which everyone feels comfortable talking. (The skills of the interviewer at making people comfortable – at least partially by being comfortable herself – are important here.)

With these potential positives come the possible negatives of conflict, antagonism, or dislike among group members, as well as other negative feelings or history that can disrupt or twist discussion and make an interview all but useless. There are also problems that can arise from members of the group being too friendly: they may spend too much time in chit-chat, and have trouble focusing on the questions at hand.

Group interviews may be useful when resources – and, as a result, interviewers – are limited, or when there are a large number of people who should be, or would like to be, interviewed. Groups probably shouldn’t be much larger than five or six, and interviewers should have, or be trained in, basic group facilitation skills .
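One way to keep a semi-structured interview on track is to write the guide down before going into the field. The sketch below shows one possible representation in Python; the questions and probes are hypothetical, and echo the open-ended and probing style recommended above.

```python
# A minimal semi-structured interview guide: every interviewee gets the
# core questions; probes are optional follow-ups the interviewer may pursue.
interview_guide = [
    {
        "question": "What was participating in the program like?",
        "probes": ["What does that mean?", "How was it good?"],
    },
    {
        "question": "What would make the program more useful to you?",
        "probes": ["Why do you say that?"],
    },
]

# After the interview, a quick completeness check: which core questions
# never got asked before the conversation wandered off?
asked = {"What was participating in the program like?"}
missed = [item["question"] for item in interview_guide if item["question"] not in asked]
print(missed)  # one core question still to return to
```

Even this simple structure makes it harder to fall victim to the Curse of the Three-Legged Dog: the interviewer always knows which core questions remain.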

Observation

What do we mean by “observation?” For our purposes, there are essentially two kinds: direct and participant observation.

Direct observation is the practice of examining or watching places, people, or activity without interfering or taking part in what’s going on. The observer is the proverbial fly on the wall, often unidentified, who does nothing but watch and record what she sees and/or hears. A direct observation to see how people use a public park, for instance, might consist of one or more observers simply sitting in one place or walking around the park for several hours, or even several days. Observers might come back at different times of day, on different days, or at different times of year, in order to understand as much as possible of what goes on in the park. They might occasionally ask questions of people using the park, but in as low-key and unobtrusive a way as possible, not identifying themselves as researchers.

Some kinds of direct observation – those where people are observed in situations they think are private – have the potential of violating privacy. In these instances, ethics generally demands that the observer obtain the permission of those being observed . In laboratory schools, for instance, where teachers are trained and new educational ideas tested, classes are often observed from behind one-way mirrors. In such cases, both the teachers and the parents of the students are generally informed that such observation may happen, and are asked to sign consent forms. They don’t know exactly when observation is taking place, but they understand that it’s part of the laboratory school environment, and are willing to allow it in order both to improve individual teachers’ skills and to foster the development of better educational methods.

Participant observation involves becoming to some extent part of the life of the people you’re observing – learning and taking part in their culture, their celebrations and rituals, and their everyday activities. A participant observer in the park above might introduce himself into the activities he observes – a regular volleyball game, winter cross-country skiing, dog walking, in-line skating – and get to know well the people who engage in those activities. He would also monitor his own feelings and reactions to using the park, in order to better understand how its users feel about it. He would probably ask lots of questions, and might well identify himself as a researcher.

An effective participant observer may take a long time (in some cases, years) to establish himself in this way. There are exceptions to this rule, of course. Some marketing firms and corporations employ trend-spotters as participant observers. Young, hip, and stylish themselves, these observers are able to identify and mingle with adolescent and young adult trend-setters in brief interactions, and determine what products, styles, and behaviors are likely to catch on soon with young people in general. You may be able to do something similar, but it helps greatly if you’re already part of the group that you’re interested in observing, or if the group, like public park users, can include anyone.

Both direct and participant observation can be useful in community assessment. A participant observer in that situation is likely to be a member of the group being observed, because of the length of time it can take to establish an outsider as a participant observer. Direct observation is probably more common as an assessment tool.

Regardless of its type, your observation should be conducted so as to be reliable.

Some guidelines for reaching that goal:

  • Think carefully about the questions you want your observation to answer . You may be looking at people’s behavior or interactions in a given place or situation, or the nature of social, physical, or environmental conditions in a particular place or circumstance. If you’re clear about what you want to find out, you can structure your observation to get the best information.
  • Where and whom should you observe to answer these questions ? You wouldn’t normally look for evidence of homelessness in the wealthiest neighborhood in town, nor would you observe the residents of an Asian neighborhood to find out something about the Hispanic population.
  • When and for how long should observation take place ? Observing commercial activity downtown on Sunday morning won’t get you a very accurate picture of what it’s actually like. You’d need to observe at both busy and slow times, and over a period of time, to get a real idea of the amount, intensity, and character of commercial activity.

What should you observe and record? That depends on the questions you’re trying to answer, but some basics include:

  • The physical characteristics of the setting(s), including weather, if outdoors.
  • The time of day, week, and year.
  • The clothing and general appearance of the people observed.
Clothing reflects the way people choose to present themselves to the world. A mohawk haircut, piercings, and black clothes represent an attitude and, to some extent, a world view, not just a fashion statement. The same is true for an expensive suit, or for an outfit of jeans, wool shirt, and hiking boots. Paying attention to such details can increase both your understanding and the reliability of your observation.
  • The activities, events, and/or places or circumstances observed, and a description of each.
  • The nature of interactions among people.
  • People’s apparent attitudes toward a place, situation, activity, or event – positive or negative, happy, confused, angry, disappointed, etc.
  • The observer’s vantage point. At a neighborhood festival, for instance, an observer could be watching from a window high above the street, from a position just at the edge of the crowd, from within the crowd and the festival goings-on, as a participant in a festival activity, or even as a festival volunteer or organizer. What she would see and hear, what she would experience, and the information she would obtain would be different from each of these viewpoints.
  • The observer’s own responses and attitudes, including the physical and psychological comfort of the observation. This should be separate from the recording of the observation itself, and, in the ideal, should not influence the objective recording of what was observed.

How do you record observations? That depends on the nature of the observation and on your resources. Video recording, unless it’s done from a concealed spot, or in a situation where such recording is expected (a tourist site, or that street festival, for example), can change people’s behavior or put the observer under some suspicion. Audio recording is much less obvious, but also provides less information, unless it’s specifically sound information that you’re seeking. In most cases, recording would be done with a notebook and pencil or with a laptop computer. If recording during the observation would be disruptive or out of place, you’d probably wait till after you had left the situation – but as soon after as possible, so as not to forget or confuse details.
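For observers recording on a laptop, the basics listed above can be kept consistent with a simple structured note format. The sketch below is a hypothetical illustration in Python (the field names are our own, not a standard instrument); note that the observer's own responses live in a separate field from the objective record, as suggested above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ObservationRecord:
    """One structured field note. Field names are illustrative only."""
    setting: str                    # physical characteristics, incl. weather
    time_of_day: str                # time of day, week, and year
    activities: List[str]           # activities, events, places observed
    interactions: List[str]         # nature of interactions among people
    apparent_attitudes: List[str]   # positive/negative, happy, angry, etc.
    # Kept separate so it does not color the objective record:
    observer_notes: List[str] = field(default_factory=list)

note = ObservationRecord(
    setting="Public park, sunny, ~25 C",
    time_of_day="Saturday mid-afternoon, July",
    activities=["pickup basketball", "skateboarding"],
    interactions=["mixed-age groups sharing courts"],
    apparent_attitudes=["relaxed", "engaged"],
)
note.observer_notes.append("Observer felt conspicuous taking notes on a bench.")
```

A format like this makes later comparison across sessions easier, since every note answers the same set of questions.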

Analyze the information.

Once you’ve gathered information by whatever qualitative method, you have to figure out what it tells you. Some of that will be obvious: if you’ve been interested in who uses that public park we were talking about earlier, and your observation tells you that it’s mostly young people, you have an answer to your initial question. Your next questions may be why other groups don’t use the park as much, and whether the fact that it’s largely used by young people keeps others away. When you’ve answered those questions, you may have generated others, or you may have a basis for planning a campaign to get more people using the park.

Make and carry out a plan to address the issue or problem you’ve identified or were concerned with.

The final step here is to use the information and analysis that came from your use of qualitative methods to change the community for the better. All the assessment in the world is useless if it doesn’t lead to some action that’s meant to create positive change.

Qualitative methods of gathering information – methods such as interviews, observation, focus groups, and community meetings that don’t always yield results that can be reduced to numbers, or that are used to capture a level of information difficult to get with quantitative methods – are often extremely useful in community assessment, especially when used together with quantitative methods, which do give numerical results. Qualitative methods can get at the things that numbers don’t, such as the reasons for people’s actions, or community history. They can help to identify community issues and needs, and provide a basis for planning community efforts that lead to long-term change.

Online Resources

The Action Catalogue is an online decision support tool intended to enable researchers, policy-makers, and others wanting to conduct inclusive research to find the method best suited for their specific project needs.

Chapter 6: Research Methods in the "Introduction to Community Psychology" describes the ecological lens in community research, the role of ethics, the differences between qualitative and quantitative research, and mixed methods research.

Harnessing Qualitative Data to Advance Health Equity is a presentation on how data has the potential both to paint an accurate picture of what sexual and intimate partner violence prevention practitioners and advocates know is happening on the ground and to convey that reality to policymakers.

Qualitative assessment of the Washington State Department of Social and Health Services goals provides a summary of the results of focus groups conducted to explore the public's perception of relevant issues. This is a summary, but you can also download a PDF of the full report.

Qualitative Methods provides brief descriptions of four standard qualitative research methods: participant observation, direct observation, unstructured interviews, and case studies.

Qualitative Research Methods is a compendium of sites with papers, links, and other resources on qualitative research methods.


Open access. Published: 13 May 2020.

Conducting operational research in humanitarian settings: is there a shared path for humanitarians, national public health authorities and academics?

Enrica Leresche (ORCID: orcid.org/0000-0003-4743-5821), Claudia Truppa, Christophe Martin, Ariana Marnicio, Rodolfo Rossi, Carla Zmeter, Hilda Harb, Randa Sami Hamadeh & Jennifer Leaning

Conflict and Health, volume 14, Article number: 25 (2020)


In humanitarian contexts, it is a difficult and multi-faceted task to enlist academics, humanitarian actors and health authorities in a collaborative research effort. The lack of research in such settings has been widely described in the past decade, but few have analysed the challenges in building strong and balanced research partnerships. The major issues include considering operational priorities, ethical imperatives and power differentials. This paper analyses in two steps a collaborative empirical endeavour to assess health service utilization by Syrian refugee and Lebanese women undertaken by the International Committee of the Red Cross (ICRC), the Lebanese Ministry of Public Health (MoPH) and the Harvard François-Xavier Bagnoud (FXB) Center.

First, based on challenges documented in the literature, we shed light on how we negotiated appropriate research questions, methodologies, bias analyses, resource availability, population specificities, security, logistics, funding, ethical issues and organizational cultures throughout the partnership.

Second, we describe how the negotiations required each partner to go outside their comfort zones. For the academics, the drivers to engage included the intellectual value of the collaboration, the readiness of the operational partners to conduct an empirical investigation and the possibility that such work might lead to a better understanding in public health terms of how the response met population needs. For actors responding to the humanitarian crisis (the ICRC and the MOPH), participating in a technical collaboration permitted methodological issues to be worked through in the context of deliberations within the wider epistemic community.

We find that when they collaborate, academics, humanitarian actors and health authorities deploy their respective complementarities to build a more comprehensive approach. Barriers such as the lack of uptake of research results or weak links to the existing literature were overcome by giving space to define research questions and develop a longer-term collaboration involving individual and institutional learning. There is the need ahead of time to create balanced decision-making mechanisms, allow for relative financial autonomy, and define organizational responsibilities. Ultimately, mutual respect, trust and the recognition of each other’s expertise formed the basis of an initiative that served to better understand populations affected by conflict and meet their needs.

This paper presents a structured analysis of a multi-disciplinary research partnership formed to assess a humanitarian response to a protracted crisis. It aims to address the challenges described in the literature regarding efforts to conduct a collective research process in a humanitarian context. This analysis derives from the experience gained through the collaborative engagement in Lebanon of a humanitarian organization, an academic centre, and a government agency. The research initiative focused on utilization of Primary Health Care (PHC) services by Lebanese and Syrian women and extended for over 4 years, from the inception of the project until the first peer-reviewed publication [ 1 ].

Research gap

Conducting operational research in the context of a humanitarian response is a difficult, challenging but still much needed enterprise [ 2 , 3 , 4 , 5 ]. Research conducted in humanitarian settings has increased but remains of insufficient quantity and quality [ 2 , 3 ]. At the same time, demands for greater accountability [ 2 , 3 , 6 , 7 ] and questions around equity in the research process are rising [ 8 , 9 , 10 , 11 ].

In the context of dispersed refugees and Internally Displaced Populations (IDPs), research efforts with strong methodologies are especially scarce [ 2 , 3 ]. There is a documented disparity between regions, with a gap noted for the Middle East [ 2 , 12 ] except for specific over-researched communities [ 13 ] and, despite key initiatives, a gap also remains on Sexual and Reproductive Health (SRH) [ 12 , 14 , 15 , 16 , 17 ].

A recent series on “health in humanitarian crises” described the need to improve the quantitative and qualitative evidence; to measure health outcomes in terms of mortality and morbidity; and to strengthen research on safe access to facilities and affected populations [ 2 , 4 , 6 , 7 ]. Analysis of processes that overcome some of the key challenges are explored in global health research [ 18 , 19 , 20 ] but to a much lesser extent in the humanitarian world [ 15 , 21 ]. Little is known on how to build formal ventures involving humanitarian actors, national stakeholders and academics in a region such as the Middle East, which has been heavily affected by conflict in the past decade [ 12 ]. Also under-researched in these settings are negotiations around resource distribution (access to grants, academic expertise, understanding of the global political context, access to the field) [ 8 , 9 , 10 ] or around cognitive and moral dynamics (notions of trust, ethical issues, direct ties with communities) [ 11 , 13 ].

10 key documented challenges of conducting research in humanitarian settings

We first conducted a scoping review of the literature in English, including a recent academic series on evidence in humanitarian settings [ 2 , 3 , 4 , 6 , 21 ], research published by humanitarian actors including methodological papers [ 5 , 22 , 23 , 24 , 25 ], academic analyses of humanitarian and global health partnerships [ 12 , 20 , 26 , 27 ], research on SRH and conflicts including the Middle East [ 14 , 15 , 16 , 17 ], as well as literature related to research ethics in conflict settings [ 8 , 9 , 10 , 11 , 13 ]. Then, we compared issues emerging from the literature with our recent experience and collectively agreed upon a set of 10 key challenges reflecting the main trade-offs we had to negotiate throughout the partnership. We discussed what important factors were at play while building practical solutions to meet each challenge. We also looked at how the same body of literature described limitations of stand-alone approaches as opposed to a partnership. Finally, we reflected on whether the ways that our partnership had been initiated, built and managed contributed to our meeting these 10 challenges, and we sought to define the main take-home findings for each partner.

The 10 key challenges documented in the literature are outlined below and will be discussed in-depth in the body of the debate.

Using the right methodology based on an appropriate research question constitutes a significant challenge [ 2 , 3 , 6 , 17 , 22 , 23 , 24 , 27 ]

Failing to account for bias, study limitations, and lack of statistical data are also seen as major shortcomings [ 2 , 12 , 15 , 21 ]

Specifying the population to be studied in conflict affected areas (including populations on the move) involves balancing issues of comprehensiveness and practicality [ 2 , 6 , 13 , 21 ]

Measuring the initial health status of displaced populations is difficult especially in conflicts of long duration where essential baseline information is usually missing [ 3 , 14 , 28 , 29 ]

Securing the functional balance of resources (such as financial, technical, human, and time) may prove daunting [ 2 , 5 , 15 , 16 , 21 , 22 , 23 , 25 , 26 ]

Adapting methodologies for field conditions becomes troublesome because lengthy prospective cohort studies or randomized controlled trials (RCTs) are difficult to conduct in unpredictable and volatile environments [ 2 , 21 , 23 ]

Distortions imposed by issues of security and logistics constrain research efforts [ 2 , 3 , 5 , 11 , 12 , 13 , 15 , 16 , 17 , 28 ]

Unstable and unpredictable funding patterns restrain the perceived scope of research [ 2 , 3 , 21 , 23 ]

Ethical issues are complex and in certain situations of marked power differentials can appear prohibitive [ 2 , 5 , 6 , 8 , 9 , 10 , 11 , 13 , 30 ]

The differences in the analytic cultures of humanitarian as compared to academic actors constitute yet another type of barrier [ 2 , 21 , 22 , 23 , 24 , 31 ]

Research driven by academics

In the past decades, several initiatives have been launched to strengthen global and humanitarian research capacity [ 2 , 3 , 7 , 15 , 18 , 26 , 32 ]. Existing academic field collaborations include partnerships between universities, non-governmental organizations (NGOs) or academic networks in relatively stable environments [ 15 , 18 , 19 , 26 ]. Recent academic proposals to promote research in the humanitarian field include establishing a global research service linked to existing coordination bodies, “probably housed by academic centres of excellence” [ 3 ]. Yet processes driven by academics in the global north often retain the main roles of accessing funding and controlling study design and analysis, while NGO field personnel or national academic partners perform the tasks of community engagement, data collection and initial analysis [ 8 , 9 , 18 , 33 ]. These patterns of power and roles carry the risk that senior academic managers lack awareness of the relevance of the research question to field operations, and that field stakeholders do not contribute substantively to the research question, data analysis and interpretation. It is also possible that implementers, fearing a diversion of resources from the beneficiaries, do not incorporate results into subsequent programs, and that key decision-makers dismiss the process as a top-down, academically driven activity, resulting in a possible weak impact on the program [ 19 , 23 , 25 , 32 , 34 ]. In unbalanced partnerships, academic institutions may be perceived as owning key ideas and results [ 8 , 18 ] and thus might miss the contributions of field actors relating to their insights on equity considerations, community engagement, policy making and benefits to the local population [ 9 , 18 , 19 , 33 ].

Research driven by humanitarian actors

In the past decades humanitarian actors such as Médecins Sans Frontières (MSF) have made significant efforts to develop research within the humanitarian sector including funding, training of staff, ethical review processes and engagement in academic debates [ 22 , 23 , 24 , 32 , 35 , 36 ]. Global partnerships such as the Structured Operational Research and Training Initiative (SORT IT) have expanded human resource capacities to conduct operational research in humanitarian settings [ 25 , 35 , 37 ]. Between 2009 and 2014, 236 participants were trained over a period of 9 to 12 months resulting in 186 manuscripts published [ 35 ]. These efforts grounded in the field, close to beneficiaries and dealing with operationally relevant questions have substantially increased the number of peer reviewed publications [ 25 , 35 , 37 ]. Major issues persist, including the absence of actors from the Middle East involved in such global initiatives [ 25 , 35 ] and challenges of maintaining a research community in conflict affected regions where access is variable [ 2 , 13 , 15 ]. Other concerns include the perception of research diverting operational funds and resources, the long time for implementation of the study results, the lack of writing skills for publication and the need for substantial mentorship [ 18 , 25 , 32 ]. Research driven by humanitarian actors also may lack the capacity or support to assess results in the context of broader concerns in the existing literature, an assessment that could contribute critical insights on local operations and local findings [ 18 , 19 , 21 , 22 ].

In humanitarian settings, insecurity, lack of social and economic supports and precarious legal status affect the population to be studied. In such contexts research outcomes have important political, financial and operational implications for a multitude of intertwined stakeholders [ 6 , 21 , 22 ]. Health authorities, international actors, and academics have different perspectives, technical capacities, resources, and priorities. It is in the interest of all, however, to ensure that the humanitarian response is appropriate, reaches the most vulnerable, and mitigates the effects of the crisis on the population and the implicated health system [ 21 , 38 , 39 , 40 ].

The first objective of this paper is to identify retrospectively key factors that allowed the ICRC, the Lebanese Ministry of Public Health (MoPH) and the Harvard FXB Center for Health and Human Rights (FXB Center) to overcome key documented barriers in a partnership to conduct field operational research in Lebanon [ 1 ].

The second objective is to explore how each partner was able to bridge the divide between humanitarian or academic driven research. Through the experience of our multi-disciplinary team in the context of Lebanon, we identify a collaboration pathway that required both academics and humanitarians to get out of their comfort zones. We discuss factors at the intersection between and among humanitarians, academics and national health authorities that need to be addressed in order to build a robust research partnership in a protracted crisis setting.

This paper does not consider the implementation of operational changes as a result of this research. We recognize that operational research in humanitarian settings has important ethical and operational dimensions related to its usefulness and implementation, especially once scarce resources have been invested to perform the research [ 5 ]. However, the focus of this work is to analyse the joint enterprise itself. The use and implementation of research results will be elaborated in a subsequent paper.

The three actors

Since 1863, the ICRC’s mandate has been to protect the life and dignity of victims of armed conflict and other situations of violence, and to provide assistance within the framework of the Geneva Conventions [ 41 ]. The ICRC has been present continuously in Lebanon since 1967. In 2015, the ICRC’s scope of activities changed significantly to support the Lebanese health system’s response to the needs of up to an estimated 1.5 million registered and unregistered Syrian refugees present in the country [ 38 , 42 ].

The MoPH and the public healthcare system in Lebanon have long been subject to political and economic unrest [ 43 , 44 ]. The MoPH was heavily affected by the Lebanese civil war and, since 2011, has been hit hard by the Syrian refugee crisis, which has left Lebanon hosting the highest per capita concentration of refugees in the world [ 38 , 43 , 44 , 45 ]. Refugees are scattered among the poorest Lebanese in informal tent settlements in rural areas or in overcrowded urban areas, including Palestinian camps [ 38 , 46 ]. Over 50% of the refugees are estimated to be women and children [ 17 , 46 , 47 ]. Syrian refugees are granted access to the same channels of healthcare as Lebanese through a network of PHC services embedded in a complex privatized system [ 43 , 44 ]. In 2017, Syrian refugees constituted half of the total beneficiaries in the MoPH network [ 48 ]. Syrian women accounted for the majority of deliveries among younger women (70.3% of all deliveries under the age of 20), with a maternal mortality ratio double that among Lebanese women in 2016 and 2017 [ 49 ]. The refugee demand for public healthcare services in Lebanon has coincided with independent efforts by the MoPH to promote domestic access to the MoPH PHC network [ 43 , 50 ]. These two combined trends might have strained important aspects of the health system.

The FXB Center is an academic institution focusing on research related to provision of health care and other rights-based supports and protections for vulnerable populations in volatile settings. It conducts action-oriented research to support policy and advocacy for the promotion of human rights and adherence to norms of international humanitarian law in contexts of armed conflict, forced migration, and widespread social distress.

Factors precipitating the partnership

Some important factors had an influence on the creation and subsequent trajectory of the partnership. Two international forums on the need to reach Every Woman and Every Child Everywhere (EWEC) in Abu-Dhabi and in Washington in 2015 nurtured discussions between the FXB Center and the ICRC around issues of measuring access to supported services at population levels [ 40 ]. Despite ICRC’s long experience in providing medical assistance to victims of armed conflict, evidence was missing on interventions that improved access measured at the population level, apart from important but relatively isolated efforts to assess quality of care mechanisms [ 51 , 52 ] or immunization campaigns [ 53 ].

For the MoPH, the observed patterns of utilization in reproductive health services raised questions about access and referrals in the existing response to the crisis. The MoPH also saw the increasing demand for health services from the refugees in the context of a progressive funding gap, which grew from 24% (US $29 million) in 2013 to 55% (US $159 million) in 2018 [ 54 ]. These unmet shortfalls forced the public system to absorb the cost, mitigated only marginally by the imposition of higher out-of-pocket expenditures for certain services. The accumulated health budget deficit in 2018 (estimated at US $15 million) led to a gradual increase in poverty for the crisis-affected populations [ 38 ], triggering operational questions on how to increase service utilization for affected Lebanese and Syrian women in particular. The heavy burden on the Lebanese public health system and the protracted characteristics of the crisis, combined with the presence of over 100 NGO partners [ 38 ], led the ICRC to re-examine its strategy for support.

Discussions with the MoPH brought all three actors in 2016 to work on a concept note as the basis for the research design. The key question became: Was the ICRC primary health care support reaching those most affected by the crisis and matching beneficiaries’ needs in terms of access, cost and appropriateness? The research aim was to evaluate ICRC support in response to the Syrian crisis, to inform evidence-based health programming, and to nurture a set of discussions around policy with the MoPH in Lebanon [ 1 ].

How 10 documented challenges were approached

The section below addresses the influence of access to resources and the management of cognitive and moral dynamics. Resources include financial means, technical skills, contextual understanding, operational experience and access to the field. Dynamics refer to building trust, maintaining transparency, creating shared motivation, and agreeing on ethical decisions. The discussions around collaborative solutions to the documented challenges are described. The power dynamics at play to negotiate the trade-offs needed to overcome the different constraints are analysed.

Using the right methodology based on an appropriate research question constitutes a significant challenge

The research question that the ICRC and the MoPH sought to answer was how to account for the low utilization of sexual and reproductive health services for women attending the ICRC supported facilities. The gap in understanding was a key and common issue discussed between the ICRC and the MoPH PHC teams at field and management levels [ 1 ]. While the field response teams (ICRC, MoPH) had a good understanding of the operational context, using the right research methodology to answer this question depended on the precision of the research question, on whether the ICRC and the MoPH could realistically answer it, and whether the academic partner understood the specificities of the context [ 22 ]. The FXB team had experience in conducting operational research with humanitarian organizations but lacked in-depth knowledge of the situation of affected populations in Lebanon in terms of existing monitoring records, constraints on unregistered refugees, and geographical as well as cultural and political specificities of the areas selected. In order to build more equal understanding, this gap was overcome through an initial two-week scoping field visit proposed by the academic partner to explore the complex humanitarian response, the nature of the Lebanese health system, the epidemiological context (including the several different populations being served) and the field operational constraints.

In this scoping visit, existing monitoring data were analysed collectively and discussed. Unmet needs in the literature and in the Lebanese context included the lack of preventive services [ 55 , 56 , 57 ], high out-of-pocket payments [ 38 , 58 ] and high use of emergency obstetric care services [ 38 ]. The research question was driven by the need to understand who was missed and why. Formulating the research question involved recognizing the relevance of diverse skills, the constructive engagement of field response personnel and the assurance that each partner’s interests would be represented and respected [ 59 , 60 , 61 ]. Precise framing of the research question was formulated in the scoping visit through conversations between and among the ICRC, the FXB team in the field, the MoPH, and Skype calls with the FXB team leader in Boston. The scoping visit was essential to discuss constraints, express expectations and start building trust in a joint leadership structure [ 8 , 59 ]. The results of the scoping visit in specifying the research question also permitted reaching a written agreement over: a) the decision to cross-analyse population-based and facility-based surveys; b) the choice of a sampling frame that would ensure that everyone would be included; c) the use of qualitative interviews to understand why people were missed; d) the decision to build a questionnaire together and e) the dissemination of findings. This agreement in effect empowered the MoPH and ICRC response actors while assuring the FXB partner on the project’s operational feasibility and technical validity. This proposal was formally signed by the three leaders of the research team and served as a statement of commitment to a process of inquiry that guided us all through the next phases of field investigation, data analysis and writing of the report.

Failing to account for bias, study limitations, and lack of statistical data are also seen as major shortcomings

In this study the joint teams attempted to resolve the trade-offs between comprehensiveness and feasibility. Together, they analysed and addressed the different possible biases (observer, selection, and those implied by decisions on statistical power). The partners agreed that the main aim was to respond to the core research question in a way that would be operationally relevant, enabling better response to unmet needs within contextual specificities and response capacity. The academic partner entered the collaboration with an interest in many different aspects relating to the registration status of the refugees, their living conditions, and their sense of human security but these questions fell outside the research focus of the response partners on specific issues of refugee and host health needs and health-seeking behaviour. Given the limitations of funding, time and security imperatives, it was agreed by consensus to make the research more strictly operational [ 5 ]. To ensure quality and appropriateness of the questions used, the questionnaires were discussed with the health team at ICRC headquarters and the MoPH, translated into Arabic, back translated into English and piloted. The questionnaires were designed using the Qualtrics software package and the data were electronically captured in a secure off-site server for statistical analysis by the academic partner [ 1 ]. The team of interviewers, mainly Lebanese Red Cross (LRC) volunteers who were paid a nominal stipend, underwent a 2-day standardized training. This training was administered by FXB Center researchers to minimize interviewer bias and emphasize the importance of respectful data collection, with a particular focus on gender, cultural and historical differences between and among populations studied. 
To obtain a more in-depth understanding of issues identified at the community level by ICRC and MoPH field teams, a qualitative component was added through Focus Group Discussions (FGDs) with key community members of both Syrian and Lebanese populations. The FGDs were audio recorded with the consent of the participants and, to ensure optimal quality, were transcribed verbatim by native Arabic speakers with medical backgrounds who were familiar with Lebanese and Syrian Arabic. This text was then translated into English by a native English speaker, a researcher from the FXB Center fluent in Arabic.

Specifying the population to be studied in conflict affected areas (including populations on the move) involves balancing issues of comprehensiveness and practicality

The difficulty of including unregistered refugees [ 38 , 58 ] in the sampling frame was overcome through the following methodological strategies discussed with the academic partner:

To capture health needs from a population perspective, a cross-sectional survey questionnaire was developed to gather data from households living in the catchment areas of ICRC-supported facilities

To understand the appropriateness of ICRC supported services, a clinic survey questionnaire using a Likert scale was developed

To include all potential sub-groups in absence of registers, specific aerial Geographic Information System (GIS) mapping tools were used and an experienced ICRC GIS officer, with technical support from the FXB team, built a two-stage cluster-based randomized mapped sample of the target population [ 1 ].
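The two-stage cluster design mentioned above can be sketched schematically. The following Python sketch uses made-up mapped clusters and is not the actual ICRC GIS procedure: the first stage draws clusters with probability proportional to their estimated size, and the second stage draws a fixed number of households within each selected cluster.

```python
import random

def two_stage_cluster_sample(clusters, n_clusters, households_per_cluster, seed=0):
    """Generic two-stage cluster sample (illustrative, not the ICRC protocol).

    clusters: dict mapping cluster id -> list of household ids.
    Stage 1: draw clusters with probability proportional to size (PPS).
    Stage 2: simple random sample of households within each chosen cluster.
    """
    rng = random.Random(seed)
    ids = list(clusters)
    sizes = [len(clusters[c]) for c in ids]
    # PPS with replacement keeps the sketch simple; real survey designs
    # often use systematic PPS without replacement instead.
    chosen = rng.choices(ids, weights=sizes, k=n_clusters)
    sample = []
    for c in chosen:
        k = min(households_per_cluster, len(clusters[c]))
        sample.extend(rng.sample(clusters[c], k))
    return sample

# Hypothetical mapped clusters (e.g. areas delineated from aerial imagery)
clusters = {f"area_{i}": [f"hh_{i}_{j}" for j in range(20 + 5 * i)]
            for i in range(10)}
sample = two_stage_cluster_sample(clusters, n_clusters=4,
                                  households_per_cluster=8)
```

Because areas rather than registers define the frame, unregistered households have the same chance of selection as registered ones, which is the point of the GIS-based approach described above.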

To make sure that respondents were protected, the study design and data-collection methods deliberately made it impossible to retrace the identity or location of any specific person responding to the questionnaire. Respondent confidentiality was further ensured using anonymized and non-linkable data, an ICRC data protection requirement that, after discussion, the academic partner accepted. The FXB Center would have preferred to collect data that included geolocation variables in order to analyse findings according to socio-economic variables. From the ICRC perspective, the risks of collecting, storing and managing GIS data were greater than the benefits to the data analysis and to the beneficiaries. GPS data were not recorded on the tablets, granting an extra layer of protection to participants’ personal data. This decision required a major negotiated trade-off between protection and precision. In these settlements, where poor Lebanese lived near poor Syrian refugees and the very poor refugees lived in informal tents, location was a proxy for socio-economic status. Without data on the location of individual informants, it was impossible to assess their responses in terms of this variable.
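The non-linkable data requirement amounts to a pre-storage step that strips direct identifiers. The sketch below is purely illustrative (the field names are invented, and this is not the ICRC's actual data-protection pipeline): identifying and GPS fields are dropped, and each record receives a random ID that cannot be traced back to a person.

```python
import secrets

# Illustrative list of fields treated as identifying; a real protocol
# would define this in a data protection impact assessment.
IDENTIFYING_FIELDS = {"name", "phone", "gps_lat", "gps_lon"}

def anonymize(record):
    """Drop direct identifiers and attach a random, non-linkable ID."""
    clean = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    # Random token, not derived from any attribute, so records cannot be
    # re-linked to individuals even with outside information.
    clean["respondent_id"] = secrets.token_hex(8)
    return clean

raw = {"name": "A. B.", "phone": "000-0000",
       "gps_lat": 33.9, "gps_lon": 35.5,
       "age_group": "20-29", "services_used": ["antenatal care"]}
safe = anonymize(raw)
```

The trade-off noted above is visible here: once the GPS fields are gone, no downstream analysis can recover a location-based proxy for socio-economic status.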

Measuring the initial health status of displaced populations is difficult, especially in conflicts of long duration, where essential baseline information is usually missing

Two issues with existing research or data availability were overcome through academic technical advice:

a) Existing sampling frames used for Syrians were based on (or calculated using) United Nations High Commissioner for Refugees (UNHCR) registers or convenience sampling [ 45 , 58 , 62 , 63 ]. These approaches potentially (partially or totally) missed an estimated half million unregistered Syrian refugees [ 38 , 58 ], while the ICRC's fundamental intent was to include all [ 1 ]. Ensuring that all vulnerable Lebanese and all Syrians, registered or unregistered, had a similar probability of being included in the study required the use of GIS sampling.

b) Given the scattered humanitarian engagement in the delivery of primary health care [ 44 ] despite the MoPH's continuous efforts [ 43 ], a crucial question was whether service gaps identified through facility-based data were covered by other actors. Therefore, the central axis of the research was to cross-analyse population-based and clinic-based information to find out if specific groups or expressed health needs were systematically missed.

Securing the functional balance of resources (such as financial, technical, human, and time) may prove daunting

In technical terms, conducting the research with existing ICRC and MoPH team members was possible only because of the specific roles, profiles and strong motivation of the participating national and international staff. It was key to engage senior team leaders who had research skills and the authority to adapt the subsequent operational response. The involvement of senior managerial staff in initial steps permitted the opening of a balanced negotiation space for the duration of the project. The background academic and public health training of MoPH and ICRC core team members allowed discussions to take place on common ground. The sound GIS capacity of the ICRC in-country team was essential for the area cluster setup. In parallel, the interaction with the FXB team constituted an opportunity for all the involved staff to learn and acquire research skills -- especially technical capacity in research ethics and field sampling methodologies.

The time trade-offs required by the field research comprised only one aspect of the time challenges baked into a project where the academic team was often in a different time zone. Yet the teams managed to stay in frequent contact via Skype calls and emails. The time allocated for coordination, planning, and research was in addition to the usual workload of all actors. The time trade-offs for the ICRC field staff were partially compensated by the opportunity to learn, to analyse the existing response through the lens of research, to deepen the understanding of operational complexities and to build research skills. Furthermore, the timing phases of the study had to be adjusted to the ICRC operational envelope, a balance between what was feasible within the ICRC field budget and what remained acceptable to the academic partner.

Adapting methodologies for field conditions becomes troublesome because lengthy prospective cohort studies or randomized controlled trials (RCTs) are difficult to conduct in unpredictable and volatile environments

In order to maintain technical standards for data collection from the field, as recommended by the academic team, a number of difficulties had to be overcome, requiring more time and skills. These issues were addressed through the complementarity of the three partners in terms of knowledge, technical capacities and continuity in key positions.

Using a GIS two-stage cluster-based population sample [ 1 ] was essential to allow inclusion of all population groups, but it required specific field visits to inform the sampling process, convey prospective information to key stakeholders in hard-to-reach areas, and conduct adequate training to ensure the proper use of geospatial maps. The FXB team’s field presence allowed the researchers to determine the parameters of the GIS sampling while incorporating the difficulty of the terrain (border areas), the complexity of the clustering methodology, and the necessity to make many adjustments in a short time frame.
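
The two-stage logic can be sketched as follows. This is a simplified illustration under stated assumptions (invented cluster names and dwelling counts, simple probability-proportional-to-size selection with replacement), not the actual GIS workflow, which relied on aerial mapping of structures:

```python
import random

def two_stage_sample(clusters, n_clusters, hh_per_cluster, seed=0):
    """Sketch of a two-stage cluster sample.

    clusters: dict mapping cluster id -> estimated number of dwellings.
    Stage 1: draw clusters with probability proportional to size (PPS),
             here with replacement for simplicity.
    Stage 2: draw a fixed number of dwelling indices within each chosen
             cluster, giving every dwelling a roughly equal overall
             probability of selection.
    """
    rng = random.Random(seed)
    ids = list(clusters)
    sizes = [clusters[c] for c in ids]
    chosen = rng.choices(ids, weights=sizes, k=n_clusters)  # stage 1: PPS
    return {c: rng.sample(range(clusters[c]), min(hh_per_cluster, clusters[c]))
            for c in chosen}                                # stage 2: equal count

# Invented example: dwelling counts as might be estimated from aerial imagery.
plan = two_stage_sample({"A": 120, "B": 80, "C": 200, "D": 50},
                        n_clusters=2, hh_per_cluster=10)
```

Drawing a fixed number of households from clusters selected proportionally to size is what keeps the overall inclusion probability roughly uniform across dwellings, the property needed to include registered and unregistered residents alike.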

Choosing to combine population- and clinic-based surveys was necessary in order to understand who was missed and what needs were unmet [ 1 ]. This design led to several complexities. First, the design required additional field visits and more manpower to meet the methodological requirements. The FXB team, in the interest of ensuring an unbiased approach to clinic surveys and to support the ICRC field research effort, proposed to have its Arabic-speaking researchers participate in the field research. The ICRC agreed that FXB researchers would augment the ICRC teams and would conduct clinic surveys. Second, this accommodation required the ICRC to engage in further field negotiations to explain why such a complex research design was necessary. Reciprocally, direct participation in the data collection required the FXB team to accommodate to the ICRC’s tight research schedule. Yet the benefits were important: The ICRC field team had the opportunity to consult in real time with the FXB researchers on questions of sampling methods and the FXB team gained deeper understanding of the complex operational, security and administrative regulations enmeshed in the work of both the ICRC and the MoPH.

Constraining research efforts are distortions imposed by issues of security and logistics

To allow sufficient time for quality data collection, in each site 16 teams of 2 interviewers each were deployed, each team conducting five 45-min interviews per day over a five-day period, for a total of 400 interviews per site and 1479 households approached [ 1 ]. Each team was supported by an ICRC team member onsite every day, resolving logistical constraints and providing guidance on sampling based on geospatial maps. The LRC volunteers relied on their own organizational hierarchy to communicate issues, which were then solved between the two program coordinators (ICRC, LRC). The scheduling burden on human resources was partially overcome by mobilizing ICRC field teams to include national and international, health and non-health, as well as management personnel. The initiative was an overall team effort, and the trade-off was the time diverted from field operations, resulting in delays in routine activities. Conversely, this extra mile was supported by ICRC management because of the expected value of achieving a better understanding of operational issues to be addressed in subsequent planning. The effort also represented an opportunity to engage in remote areas and build team relationships among senior MoPH, ICRC and FXB staff — critical to the continuity of the study between 2016 and 2019.
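
The per-site capacity implied by these figures is simple to verify; the arithmetic below is just a restatement of the numbers in the text:

```python
teams = 16                       # interview teams per site
interviews_per_team_per_day = 5  # 45-minute interviews
days = 5
minutes_per_interview = 45

interviews_per_site = teams * interviews_per_team_per_day * days
interviewing_minutes_per_team_per_day = interviews_per_team_per_day * minutes_per_interview

print(interviews_per_site)                    # 400, the per-site total reported
print(interviewing_minutes_per_team_per_day)  # 225, i.e. 3 h 45 min of interviewing
```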

While the ICRC team was relying on the FXB for technical input on methodology and analysis tools, the academic team was relying on ICRC and MoPH field staff for managing all aspects of field preparation and security access. Consequently, the joint pace of field implementation was slower than initially planned. The ICRC, based on transparency, engaged with different stakeholders including municipalities, security forces and influential community groups or leaders to explain the purpose of the study and ensure a smooth process. The FXB team also had to rely entirely on internal processes for field security and abide by ICRC and MoPH operational rules.

Unstable and unpredictable funding patterns restrain the perceived scope of research

The financial barrier was partially overcome by integrating the resources for the research into the regular ICRC 2016 and 2017 field program at an affordable rate. The direct costs of the research included only FXB team field visits. The budget did not cover additional time allocated by the FXB, ICRC or MoPH teams for off-site work or regular in-depth Skype discussions to resolve issues. Financially, the direct costs of the study represented around 10% of ICRC PHC direct costs for the program, with a potential critical return on investment in terms of refining the appropriateness of care delivered. The uncertainty in estimating the exact budget and duration of the study required mutual trust, the support of the ICRC Beirut operational management team, and an overall capacity of ICRC field actors to be flexible on budget management issues. The indirect economic costs of ICRC, MoPH and FXB contributions to this research effort, in terms of human resources and time, were not included and represent an important “sunk cost” for each partner, which each absorbed internally.

The indirect costs for the ICRC involved putting an additional burden on busy teams; the expected added value was the knowledge gained to help guide future response. Another issue involved the status of the academic partner, the ICRC headquarters, and different contractual obligations. The timing requirements at the field level in Lebanon demanded a rapid start. Hence the FXB Center relied on a flexible consultancy process, negotiated at the level of the ICRC Beirut delegation. The work could thus be conducted within the operational framework of the organization, which created real clarity and stability for the technical partner. This arrangement permitted essential operational latitude and speed but left the ICRC Geneva headquarters (HQ) distanced from the process, resulting in the need to brief the HQ health team on decisions and findings in an ex post facto mode.

Ethical issues are complex and in certain situations of marked power differentials can appear prohibitive

ICRC facility-based information for monitoring purposes relies on existing aggregated, anonymized processes, collecting personal data exclusively within the ICRC data protection frame [ 64 ]. As this study included individual and household visits and interviews, an additional ethical review was necessary, managed by the academic partner, welcomed by the ICRC and approved by the MoPH. Ethical approval was sought from the Institutional Review Board (IRB) of the Harvard T. H. Chan School of Public Health; it was granted upon submission of the study protocol, conditional on the ethical approval of the MoPH, which was subsequently received. All research staff had to adhere to the ICRC code of conduct. No internal or external participant who had not received the ethical training from the FXB Center researchers was allowed to join the interviewers’ teams.

Discussing and acceding to these ethical requirements created a strong sense of purpose and cohesion among all members of the combined research teams. Adherence to the ICRC code of conduct required the FXB team to acknowledge the field dynamics -- and only then made it possible for them to have access to the population under study. These agreements allowed each leader to manage protection, confidentiality, ethical and contextual issues within and among the respective teams. The process by which FXB and ICRC staff adapted to joint field-group dynamics based on shared expertise, equality of status, respect and interdependence was in the main a mutually enriching experience for both teams. Despite careful oversight and agreements, however, one early instance of relational tension had to be monitored, discussed and managed accordingly.

Given the time constraints of the ICRC and the MoPH and the prospect of a lengthy IRB process entailed in attempting to obtain ethical approval to interview adolescents below age 18, it was decided that adult caretakers of these younger adolescents would be sought to represent this specifically vulnerable group. The recognized trade-off in this decision was that the researchers could not capture the independent views of this younger population.

The research was conducted respecting official working hours and religious celebrations, during which the field work was stopped. Whenever people interviewed needed medical care, they were referred in accordance with standard operating procedures of the ICRC and the MoPH [ 5 , 33 ]. Granting priority to the health needs of a population under study is also a prerequisite of gaining academic IRB approval, but operationalizing this requirement usually demands considerable advance planning and negotiation with local actors. It was a significant boon, from the academic perspective, that this aspect of the field research could rely on prior pathways of referral and care.

In terms of benefits to the general population affected, the preliminary results of the study were used as the basis for substantial recommendations to re-orient the response then ongoing in the field [ 5 , 8 , 33 ]. There was a shared ethical commitment to “do no harm” to protect the response capacity of local actors beyond the time of the research [ 33 ]. This responsibility included anticipating and minimizing potential negative side-effects of the study on beneficiaries and on protecting the overall acceptance of the ICRC and MoPH among the communities. One positive side-effect of the cross-sectional population-based questionnaire was to raise awareness among all community members of ICRC supported services.

The differences in the analytic cultures of humanitarian as compared to academic actors constitute yet another type of barrier

For the academic partner, the tenth challenge surfaced at the beginning and the end of this collaborative research journey. It was embedded in the initial decisions not to maintain records of the geographic coordinates of those interviewed (thus making it impossible to correlate findings with location) and not to seek retrospectively the reasons for refusals to participate. In these instances, the priority was placed on obtaining the information that would be of use to the operational actors and on adhering to ICRC codes of conduct regarding protection of individuals. The academic partner determined that these compromises were acceptable in order to gain further insights into the dilemmas and challenges of humanitarian action in general and in the context of Lebanon.

This tenth challenge also arose very early in discussions that defined the grounds for participation of an academic centre in the research endeavour. Traditionally, universities demand sole authority over copyright and access to data gathered during a study. Fortunately, drawing on prior efforts and experience in humanitarian settings, the FXB partner had negotiated with the university that copyright and access to data sets might be joint. This prior set of efforts allowed the FXB team to enter this collaborative investigative effort with an openness to a range of modes of publication.

For the humanitarian response actors, the capacity to create a learning space to integrate analytic information from the field is challenging in a competitive humanitarian arena where the focus is on obtaining quick positive results [ 21 ]. Discussing and sharing the results at different levels, including negative results, proved critical to adapting the subsequent health response and interpreting the identified gaps as an opportunity to change rather than a sign of failure. This partnership contributed to developing a “culture of enquiry” [ 24 ] among field responders and managers to empower them to discuss practical solutions [ 6 , 24 ].

The results of the study suggested that there was a mismatch between the services supported and the expressed needs at population level [ 1 ]. The study also showed a lack of community awareness of these services [ 1 ]. These results could not have been inferred from monitoring data at the facility level only. In order to share learnings and decrease power differentials related to academic knowledge, the ICRC staff engaged in the research and the academic partner jointly developed a dissemination strategy explaining the power of the combined population- and clinic-based surveys. Progress updates were presented regularly by field teams, including FXB members, to key ICRC Beirut health and management staff affected by the findings. At the conclusion of the field research, preliminary results were presented by senior FXB and ICRC team members at the Geneva HQ and Beirut Delegation levels. Early discussion of the results allowed the ICRC Beirut health team to explain the operational implications prior to the completion of the formal internal ICRC comprehensive report.

This experience allows us to describe how the models of academic- or humanitarian- driven inquiry were accommodated in a joint response to a documented research gap. We also explore how this partnership allowed us to go beyond some of the limitations observed in the literature.

How the FXB Center went beyond academic-driven research

What drove the immediate recognition that the FXB team would be the technical partner, rather than lead the endeavour, was a combination of experience in other refugee settings and philosophy of approach. The FXB research team knew that when an effective government controlled access to the refugee populations and all health interactions were conducted through field actors, the role of academic researchers would need to be to complement these actors and their competencies. Access to the population would be mediated by those responsible for security, and the research questions would have to be of fundamental operational interest. The FXB team also knew that the extent to which issues such as appropriateness of care or observance of norms of human rights could be discussed depended on the volatility of the situation and the integrity of the humanitarian partner. The FXB team’s prior knowledge of and respect for the MoPH and the ICRC shaped the team’s confidence in entering the work as a technical partner, knowing that the ultimate result would benefit all parties, including the beneficiaries. The FXB team judged the value of the investigation, including the different insights gained in the process, to be more than worth the gap in funding not accounted for in the relatively modest envelope integrated into the Delegation budget.

Furthermore, all FXB team members knew that the partnership would be intellectually fruitful and engaging. Key members of all partners shared a high regard for the power of epidemiological inquiry and recognized that structured quantitative inquiry could yield important understandings for operations in even the most distressed settings. These shared values and skills provided a crucial bedrock on which to build the collaboration.

The FXB researchers understood in general the issues facing refugees forcibly displaced in war but the particularities of their circumstances as self-settled populations in poor communities in Lebanon were of significant specific interest to the team. In this vein, the FXB knew from past experience that such work would require time that would not be compensated. The team also anticipated that what would be learned, generally and for the academic community, would be of great value. The unequivocal requirement for sufficient time is described in global health and academic partnerships [ 18 , 23 , 26 ] and is essential if academic researchers are to engage in humanitarian settings.

The FXB and ICRC teams recognized from the outset that the issues of conflicts of interest, ownership of data, and right to publish would require in-depth discussion, keeping in mind that these issues might potentially mean that operational constraints could override a publication agenda, should the protection of or access to the affected population be at stake. The fact that lead researchers from the ICRC and MoPH were academically trained combined with the readiness of the ICRC as an institution to explore the research question made the discussion very straightforward and allowed FXB researchers to focus on supporting the ICRC’s field research.

The need to be able to move away from the prime objective of publishing results to improving the operational response was one key pillar of the discussion. Another was mutual acknowledgment of the need to discuss the interests of each partner in the use of results and the presentation of the results in modes that would support the beneficiaries.

How the MoPH and the ICRC went beyond humanitarian-driven approaches

The research question was driven by observations of the response actors (ICRC, MoPH), who wanted to understand why so few women were coming to the supported services; the academic partner was brought in to help answer that question. For the ICRC and for the MoPH, the decision to divert resources from responding to the needs of beneficiaries to conducting research was a necessary and expected challenge [ 24 , 32 ]. Research can still be perceived by humanitarians as less operational than field response, owing to the difficulty of producing relevant recommendations rapidly enough [ 5 , 22 ]. Longer-term integrated and flexible funding was essential -- often not the case with humanitarian funding cycles, even in protracted crises.

The need to be transparent in a competitive world of short planning and funding cycles also had to be part of the journey, especially when unexpected results challenged the internal capacity to learn from the research process [ 18 ]. For example, the finding that supported services were not fully utilized and that the program design did not meet the key health complaints of the population had to be transmitted and absorbed in a positive mode in order to reorient the response and feed into subsequent policy discussions [ 1 ].

Furthermore, continuity of key MoPH and ICRC staff was crucial but proved challenging in the context of rapid turnover -- a struggle identified in much humanitarian and global health research [ 18 , 22 , 28 , 32 ]. Continuity of engagement in the research ensured coherence, which is essential in building individual learning capacity [ 18 , 26 ]. Staffing changes can result in minimal uptake of findings, missed opportunities, institutional memory loss, and little return on investment [ 18 , 22 ]. When human resources are a challenge, mutual cross-task learning among team members is important and capacity building at individual levels is essential to produce sustained results [ 26 ]. Central to this partnership was the constitution of a stable core team including both ICRC and MoPH junior and senior staff.

Finally, the customized use of GIS techniques for specifying the sampling frame, the academic partner’s experience in complex sampling in conflict settings, as well as updates on recent research initiatives in the region opened a collective thinking and learning space [ 18 ], an enriching experience for the humanitarian staff. The debates over methods, processes, and results challenged the field staff and constructively changed their perspective on the research endeavour. Humanitarian staff need to develop these understandings and welcome the academic analysis that situates findings in a wider realm of intellectual debate. Although unsettling for some staff members, the task of writing a publication, from the use of rigorous methodologies to referencing the scientific literature, allowed the humanitarian team to expand their view of their own work and the efforts of the many others devoted to humanitarian response [ 1 ].

One important specific challenge for the MoPH was to undertake to seek the opinion of the community served in a context where the expressed issues might be very difficult to respond to in practical terms. To be able to deal with such concerns as trust in the public health system, for instance, would require addressing many broad historic and structural determinants. In addition, the research was conducted in areas with security situations which truncated time for follow-up -- leading to unanswered questions for the MoPH or prompting an interest in further in-depth investigation. Finally, even if it were clear to the MoPH that the community’s concerns would best be met by enacting universal health coverage through primary health care, the overwhelming question in the current Lebanese context is feasibility [ 50 ]. Yet although broad structural issues were difficult to address, the partnership confirmed the urgent need to raise awareness about the availability of good quality services, continue the expansion and improvement of services, and accelerate outreach within the population.

The ten identified challenges were all present in the collaborative research effort described here. To meet these challenges required varying degrees of compromise and adaptation for each step. Reflecting on this experience of a small and motivated research team working on this relatively modest initiative permits specification and discussion that may be of potential use to those embarking on similar partnerships in the future [ 59 , 60 , 61 ].

Abundant work went into the front end of this collaborative effort. Identifying a research issue of joint interest depended on the individual and collective capacity to share expertise and allocate time for preparedness. Considerable effort had been expended to understand the health issues of the population, data availability and gaps in knowledge. The ground had been defined, in effect, in a way that virtually called for the kind of inquiry the research team embarked upon. Fostering the involvement of a diverse group of actors (academics, humanitarians and public health authorities) at the earliest stage created a shared readiness to construct a multi-faceted understanding of the issues and their inter-relationships.

Methodological challenges proved relatively easy to negotiate because the team was small and the leaders had a sense of shared expertise and interdependence. Building a stable multi-skilled team allows all partners to mobilize their research potential. The different capacities of the actors must also be strongly aligned with a respect for the strength of epidemiologic methods. Relying on these methods -- appropriately adapted for statistical power to obtain information in varying field settings on the health needs of diverse populations -- meant that all actors learned how to discern and elucidate the crucial factors and relationships that undergird the provision of relevant services to populations in war and displacement. The research effort involved paying focused attention to teaching, de-briefing, and discussing a myriad of findings with many senior and junior participants in the research endeavour -- those in the field and those who worked more from the delegation office in Beirut. In such ways, it was possible for the core research group to embed institutional learning useful to future efforts to understand key issues by relying at least in part on empirical field-based findings.

Creating and maintaining a participative mechanism for decision-making and a transparent space for negotiation are delicate processes. The main challenges in this research project lay in those identified in the literature regarding the management of issues of security and logistics, of ethics and norms, and of organisational cultures. While the FXB team appreciated the streamlined aspects of this project insofar as the ICRC handled all the tricky and subtle aspects of security and logistics, this arrangement introduced a distance from the daily negotiations and created for FXB an unaccustomed sense of disconnection from the field dynamics. Mutual respect for specific spheres of decision-making was essential to re-setting the equilibrium and sustaining the partnership. Creating this channel of discussion begins with early acknowledgment and recognition of the different competencies of each partner. Mutual trust sustained throughout the partnership helped to account for power inequalities in particular spheres of command or expertise and permitted sharing of uncertainties. Issues of research ethics and power differentials arose during this collaboration. Because questions related to geospatial localisation were not permitted, substantial information was not obtained. But this loss, weighed against the anticipated robust results, was resolved through respectful argument. The organizational identities and responsibilities also need to support the choice of modalities for dissemination of results (publication vs. internal reports) within a more global, longer-term perspective that seeks to include the needs and requirements of all partners and builds upon their respective strengths.

The broader institutional trust and autonomy of the core research team further supported resolution of issues during the intense pace of calls that kept the communication channels clear and open. Problems could be handled in very real time over 2 years. The choice of the senior actors to maintain overall flexibility and field management permitted a temporary relative release from institutional processes. To reduce some of the complexities of collaboration between a field-based organization and an academic institution, it is suggested that relative operational and financial autonomy should be designed into the management of the research. We also recognize the importance of having funds embedded in existing institutional mechanisms in order to be able to secure the essential link between empirical research outcomes and influence on the subsequent planning phases of the response.

What academics should keep in mind

Time is needed to develop a trusting relationship, to define the relevant research questions, and to select the appropriate methodologies collectively. The research design should become part of the operational response, grounded in the operational reality. Research efforts in the humanitarian sphere should constitute an interactive part of the operational response, where prior extensive field knowledge of humanitarian contexts on the part of academic team members is key, so that research questions and results can be used for improved operational and policy response. The academic skills developed by both ICRC and MoPH staff prior to this joint effort facilitated in-depth technical discussions, allowing active engagement in prioritizing key issues and selecting appropriate research tools, such as the GIS mapping. Academic partners should develop the ability to engage on an equal footing with humanitarian and health authority actors in order to provide rapid preliminary results useful for important operational decisions, to nurture operational thinking with updates from the broader literature, and to receive feedback on early results.

What humanitarians should keep in mind

Embedding operational research in humanitarian operations will come at the expense of either the research or the operations unless there is a pre-defined commitment of time and financial resources to support all partners in the research. Allocated resources are needed to allow each team member to contribute to the discussion through the lens of his or her specific competencies and to engage in challenging discussions, always tethered to the need to solve operational issues. Adequate time to work together should be factored into the routine work of health authorities and humanitarian actors if such joint initiatives are meant to be sustained. Time is needed for joint adaptations, for creating a shared vision, for securing continued funding and for anticipating the next phases of research work.

Field partners should allow the academic thinking and analytical process to take place, involving field personnel as the results take shape. Collaborative processes with academic partners can accelerate the integration of research findings into the operational and policy reality, linking early results with planning processes. Allowing key staff to remain part of the research process irrespective of their field assignment can be very helpful, as it supports a longer personal learning process and perspective. The link to an updated body of broader academic literature, writing skills and technical tools is difficult to maintain in the humanitarian setup and can be nurtured and developed together with academic partners.

Joint research involving field actors and academics has the potential to contribute to improved responses for the most vulnerable people affected by complex protracted crises if it is conducted with proper resources, mutual respect for competencies and constraints, and trust in a shared vision. As the challenges of sustaining effective humanitarian operations in conflict settings increase, it is only prudent to consider how to marshal the resources of research partnerships to help define these challenges and to suggest operational interventions that make the humanitarian response tighter, more equitable and ultimately more effective.

Availability of data and materials

The data that support the findings of the primary field study and the debate are available from the ICRC, but restrictions apply to their availability due to the confidentiality of the information, so they are not publicly available. Data are, however, available from the authors upon reasonable request and with the permission of the ICRC.

Abbreviations

EWEC: Every Woman and Every Child Everywhere

FGD: Focus Group Discussions

FXB: François-Xavier Bagnoud

GIS: Geographic Information System

GPS: Global Positioning System

HQ: Headquarters

ICRC: International Committee of the Red Cross

IDP: Internally Displaced Populations

IRB: Institutional Review Board

LRC: Lebanese Red Cross

MoPH: Ministry of Public Health

MoSA: Ministry of Social Affairs

MSF: Médecins Sans Frontières

NGO: Non-Governmental Organizations

RCT: Randomized Controlled Trials

SORT IT: Structured Operational Research and Training Initiative

SRH: Sexual and Reproductive Health

UNHCR: United Nations High Commissioner for Refugees

References

Truppa C, et al. Utilization of primary health care services among Syrian refugee and Lebanese women targeted by the ICRC program in Lebanon: a cross-sectional study. Confl Health. 2019;13(1):7.


Blanchet K, et al. Evidence on public health interventions in humanitarian crises. Lancet. 2017;390(10109):2287–96.


Checchi F, et al. Public health information in crisis-affected populations: a review of methods and their use for advocacy and action. Lancet. 2017;390(10109):2297–313.

Samarasekera U, Horton R. Improving evidence for health in humanitarian crises. Lancet. 2017;390(10109):2223–4.

Ford N, et al. Ethics of conducting research in conflict settings. Confl Health. 2009;3(1):7.

Waldman RJ, Toole MJ. Where is the science in humanitarian health? Lancet. 2017;390(10109):2224–6.

Blanchet K, Duclos D. Research evidence in the humanitarian sector: a practice guide. Blanchet K, Allen C, Breckon J, Davies P, Duclos D, Jansen J, Mthiyane H, Clarke M, editors; 2018.


Sibai A, et al. North–south inequities in research collaboration in humanitarian and conflict contexts. Lancet. 2019;394:1597–600.

Sukarieh M, Tannock S. Subcontracting academia: alienation, exploitation and disillusionment in the UK overseas Syrian refugee research industry. Antipode. 2019;51(2):664–80.

Hedt-Gauthier B, et al. Academic promotion policies and equity in global health collaborations. Lancet. 2018;392(10158):1607–9.

Bowsher G, et al. A narrative review of health research capacity strengthening in low and middle-income countries: lessons for conflict-affected areas. Glob Health. 2019;15(1):23.

Singh NS, et al. Evaluating the effectiveness of sexual and reproductive health services during humanitarian crises: A systematic review. PLoS One. 2018;13:7.

Pascucci E. The humanitarian infrastructure and the question of over-research: reflections on fieldwork in the refugee crises in the Middle East and North Africa. Area. 2016;49:249.

Bhutta ZA, et al. Protecting women and children in conflict settings. BMJ. 2019;364:l1095.

DeJong J, et al. Health research in a turbulent region: the Reproductive Health Working Group. Reprod Health Matters. 2017;25(sup1):4–15.

Onyango MA, Heidari S. Care with dignity in humanitarian crises: ensuring sexual and reproductive health and rights of displaced populations. Reprod Health Matters. 2017;25(51):1–6.

DeJong J. Challenges to understanding the reproductive health needs of women forcibly displaced by the Syrian conflict. J Fam Planning Reprod Health Care. 2017;43(2):103.

Olivier C, Hunt M, Ridde V. NGO–researcher partnerships in global health research: benefits, challenges, and approaches that promote success. Dev Pract. 2016;26:444–55.

Delisle H, et al. The role of NGOs in global health research for development. Health Research Policy and Systems. 2005;3(1):3.


Chu KM, et al. Building Research Capacity in Africa: Equity and Global Health Collaborations. PLoS Med. 2014;11:3.

Colombo S, Pavignani E. Recurrent failings of medical humanitarianism: intractable, ignored, or just exaggerated? Lancet. 2017;390:2314–24.

Zachariah R, et al. Is operational research delivering the goods? The journey to success in low-income countries. Lancet Infect Dis. 2012;12(5):415–21.

Zachariah R, et al. Operational research in low-income countries: what, why, and how? Lancet Infect Dis. 2009;9(11):711–7.

Zachariah R, Draquez B. Operational research in non-governmental organisations: necessity or luxury? Public Health Action. 2012;2(2):31.

Kumar AMV, et al. Operational research capacity building using ‘the Union/MSF’ model: adapting as we go along. BMC Res Notes. 2014;7:819.

Sewankambo N, et al. Enabling dynamic partnerships through joint degrees between low- and high-income countries for capacity development in Global Health research: experience from the Karolinska Institutet/Makerere University Partnership. PLoS Med. 2015;12(2):e1001784.

Glass RI. How can we conduct research in humanitarian crises? Global Health Matters. 2017;16:5.

Keus K, et al. Field research in humanitarian medical programmes: treatment of a cohort of tuberculosis patients using the Manyatta regimen in a conflict zone in South Sudan; 2008.

DeJong J, et al. Reproductive, maternal, neonatal and child health in conflict: a case study on Syria using Countdown indicators. BMJ Glob Health. 2017;2:3.

O'Mathuna D, Siriwardhana C. Research ethics and evidence for humanitarian health. Lancet. 2017;390(10109):2228–9.

Raimondo E. The power and dysfunctions of evaluation systems in international organizations. Evaluation. 2018;24(1):26–41.

Zachariah R, et al. Conducting operational research within a non governmental organization: the example of Médecins Sans Frontières. Int Health. 2010;2(1):1–8.


Parker M, Kingori P. Good and bad research collaborations: researchers’ views on science and ethics in Global Health research. PLoS One. 2016;11(10):e0163579.


Harries AD. Operational research: getting it done and making a difference. Public health action. 2012;2(1):1–2.

Zachariah R, et al. Building global capacity for conducting operational research using the SORT IT model: where and who? PLoS One. 2016;11(8):e0160837.

Zachariah R, et al. Building leadership capacity and future leaders in operational research in low-income countries: why and how? Int J Tuberc Lung Dis. 2011;15(11):1426–35 i.

Tripathy JP, et al. Does the structured operational research and training initiative (SORT IT) continue to influence health policy and/or practice? Glob Health Action. 2018;11(1):1500762.

GoL and UN, Lebanon Crisis Response Plan 2017–2020 (2018 update). 2018.

ICRC. Protracted conflict and humanitarian action: some recent ICRC experiences. Geneva: International Committee of the Red Cross; 2016.

UN. The global strategy for women's, children's and adolescents' health (2016–2030). Every Woman Every Child; 2015.

ICRC. The ICRC: its mission and work; 2009.

GoL and UN, Lebanon Crisis Response Plan 2015–2016. https://www.unocha.org/sites/dms/CAP/2015-2016_Lebanon_CRP_EN.pdf , Dec 2014.

Ammar W, et al. Health system resilience: Lebanon and the Syrian refugee crisis. J Glob Health. 2016;6(2):020704.

Blanchet K, Fouad FM, Pherali T. Syrian refugees in Lebanon: the search for universal health coverage. Confl Health. 2016;10:12.

UNHCR, UNICEF, WFP. Vulnerability Assessment of Syrian Refugees in Lebanon. United Nations and GoL, 2018.

Parkinson SE, Behrouzan O. Negotiating health and life: Syrian refugees and the politics of access in Lebanon. Soc Sci Med. 2015;146:324–31.

LCRP, Lebanon crisis response plan 2017–2020 (2019 update). Government of Lebanon and United Nations, 2019.

MoPH. Emergency Primary Health Care Restoration Project (EPHRP) dashboard Jan–Dec 2017. Republic of Lebanon Ministry of Public Health; 2017.

MoPH, Vital Data Observatory (VDO) dashboard 2018. Republic of Lebanon Ministry of Public Health, 2018.

Hemadeh R, Hammoud R, Kdouh O. Lebanon's essential health care benefit package: a gateway for universal health coverage. Int J Health Plann Manag. 2019;34(4):e1921–36.

du Mortier S, Arpagaus M. Quality improvement programme on the frontline: an International Committee of the Red Cross experience in the Democratic Republic of Congo. Int J Qual Health Care. 2005;17(4):293–300.

Bernasconi A, et al. Can the use of digital algorithms improve quality care? An example from Afghanistan. PLoS One. 2018;13(11):e0207233.

Rossi R, et al. Vaccination coverage cluster surveys in middle Dreib – Akkar, Lebanon: comparison of vaccination coverage in children aged 12-59 months pre- and post-vaccination campaign. PLoS One. 2016;11(12):e0168145.

Health Inter-Agency Coordination Lebanon. Health 2018 Dashboard, 2018.

Benage M, et al. An assessment of antenatal care among Syrian refugees in Lebanon. Confl Health. 2015;9:8.

Reese Masterson A, et al. Assessment of reproductive health and violence against women among displaced Syrians in Lebanon. BMC Womens Health. 2014;14(1):25.

Tappis H, et al. Maternal health care utilization among Syrian refugees in Lebanon and Jordan. Matern Child Health J. 2017;21(9):1798–807.

Lyles E, et al. Health service utilization and access to medicines among Syrian refugee and host community children in Lebanon. J Int Hum Action. 2016;1(1):10.

Beran D, et al. Partnerships in global health and collaborative governance: lessons learnt from the division of tropical and humanitarian medicine at the Geneva University Hospitals. Glob Health. 2016;12(1):14.

Oliver K, Kothari A, Mays N. The dark side of coproduction: do the costs outweigh the benefits for health research? Health Res Pol Syst. 2019;17(1):33.

Oliver K, Pearce W. Three lessons from evidence-based medicine and policy: increase transparency, balance inputs and understand power. Palgrave Communications. 2017;3(1):43.

Kukreti N. Poverty, inequality and social protection in Lebanon. Oxfam and Issam Fares Institute of American University Beirut; 2016.

Vulnerability Assessment of Syrian Refugees in Lebanon. United Nations (UNHCR, UNICEF, WFP), 2017. https://reliefweb.int/sites/reliefweb.int/files/resources/VASyR%202017.compressed.pdf .

ICRC. ICRC rules on data protection. Geneva: ICRC publication; 2016.


Acknowledgements

We thank the ICRC Head of Region for the Near and Middle East (NAME), Fabrizio Carboni, for his continuous support throughout the research process in Beirut and in Geneva.

Special thanks go to Esperanza Martinez, Marie-Thérèse Pahud and Stéphane du Mortier of the ICRC Health Unit for the support provided throughout the study from the ICRC headquarters in Geneva.

We thank Kathleen Hamill from the Harvard FXB Center for Health and Human Rights for the fruitful discussions and her contribution to the birth of the idea of this study.

We thank Manal Alabduljabbar for training the interviewers and conducting the focus group discussions in the initial field study.

We thank all the academic staff who contributed to the initial field study: Arlan F. Fueller and Josyann Abisaab from the Harvard François Xavier Bagnoud Center for Health and Human Rights, and Warda S. Toma from the University of British Columbia, Vancouver, Canada.

We thank all the ICRC staff members who contributed to the initial field study: Nicole El Hayek, Faten Al Ali, Mahmoud Al Wais, Charbel Elia, Aya El Khatib, Mohammad Jajieh, Alli Miikkulainen, Elsa Ragasa Hernandez, Kinda Khamasmieh, Margarita Rodas Iglesias, and Dima Touhami.

A wholehearted thank you to all the Lebanese Red Cross volunteers who have conducted the interviews for the household survey in the initial field study.

We thank the reviewers for their rich feedback, which allowed us to integrate notions of power balance and equity into the analysis.

The primary field study was entirely funded through the Beirut Delegation budget of the ICRC. No external funding was received for this debate.

Author information

Authors and affiliations

International Committee of the Red Cross Delegation, Beirut, Lebanon

Enrica Leresche, Claudia Truppa, Christophe Martin & Carla Zmeter

Harvard TH Chan School of Public Health, Boston, USA

Ariana Marnicio

International Committee of the Red Cross, Geneva, Switzerland

Rodolfo Rossi

Lebanese Ministry of Public Health, Beirut, Lebanon

Hilda Harb & Randa Sami Hamadeh

Harvard François Xavier Bagnoud Center for Health and Human Rights, Boston, USA

Jennifer Leaning


Contributions

EL participated in the definition of the research question and the study methodology, framed the literature review, structured the debate and contributed to the writing of all sections of this paper. CT participated in the definition of the research question and the study methodology, framed the literature review, structured the debate and contributed to the writing of all sections of this paper. CM participated in the definition of the research question, participated in the debate and contributed to the writing of all sections of this paper. AM participated in the definition of the research question, participated in debate and contributed to the writing of all sections of this paper. RR participated in the definition of the research question and the study methodology, structured the debate and contributed to the writing of all sections of this paper. CZ participated in the debate and writing of all sections of this paper. HH participated in the definition of the research question and the study methodology, participated in the debate and contributed to the writing of all sections of this paper. RH participated in the definition of the research question and the study methodology, participated in the debate and contributed to the writing of all sections of this paper. JL participated in the definition of the research question and the study methodology, framed the literature review, structured the debate and contributed to the writing of all sections of this paper. All authors approved the final manuscript.

Corresponding author

Correspondence to Enrica Leresche.

Ethics declarations

Ethics approval and consent to participate

Ethical approval for the primary field study was sought and obtained from the Institutional Review Board (IRB) of Harvard University and, at the national level, from the Lebanese MoPH and MoSA. At each site, local authorities, including heads of municipalities and military intelligence, were informed of the objectives and methodology of the study. Teams of interviewers (mainly Lebanese Red Cross volunteers) who had undergone a two-day standardized training administered the questionnaire. All interviewers were trained in research ethics principles and instructed to read an information sheet about the study to all eligible participants and to request oral informed consent before proceeding to the interview. All participants provided oral consent to participate, which was recorded through the electronic data capture software used to conduct the interviews.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Leresche, E., Truppa, C., Martin, C. et al. Conducting operational research in humanitarian settings: is there a shared path for humanitarians, national public health authorities and academics?. Confl Health 14 , 25 (2020). https://doi.org/10.1186/s13031-020-00280-2


Received : 27 November 2019

Accepted : 05 May 2020

Published : 13 May 2020

DOI : https://doi.org/10.1186/s13031-020-00280-2


Keywords

  • Humanitarian response
  • Operational research
  • Research partnership
  • Protracted crisis
  • Evidence-based humanitarian action
  • Co-production

Conflict and Health

ISSN: 1752-1505

qualitative and quantitative research techniques for humanitarian needs assessment


New tools for assessing humanitarian needs

The Good Enough Guide to Humanitarian Needs Assessment (GEGA) has been developed by the Emergency Capacity Building (ECB) Project and the Assessment Capacities Project (ACAPS). It provides a comprehensive framework for needs assessment, accompanied by a set of practical tools.

This guide is primarily targeted at field staff tasked with carrying out assessments, specifically project staff and their managers. Senior staff needing to understand what assessments involve are a secondary audience.

Sphere for assessments

Sphere 4 Assessments is geared to assessment teams in the field, managers implementing organisation-wide assessment strategies, and coordinators developing and implementing joint assessments.

This tool is part of the ongoing series of “Sphere 4” guides that will cover the humanitarian programming cycle, including assessment, programming, monitoring and evaluation.

Copies of the guides are available on request to those interested in contributing to the piloting process (see details at the end of this article).

Both guides agree that the use of commonly agreed indicators in humanitarian needs assessment will contribute to greater coherence and coordination at the national level and in the humanitarian sector overall.

Both follow the assessment cycle from assessment preparedness through to information-sharing and learning.

The GEGA provides an overall framework and practical tools to manage the assessment cycle, while Sphere 4 Assessments provides more detailed guidance on how to ensure that standards and indicators are incorporated into the assessment process.

Both guides strive to improve the sector’s competence in making assessments and are based on best practice. The GEGA provides a practical framework for this, while Sphere 4 Assessments provides more detailed guidance in the area of standards and indicators.

Both resources are platform-neutral and will be useful for staff in any organisation, regardless of the specific assessment approach used by that organisation.
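As a toy illustration of working with a commonly agreed indicator, the sketch below flags sites whose water supply falls short of the Sphere minimum water quantity of 15 litres per person per day; the site names and figures are invented for illustration.

```python
# Sphere water supply standard: minimum average water use, in litres
# per person per day.
SPHERE_MIN_LPD = 15

# Invented site figures: total litres supplied per day and population served.
sites = {
    "camp_a": {"litres_per_day": 120_000, "people": 9_000},
    "camp_b": {"litres_per_day": 75_000, "people": 4_500},
}

def below_standard(sites, minimum=SPHERE_MIN_LPD):
    """Names of sites whose per-capita supply falls below the minimum."""
    return sorted(
        name for name, s in sites.items()
        if s["litres_per_day"] / s["people"] < minimum
    )

print(below_standard(sites))  # ['camp_a']  (about 13.3 L/person/day)
```

Expressing an indicator as a shared threshold like this is what lets different agencies' assessments be compared and aggregated.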

Sphere 4 Assessments complements the GEGA in three ways:

The GEGA is targeted at assessment staff with limited or no assessment experience, while Sphere 4 Assessments addresses a higher level of competence and provides targeted guidance on how to work with Sphere principles and standards.

The GEGA describes various assessment techniques and tools, while Sphere 4 Assessments provides specific content to work with in the area of standards and indicators.

Sphere 4 Assessments is relevant for in-house guidance on the use of assessment indicators, regardless of how much of the GEGA resource is incorporated into an agency assessment.

After the pilot phase, appropriate cross-referencing will be included in the final versions.

  • Would you like to pilot Sphere 4 Assessments ? Please email the Sphere Project office
  • Would you like to pilot the GEGA? Please download it from the ACAPS website  


Introduction to Humanitarian Needs Assessments and Analysis

This course is already over.

The course aims to introduce the processes, skills and knowledge required to design and implement Assessment and Analysis (A&A) processes, with a particular focus on making sense of the information and reports coming from field locations in humanitarian emergencies. It draws on the broad experience of ACAPS (initially "The Assessment Capacities Project"), using ACAPS Technical Briefs and the Humanitarian Needs Assessment – The Good Enough Guide (GEGA) as backdrops. The pedagogical approach mixes introductory presentations and group work, including work on case studies.

The course targets headquarters-based staff in humanitarian organisations. Consequently, the discussion will centre on assessment methods and approaches that use secondary data and open-access data analysis.

The course is planned as face-to-face training on 22 and 23 April 2024.

You will learn

how to describe main activities related to the various steps of the analysis workflow and the analysis spectrum

how to use simple techniques to collate, process and analyse quantitative and qualitative data

how to use the different steps of the analysis spectrum in humanitarian analysis to ensure better sense making of findings
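As a minimal sketch of the kind of simple collation and analysis techniques mentioned above, the snippet below summarises a mix of quantitative and qualitative data; the records, field names and coded themes are invented for illustration.

```python
from collections import Counter
from statistics import median

# Invented interview records mixing a quantitative field (household_size)
# with qualitative data (themes coded from open-ended answers).
records = [
    {"site": "A", "household_size": 5, "themes": ["water access", "shelter"]},
    {"site": "A", "household_size": 7, "themes": ["water access"]},
    {"site": "B", "household_size": 4, "themes": ["health care", "water access"]},
    {"site": "B", "household_size": 6, "themes": ["shelter"]},
]

def summarise(records):
    """Collate records into one quantitative and one qualitative summary."""
    sizes = [r["household_size"] for r in records]
    themes = Counter(t for r in records for t in r["themes"])
    return {
        "median_household_size": median(sizes),
        "top_theme": themes.most_common(1)[0][0],
    }

print(summarise(records))
```

Coding open answers into themes before counting them is one common way to bring qualitative material into the same summary as numeric fields.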

Target group

Staff of humanitarian NGOs with previous experience in humanitarian action and/or project management. Focus on staff in Germany who manage projects or programs remotely.

  • Basic principles & Assessment & Analysis (A&A) design
  • Exercise using Analysis Canvas
  • Humanitarian needs analysis & analysis frameworks
  • Exercise on analysis planning

Rolf M. Bakken, Consultant in Disaster Management and Humanitarian Affairs


Rolf has 20 years of experience with disaster management and humanitarian assistance, in particular with emergency response, needs analysis, and coordination of disaster operations. He works with methodology development and training-design focusing on participatory learning methods and action-based learning. He has worked for OCHA/UNDAC in both complex emergencies and emergency responses to sudden onset disasters. Rolf worked for ACAPS as an assessment expert on the Syria Needs Analysis Project in 2013-14, and in subsequent years as a Senior Analyst on the response to Cyclone Idai in Mozambique 2019, the Beirut Port Explosion in 2020, and for the ACAPS Ukraine Analysis Hub. His last mission was to the Türkiye earthquake in 2023 and he is currently the Project Manager for the ACAPS Syria/Türkiye EQ project.

ACAPS (2014): Humanitarian Needs Assessment. The Good Enough Guide.

22.4.2024 - 23.4.2024

14 hours of training

Catering included

Maximum of 16 attendees

14 of the 10 required registrations have been received for this course. You will be informed after registration. Share this course with friends and colleagues to help ensure it takes place.


Improving humanitarian needs assessments through natural language processing



Network on Humanitarian Action International Association of Universities

Methodology and Research Methods in Humanitarian Studies

Course description, learning outcomes, teaching and learning methodology, assessment methods and criteria, required reading.

Last updated: 16 January 2018


A.5 Assessment Content and Scope

The hygiene promotion (HP) assessment should identify the main public health risks and the current hygiene practices that contribute to those risks (A.2). It should determine which individuals and groups are vulnerable to which WASH-related risks, and why (A.7 and chapter E). It should also identify the factors that can both hinder and motivate positive behaviours and preventive action (A.2 and chapter B).

Assessing WASH-related public health risks and how to address them will require an understanding of:

  • Current use of WASH facilities and services,
  • Access to essential household hygiene items P.6,
  • Current coping strategies, local customs and beliefs,
  • Social structures and power dynamics in the community A.7 ,
  • Where people go for healthcare (including traditional healers, pharmacies, clinics),
  • Who is responsible for operating and maintaining WASH infrastructure,
  • Disease surveillance data linked to WASH,
  • Social, physical and communication barriers to accessing WASH facilities and services, particularly for women and girls, older people and persons with disabilities,
  • Income-level variations,
  • Environmental conditions and seasonal trends for diseases.

The assessment must also try to understand the social and behavioural factors (chapter B) that influence different people's hygiene practices and how these can be used to influence change, as shown in the figure below.

It will also need to identify the communication preferences of different groups to design an effective response (chapter  C  ).

Process & Good Practice

Consider different community groups (e.g. men, women, adolescents, elders and people with disabilities) and identify those who are marginalised or particularly vulnerable ( A.7 and chapter  E  ). 

Recognise that the affected community are ‘experts’ in their situation and have knowledge to share. 

Remember the importance of communicating with people in their language and ensure that interpreters are well briefed before commencing the assessment C.7 . 

Use the ‘F’ diagram, influences on health graphic A.2 and social and behaviour change models B.2 to help identify a broad range of assessment factors and continue to deepen understanding as the programme progresses.

Aim to answer the following ten questions through the HP assessment: 

  • What were ‘normal’ practices before the emergency and how have people adapted to the emergency? 
  • What are the widespread ‘risky’ practices in the community? 
  • What are the different motivators and barriers to practising safer hygiene for different groups? 
  • How can we enable changes in practice and improvements in hygiene? 
  • Who uses ‘safe’ practices and who and what motivates and influences them to do so – can this be used to influence others? 
  • What communication channels are available and which are trusted for promoting hygiene? 
  • What facilities or materials do people need in order to carry out the ‘safe’ practices? 
  • How much time, money or effort are people willing to contribute for those facilities/materials? 
  • Where will those facilities/materials be available? 
  • How will people know that the facilities/materials exist and where they can be obtained?
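As a small sketch of how a team might track coverage of these ten questions across sites, the structure below is hypothetical and not part of any HP guideline; the site names and sample answers are invented.

```python
# Short labels for the ten HP assessment questions listed above.
QUESTIONS = [
    "pre-emergency practices and adaptations",
    "widespread risky practices",
    "motivators and barriers per group",
    "how to enable change",
    "who uses safe practices and why",
    "trusted communication channels",
    "facilities and materials needed",
    "willingness to contribute time, money or effort",
    "where facilities and materials will be available",
    "how people will learn about them",
]

# Answers collected so far, keyed by site, then by question index.
answers = {
    "site_a": {0: "routines disrupted by displacement", 1: "open defecation near wells"},
    "site_b": {},
}

def open_questions(site):
    """Return the questions still unanswered for a site."""
    answered = answers.get(site, {})
    return [q for i, q in enumerate(QUESTIONS) if i not in answered]

print(len(open_questions("site_a")))  # 8 of the 10 remain open
```

Keeping the unanswered questions visible per site makes it easy to see where the assessment still has gaps.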

Use different senses to gather information. It is not enough to just ask questions; use other senses such as Observation T.28 to cross check and deepen understanding. 

Some questions about hygiene can seem intrusive and there may be taboos about some issues e.g. menstrual hygiene P.7 . It can be useful to ask such questions indirectly such as ‘what do women here do?’ rather than ask ‘what do you do?’

The purpose of the assessment is to identify which individuals and groups are vulnerable to which WASH-related risks, and why.

The assessment must cover public health risks, WASH needs, hygiene behaviour, communication preferences and identify how different groups can best be supported.

It is important to understand the complexity of the affected community ( A.7 and chapter  E  ) and to identify the different hygiene needs that may be present in a given context (e.g. for menstrual hygiene materials, incontinence aids or child-friendly toilets).

The use of Behaviour Change B.2 and WASH models can help to ensure a more in-depth assessment leading to a more effective response. These models should be employed throughout the response.

General and technical standards in relation to carrying out assessments (including a WASH checklist)

Sphere Association (2018): The Sphere Handbook: Humanitarian Charter and Minimum Standards in Humanitarian Response  4th Edition

Assessment checklists

CAWST (2021): Behaviour Change Checklist

Rosato-Scott, C., Barrington, D. et al. (2020): How to Talk About Incontinence: A Checklist , IDS

Qualitative and quantitative research techniques to collect, collate, analyse, and synthesise information for humanitarian needs assessment

ACAPS (2012): Qualitative and Quantitative Research Techniques for Humanitarian Needs Assessment. An Introductory Brief

How-to guides for barrier analysis using the doer/non-doer method

Kittle, B. (2017): A Practical Guide to Conducting a Barrier Analysis  2nd Edition, Helen Keller International

Davis, J., Thomas, P. (2010): Barrier Analysis Facilitator’s Guide: A Tool for Improving Behavior Change Communication in Child Survival and Community Development Programs , Food for the Hungry



Data Collection and Sources

A combination of different sources and types of data is required to design and plan a sanitation response. Sources of information include both primary and secondary data . Types of information include qualitative and quantitative data . It is important to select the most appropriate methods in order to conduct timely, relevant and effective assessments.  

  • Be clear about what you need to know at each stage of the response and in what detail (in the acute response half of the whole picture may be better than the whole of half the picture).
  • Consider using a checklist (such as the Sphere WASH assessment checklist, see key resources), not necessarily as a questionnaire format but to ensure nothing gets overlooked.
  • Weigh up the advantages of qualitative and quantitative methods relative to your assessment purpose to decide which methods are appropriate and when in the response.
  • Use a combination of methods that are both quantitative (e.g. how many functioning toilets are in operation?) and qualitative (e.g. how do women feel about going to the toilet at night and what barriers to access do they face?).
  • Gather secondary data . Common sources of quantitative secondary data are the Demographic and Health Survey , the Multi-Indicator Cluster Survey and any existing Knowledge, Attitude and Practice (KAP) or Knowledge, Practices and Coverage (KPC) reports, as well as current mortality, morbidity and other epidemiological data from the health sector.
  • KPC and KAP surveys are the most common quantitative methods used in the humanitarian WASH sector to assess, plan, monitor and evaluate WASH programmes but may not always be feasible in the acute phase of an emergency.
  • Choose between collecting primary data using pen and paper or tablets. First, consider the most convenient method for the affected population and then which collection method will allow quick and accurate analysis of the collected data.
  • Ensure that the competencies needed for quantitative methods are in place: specific skills and data are required (e.g. household lists in villages) to ensure the validity of the results.
  • Plan using the four basic steps to analyse qualitative data : (1) Organise data, (2) Shape or code the data, (3) Interpret and summarise the information and (4) Explain the information.
  • Triangulate information using different methods and sources and cross-check findings to minimise the bias of using only one method and increase the reliability of the data.
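The four qualitative-analysis steps listed above (organise, code, interpret, explain) can be sketched in a few lines of code. This is an illustrative outline only; the free-text responses and theme keywords below are hypothetical, and real qualitative coding would be done by an analyst rather than by keyword matching.

```python
from collections import defaultdict

# Hypothetical free-text responses from focus group discussions
responses = [
    "The toilets are too far away at night",
    "There is no light near the latrines",
    "We worry about safety after dark",
    "The water point queue is very long",
]

# Step 1: Organise the data (here: a single cleaned list)
organised = [r.strip().lower() for r in responses]

# Step 2: Shape/code the data by assigning themes (simple keyword matching)
theme_keywords = {
    "night-time safety": ["night", "dark", "light", "safety"],
    "access/distance": ["far", "queue", "long"],
}
coded = defaultdict(list)
for r in organised:
    for theme, words in theme_keywords.items():
        if any(w in r for w in words):
            coded[theme].append(r)

# Step 3: Interpret and summarise - count how often each theme appears
summary = {theme: len(items) for theme, items in coded.items()}

# Step 4: Explain - report the dominant themes
for theme, count in sorted(summary.items(), key=lambda kv: -kv[1]):
    print(f"{theme}: mentioned in {count} response(s)")
```

Note that one response can carry more than one theme, which is normal in qualitative coding and one reason why theme counts should not be presented as percentages without explanation.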

Data collection is an essential part of an initial sanitation assessment. It identifies critical needs (e.g. the number of people affected and specific vulnerabilities), key stakeholders and contextual information (e.g. soil and groundwater conditions, local sanitation standards and legislation, and weather and climate conditions). It also covers the existing sanitation infrastructure and management arrangements (including gaps, access issues, hazards, damage and the overall risks to public health), current sanitation practices and cultural habits that might affect sanitation preferences (e.g. sitting or squatting, anal cleansing practices), and the resources and local capacities available to lead or support the response.

As the situation in emergencies is often dynamic and can change rapidly, information must be collected continuously and existing information revalidated.

Information should be disaggregated and collected from as many different gender, diversity and age-balanced sources as possible and triangulated. The assessment should be coordinated and supervised by experienced WASH professionals and, preferably, undertaken by or with local actors familiar with the context and who speak the local language. Ideally, the team should be gender-balanced.

Primary data is gathered directly from the affected population. It is collected by assessment teams through fieldwork, most often through direct Interviews or Discussions with members of the affected community . It may also be gathered through other methods including Community Mapping , Transect Walks , phone interviews, Social Media and Email Exchange , Radio Communication and direct Observation . Primary data collection is an important way to engage with the population at an early stage of the programme design. It also ensures that the project is inclusive and relevant at the local level and that the assessment builds a holistic and accurate picture of the affected population.

Secondary data  is data that already exists (e.g. reports, statistics, research or maps); it is usually available from governmental agencies, national or regional WASH cluster structures or other organisations previously active in the affected area. It can serve as a preliminary introduction to the context . A significant amount of information can be obtained using secondary data . However, secondary data should always be considered with care; the additional collection of primary data  through direct contact with the respondents is recommended.

Selected Primary Data Methods and Sources

  • Household Visit
  • Key Informant Interview
  • Focus Group Discussion
  • Community Mapping
  • Observation
  • Transect Walk
  • Three-Pile-Sorting
  • Problem Ranking
  • Pocket Chart Voting

Selected Secondary Data Sources

  • Water, energy, environment, health, urban development ministries and local authorities
  • Census data and household enumeration
  • Demographic and health surveys
  • Global satellite image providers (UNITAR/UNOSAT)
  • Country-specific cluster information on https://www.humanitarianresponse.info/
  • UNHCR and UNICEF databases and reports
  • NGOs and development agencies that worked in the area before the crisis

Both primary and secondary data can be collected and analysed using quantitative or qualitative assessment methods.

Quantitative methods collect numerical data through surveys or by working with pre-existing statistical data. Findings can be applied across groups of people, used to explain a particular phenomenon or used to describe a characteristic. They are useful during the assessment phase as they measure coverage, knowledge and practices. Data collection methods may include structured Observation , Surveys and Checklists , polls, and telephone or face-to-face Interviews . Analysis of quantitative data requires some knowledge of statistics, but software is available to support this.
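As a minimal sketch of the kind of statistics involved, a coverage estimate from a survey can be reported as a proportion with a confidence interval. The figures below are hypothetical, and the normal approximation used here is only appropriate for reasonably large samples.

```python
import math

def coverage_ci(successes: int, n: int, z: float = 1.96):
    """Estimate coverage as a proportion with a normal-approximation
    95% confidence interval (z = 1.96). For small samples, exact or
    Wilson intervals are preferable."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical: 180 of 240 surveyed households have a functioning toilet
p, lo, hi = coverage_ci(180, 240)
print(f"Coverage: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

Reporting the interval alongside the point estimate makes clear how much uncertainty the sample size introduces, which matters when comparing coverage against Sphere indicators.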

Qualitative methods are useful during the assessment phase to collect and analyse data that reveals attitudes, perceptions or intentions e.g. to determine people’s perception of risk or the barriers to healthy behaviours.  Qualitative data is what people describe or illustrate. It is usually analysed by identifying common themes and issues of concern and grouping them to draw broader conclusions. The results of qualitative data analysis should not be translated into percentages or numerical data without a clear explanation.

The best way to get a sufficiently accurate assessment is to use several sources of information which can be triangulated and, if necessary, complemented by further research. Triangulation compares several different data sources and methods to cross-check and confirm findings. For example, teachers, community health workers, children and parents’ perspectives on school sanitation can be compared to prevent assumptions from being made. Triangulation can strengthen conclusions or identify areas for further work.
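The triangulation logic described above can be sketched as a simple cross-check of whether independent sources converge. The findings and the 75% agreement threshold below are hypothetical illustrations, not a standard rule.

```python
# Hypothetical findings on school sanitation, one summary judgement per source
findings = {
    "teachers": "latrines unusable",
    "community health workers": "latrines unusable",
    "children": "latrines unusable",
    "parents": "latrines usable but unsafe at night",
}

# Cross-check: what share of sources report the most common finding?
values = list(findings.values())
most_common = max(set(values), key=values.count)
agreement = values.count(most_common) / len(values)

if agreement >= 0.75:
    print("Sources largely agree - finding can be treated as confirmed")
else:
    print("Sources diverge - flag for follow-up assessment")
```

A real triangulation exercise would also weigh *why* sources diverge; here, the parents' dissenting view points to a night-time safety issue worth following up rather than discarding.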

At all stages, the reliability of the information being collected should be assessed using the following categories: very reliable (a source fully trusted for its methods and the time relevance of its data), reliable (a sound source using scientific methods, with data reflecting current or projected conditions) and somewhat reliable (a reasonable source whose methods or data timeliness may be questionable).
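The three reliability categories can be encoded as an ordered scale. The conservative convention sketched here, that a triangulated finding is only as reliable as its weakest source, is an illustrative assumption rather than a standard from the source.

```python
# The three reliability categories above, as an ordered scale
RELIABILITY = {
    "very reliable": 3,      # trusted source, methods and data fully current
    "reliable": 2,           # sound source and methods, timely data
    "somewhat reliable": 1,  # reasonable source, methods/timeliness uncertain
}

def weakest(ratings):
    """Bound overall confidence in a triangulated finding by its least
    reliable source (a conservative convention, not a fixed rule)."""
    return min(ratings, key=RELIABILITY.get)

sources = ["very reliable", "somewhat reliable", "reliable"]
print(weakest(sources))
```

Tracking a rating per source in this way keeps the reliability judgement explicit when findings are later combined or reported.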

Bibliography

Compendium of Hygiene Promotion in Emergencies

Compendium of Sanitation Technologies in Emergencies

UNHCR WASH Manual: Practical Guidance for Refugee Settings

Key Resources and Tools

Qualitative and Quantitative Research Techniques for Humanitarian Needs Assessment. An Introductory Brief

Qualitative and quantitative research techniques to collect, collate, analyse and synthesise information for humanitarian needs…

The Good Enough Guide: Humanitarian Needs Assessment

Practical guidance on assessment and data collection

Questionnaire Design. How to Design a Questionnaire for Needs Assessments in Humanitarian Emergencies

Guidance on designing questionnaires and sampling

Assessment, Analysis and Planning. Compendium of Hygiene Promotion in Emergencies Online Platform

Overview of key aspects related to WASH assessment, analysis and planning incl. data collection

Sphere Handbook. 4th Edition. WASH Chapter. Appendix 1: WASH Initial Needs Assessment Checklist (pages: 139-142)

Selected WASH Assessment Checklists

UNHCR WASH Manual. 7th Edition (page 50)

Rapid Methods for Assessing Water, Sanitation and Hygiene (WASH) Services in Emergency Settings

UNHCR’s methodology for conducting rapid WASH household assessments in refugee settings

Related Common Questions

What Information Do I Need to Collect at the Beginning of an Intervention?

Related Topics

  • Context Analysis
  • Needs Assessment and Analysis
  • Prioritisation

Still Have Questions?

Could you not find the information you were looking for? Please contact our helpdesk team of experts for direct and individual support.

