
Chapter 2. Research Design

Getting started.

When I teach undergraduates qualitative research methods, the final product of the course is a “research proposal” that incorporates all they have learned and enlists the knowledge they have learned about qualitative research methods in an original design that addresses a particular research question. I highly recommend you think about designing your own research study as you progress through this textbook. Even if you don’t have a study in mind yet, it can be a helpful exercise as you progress through the course. But how to start? How can one design a research study before they even know what research looks like? This chapter will serve as a brief overview of the research design process to orient you to what will be coming in later chapters. Think of it as a “skeleton” of what you will read in more detail in later chapters. Ideally, you will read this chapter both now (in sequence) and later during your reading of the remainder of the text. Do not worry if you have questions the first time you read this chapter. Many things will become clearer as the text advances and as you gain a deeper understanding of all the components of good qualitative research. This is just a preliminary map to get you on the right road.


Research Design Steps

Before you even get started, you will need to have a broad topic of interest in mind. [1] In my experience, students can confuse this broad topic with the actual research question, so it is important to clearly distinguish the two. And the place to start is the broad topic. It might be, as was the case with me, working-class college students. But what about working-class college students? What’s it like to be one? Why are there so few compared to others? How do colleges assist (or fail to assist) them? What interested me was something I could barely articulate at first and went something like this: “Why was it so difficult and lonely to be me?” And by extension, “Did others share this experience?”

Once you have a general topic, reflect on why this is important to you. Sometimes we connect with a topic and we don’t really know why. Even if you are not willing to share the real underlying reason you are interested in a topic, it is important that you know the deeper reasons that motivate you. Otherwise, it is quite possible that at some point during the research, you will find yourself turned around facing the wrong direction. I have seen it happen many times. The reason is that the research question is not the same thing as the general topic of interest, and if you don’t know the reasons for your interest, you are likely to design a study answering a research question that is beside the point—to you, at least. And this means you will be much less motivated to carry your research to completion.

Researcher Note

Why do you employ qualitative research methods in your area of study? What are the advantages of qualitative research methods for studying mentorship?

Qualitative research methods are a huge opportunity to increase access, equity, inclusion, and social justice. Qualitative research allows us to engage and examine the uniquenesses/nuances within minoritized and dominant identities and our experiences with these identities. Qualitative research allows us to explore a specific topic, and through that exploration, we can link history to experiences and look for patterns or offer up a unique phenomenon. There’s such beauty in being able to tell a particular story, and qualitative research is a great mode for that! For our work, we examined the relationships we typically use the term mentorship for but didn’t feel that was quite the right word. Qualitative research allowed us to pick apart what we did and how we engaged in our relationships, which then allowed us to more accurately describe what was unique about our mentorship relationships, which we ultimately named liberationships ( McAloney and Long 2021) . Qualitative research gave us the means to explore, process, and name our experiences; what a powerful tool!

How do you come up with ideas for what to study (and how to study it)? Where did you get the idea for studying mentorship?

Coming up with ideas for research, for me, is kind of like Googling a question I have, not finding enough information, and then deciding to dig a little deeper to get the answer. The idea to study mentorship actually came up in conversation with my mentorship triad. We were talking in one of our meetings about our relationship—kind of meta, huh? We discussed how we felt that mentorship was not quite the right term for the relationships we had built. One of us asked what was different about our relationships and mentorship. This all happened when I was taking an ethnography course. During the next session of class, we were discussing auto- and duoethnography, and it hit me—let’s explore our version of mentorship, which we later went on to name liberationships ( McAloney and Long 2021 ). The idea and questions came out of being curious and wanting to find an answer. As I continue to research, I see opportunities in questions I have about my work or during conversations that, in our search for answers, end up exposing gaps in the literature. If I can’t find the answer already out there, I can study it.

—Kim McAloney, PhD, College Student Services Administration Ecampus coordinator and instructor

When you have a better idea of why you are interested in what it is that interests you, you may be surprised to learn that the obvious approaches to the topic are not the only ones. For example, let’s say you think you are interested in preserving coastal wildlife. And as a social scientist, you are interested in policies and practices that affect the long-term viability of coastal wildlife, especially around fishing communities. It would be natural then to consider designing a research study around fishing communities and how they manage their ecosystems. But when you really think about it, you realize that what interests you the most is how people whose livelihoods depend on a particular resource act in ways that deplete that resource. Or, even deeper, you contemplate the puzzle, “How do people justify actions that damage their surroundings?” Now, there are many ways to design a study that gets at that broader question, and not all of them are about fishing communities, although that is certainly one way to go. Maybe you could design an interview-based study that includes and compares loggers, fishers, and desert golfers (those who golf in arid lands that require a great deal of wasteful irrigation). Or design a case study around one particular example where resources were completely used up by a community. Without knowing what it is you are really interested in, what motivates your interest in a surface phenomenon, you are unlikely to come up with the appropriate research design.

These first stages of research design are often the most difficult, but have patience. Taking the time to consider why you are going to go through a lot of trouble to get answers will prevent a lot of wasted energy in the future.

There are distinct reasons for pursuing particular research questions, and it is helpful to distinguish between them. First, you may be personally motivated. This is probably the most important and the most often overlooked. What is it about the social world that sparks your curiosity? What bothers you? What answers do you need in order to keep living? For me, I knew I needed to get a handle on what higher education was for before I kept going at it. I needed to understand why I felt so different from my peers and whether this whole “higher education” thing was “for the likes of me” before I could complete my degree. That is the personal motivation question. Your personal motivation might also be political in nature, in that you want to change the world in a particular way. It’s all right to acknowledge this. In fact, it is better to acknowledge it than to hide it.

There are also academic and professional motivations for a particular study.  If you are an absolute beginner, these may be difficult to find. We’ll talk more about this when we discuss reviewing the literature. Simply put, you are probably not the only person in the world to have thought about this question or issue and those related to it. So how does your interest area fit into what others have studied? Perhaps there is a good study out there of fishing communities, but no one has quite asked the “justification” question. You are motivated to address this to “fill the gap” in our collective knowledge. And maybe you are really not at all sure of what interests you, but you do know that [insert your topic] interests a lot of people, so you would like to work in this area too. You want to be involved in the academic conversation. That is a professional motivation and a very important one to articulate.

Practical and strategic motivations are a third kind. Perhaps you want to encourage people to take better care of the natural resources around them. If this is also part of your motivation, you will want to design your research project in a way that might have an impact on how people behave in the future. There are many ways to do this, one of which is using qualitative research methods rather than quantitative research methods, as the findings of qualitative research are often easier to communicate to a broader audience than the results of quantitative research. You might even be able to engage the community you are studying in the collecting and analyzing of data, something taboo in quantitative research but actively embraced and encouraged by qualitative researchers. But there are other practical reasons, such as getting “done” with your research in a certain amount of time or having access (or no access) to certain information. There is nothing wrong with considering constraints and opportunities when designing your study. Or maybe one of the practical or strategic goals is about learning competence in this area so that you can demonstrate the ability to conduct interviews and focus groups with future employers. Keeping that in mind will help shape your study and prevent you from getting sidetracked using a technique that you are less invested in learning about.

STOP HERE for a moment

I recommend you write a paragraph (at least) explaining your aims and goals. Include a sentence about each of the following: personal/political goals, professional/academic goals, and practical/strategic goals. Think through how all of the goals are related and can be achieved by this particular research study. If they can’t, have a rethink. Perhaps this is not the best way to go about it.

You will also want to be clear about the purpose of your study. “Wait, didn’t we just do this?” you might ask. No! Your goals are not the same as the purpose of the study, although they are related. You can think about purpose as lying on a continuum from “theory” to “action” (figure 2.1). Sometimes you are doing research to discover new knowledge about the world, while other times you are doing a study because you want to measure an impact or make a difference in the world.

Figure 2.1. Purpose types: Basic Research, Applied Research, Summative Evaluation, Formative Evaluation, Action Research

Basic research involves research that is done for the sake of “pure” knowledge—that is, knowledge that, at least at this moment in time, may not have any apparent use or application. Often, and this is very important, knowledge of this kind is later found to be extremely helpful in solving problems. So one way of thinking about basic research is that it is knowledge for which no use is yet known but will probably one day prove to be extremely useful. If you are doing basic research, you do not need to argue its usefulness, as the whole point is that we just don’t know yet what this might be.

Researchers engaged in basic research want to understand how the world operates. They are interested in investigating a phenomenon to get at the nature of reality with regard to that phenomenon. The basic researcher’s purpose is to understand and explain ( Patton 2002:215 ).

Basic research is interested in generating and testing hypotheses about how the world works. Grounded Theory is one approach to qualitative research methods that exemplifies basic research (see chapter 4). Most academic journal articles publish basic research findings. If you are working in academia (e.g., writing your dissertation), the default expectation is that you are conducting basic research.

Applied research in the social sciences is research that addresses human and social problems. Unlike basic research, the researcher has expectations that the research will help contribute to resolving a problem, if only by identifying its contours, history, or context. From my experience, most students have this as their baseline assumption about research. Why do a study if not to make things better? But this is a common mistake. Students and their committee members are often working with default assumptions here—the former thinking about applied research as their purpose, the latter thinking about basic research: “The purpose of applied research is to contribute knowledge that will help people to understand the nature of a problem in order to intervene, thereby allowing human beings to more effectively control their environment. While in basic research the source of questions is the tradition within a scholarly discipline, in applied research the source of questions is in the problems and concerns experienced by people and by policymakers” ( Patton 2002:217 ).

Applied research is less geared toward theory in two ways. First, its questions do not derive from previous literature. For this reason, applied research studies have much more limited literature reviews than those found in basic research (although they make up for this by having much more “background” about the problem). Second, it does not generate theory in the same way as basic research does. The findings of an applied research project may not be generalizable beyond the boundaries of this particular problem or context. The findings are more limited. They are useful now but may be less useful later. This is why basic research remains the default “gold standard” of academic research.

Evaluation research is research that is designed to evaluate or test the effectiveness of specific solutions and programs addressing specific social problems. We already know the problems, and someone has already come up with solutions. There might be a program, say, for first-generation college students on your campus. Does this program work? Are first-generation students who participate in the program more likely to graduate than those who do not? These are the types of questions addressed by evaluation research. There are two types of research within this broader frame, one more action-oriented than the other. In summative evaluation , an overall judgment about the effectiveness of a program or policy is made. Should we continue our first-gen program? Is it a good model for other campuses? Because the purpose of such summative evaluation is to measure success and to determine whether this success is scalable (capable of being generalized beyond the specific case), quantitative data is more often used than qualitative data. In our example, we might have “outcomes” data for thousands of students, and we might run various tests to determine if the better outcomes of those in the program are statistically significant so that we can generalize the findings and recommend similar programs elsewhere. Qualitative data in the form of focus groups or interviews can then be used for illustrative purposes, providing more depth to the quantitative analyses. In contrast, formative evaluation attempts to improve a program or policy (to help “form” or shape its effectiveness). Formative evaluations rely more heavily on qualitative data—case studies, interviews, focus groups. The findings are meant not to generalize beyond the particular but to improve this program.
If you are a student seeking to improve your qualitative research skills and you do not care about generating basic research, formative evaluation studies might be an attractive option for you to pursue, as there are always local programs that need evaluation and suggestions for improvement. Again, be very clear about your purpose when talking through your research proposal with your committee.

Action research takes a further step beyond evaluation, even formative evaluation, to being part of the solution itself. This is about as far from basic research as one could get and definitely falls beyond the scope of “science,” as conventionally defined. The distinction between action and research is blurry, the research methods are often in constant flux, and the only “findings” are specific to the problem or case at hand and often are findings about the process of intervention itself. Rather than evaluate a program as a whole, action research often seeks to change and improve some particular aspect that may not be working—maybe there is not enough diversity in an organization or maybe women’s voices are muted during meetings and the organization wonders why and would like to change this. In a further step, participatory action research , those women would become part of the research team, attempting to amplify their voices in the organization through participation in the action research. As action research employs methods that involve people in the process, focus groups are quite common.

If you are working on a thesis or dissertation, chances are your committee will expect you to be contributing to fundamental knowledge and theory ( basic research ). If your interests lie more toward the action end of the continuum, however, it is helpful to talk to your committee about this before you get started. Knowing your purpose in advance will help avoid misunderstandings during the later stages of the research process!

The Research Question

Once you have written your paragraph and clarified your purpose and truly know that this study is the best study for you to be doing right now , you are ready to write and refine your actual research question. Know that research questions are often moving targets in qualitative research, that they can be refined up to the very end of data collection and analysis. But you do have to have a working research question at all stages. This is your “anchor” when you get lost in the data. What are you addressing? What are you looking at and why? Your research question guides you through the thicket. It is common to have a whole host of questions about a phenomenon or case, both at the outset and throughout the study, but you should be able to pare it down to no more than two or three sentences when asked. These sentences should both clarify the intent of the research and explain why this is an important question to answer. More on refining your research question can be found in chapter 4.

Chances are, you will have already done some prior reading before coming up with your interest and your questions, but you may not have conducted a systematic literature review. This is the next crucial stage to be completed before venturing further. You don’t want to start collecting data and then realize that someone has already beaten you to the punch. A review of the literature that is already out there will let you know (1) if others have already done the study you are envisioning; (2) if others have done similar studies, which can help you out; and (3) what ideas or concepts are out there that can help you frame your study and make sense of your findings. More on literature reviews can be found in chapter 9.

In addition to reviewing the literature for similar studies to what you are proposing, it can be extremely helpful to find a study that inspires you. This may have absolutely nothing to do with the topic you are interested in but is written so beautifully or organized so interestingly or otherwise speaks to you in such a way that you want to post it somewhere to remind you of what you want to be doing. You might not understand this in the early stages—why would you find a study that has nothing to do with the one you are doing helpful? But trust me, when you are deep into analysis and writing, having an inspirational model in view can help you push through. If you are motivated to do something that might change the world, you probably have read something somewhere that inspired you. Go back to that original inspiration and read it carefully and see how they managed to convey the passion that you so appreciate.

At this stage, you are still just getting started. There are a lot of things to do before setting forth to collect data! You’ll want to consider and choose a research tradition and a set of data-collection techniques that both help you answer your research question and match all your aims and goals. For example, if you really want to help migrant workers speak for themselves, you might draw on feminist theory and participatory action research models. Chapters 3 and 4 will provide you with more information on epistemologies and approaches.

Next, you have to clarify your “units of analysis.” What is the level at which you are focusing your study? Often, the unit in qualitative research methods is individual people, or “human subjects.” But your units of analysis could just as well be organizations (colleges, hospitals) or programs or even whole nations. Think about what it is you want to be saying at the end of your study—are the insights you are hoping to make about people or about organizations or about something else entirely? A unit of analysis can even be a historical period! Every unit of analysis will call for a different kind of data collection and analysis and will produce different kinds of “findings” at the conclusion of your study. [2]

Regardless of what unit of analysis you select, you will probably have to consider the “human subjects” involved in your research. [3] Who are they? What interactions will you have with them—that is, what kind of data will you be collecting? Before answering these questions, define your population of interest and your research setting. Use your research question to help guide you.

Let’s use an example from a real study. In Geographies of Campus Inequality , Benson and Lee ( 2020 ) list three related research questions: “(1) What are the different ways that first-generation students organize their social, extracurricular, and academic activities at selective and highly selective colleges? (2) how do first-generation students sort themselves and get sorted into these different types of campus lives; and (3) how do these different patterns of campus engagement prepare first-generation students for their post-college lives?” (3).

Note that we are jumping into this a bit late, after Benson and Lee have described previous studies (the literature review) and what is known about first-generation college students and what is not known. They want to know about differences within this group, and they are interested in ones attending certain kinds of colleges because those colleges will be sites where academic and extracurricular pressures compete. That is the context for their three related research questions. What is the population of interest here? First-generation college students . What is the research setting? Selective and highly selective colleges . But a host of questions remain. Which students in the real world, which colleges? What about gender, race, and other identity markers? Will the students be asked questions? Are the students still in college, or will they be asked about what college was like for them? Will they be observed? Will they be shadowed? Will they be surveyed? Will they be asked to keep diaries of their time in college? How many students? How many colleges? For how long will they be observed?

Recommendation

Take a moment and write down suggestions for Benson and Lee before continuing on to what they actually did.

Have you written down your own suggestions? Good. Now let’s compare those with what they actually did. Benson and Lee drew on two sources of data: in-depth interviews with sixty-four first-generation students and survey data from a preexisting national survey of students at twenty-eight selective colleges. Let’s ignore the survey for our purposes here and focus on those interviews. The interviews were conducted between 2014 and 2016 at a single selective college, “Hilltop” (a pseudonym ). They employed a “purposive” sampling strategy to ensure an equal number of male-identifying and female-identifying students as well as equal numbers of White, Black, and Latinx students. Each student was interviewed once. Hilltop is a selective liberal arts college in the northeast that enrolls about three thousand students.

How did your suggestions match up to those actually used by the researchers in this study? Is it possible your suggestions were too ambitious? Beginning qualitative researchers can often make that mistake. You want a research design that is both effective (it matches your question and goals) and doable. You will never be able to collect data from your entire population of interest (unless your research question is really so narrow as to be relevant to very few people!), so you will need to come up with a good sample. Define the criteria for this sample, as Benson and Lee did when deciding to interview an equal number of students by gender and race categories. Define the criteria for your sample setting too. Hilltop is typical for selective colleges. That was a research choice made by Benson and Lee. For more on sampling and sampling choices, see chapter 5.

Benson and Lee chose to employ interviews. If you also would like to include interviews, you have to think about what will be asked in them. Most interview-based research involves an interview guide, a set of questions or question areas that will be asked of each participant. The research question helps you create a relevant interview guide. You want to ask questions whose answers will provide insight into your research question. Again, your research question is the anchor you will continually come back to as you plan for and conduct your study. It may be that once you begin interviewing, you find that people are telling you something totally unexpected, and this makes you rethink your research question. That is fine. Then you have a new anchor. But you always have an anchor. More on interviewing can be found in chapter 11.

Let’s imagine Benson and Lee also observed college students as they went about doing the things college students do, both in the classroom and in the clubs and social activities in which they participate. They would have needed a plan for this. Would they sit in on classes? Which ones and how many? Would they attend club meetings and sports events? Which ones and how many? Would they participate themselves? How would they record their observations? More on observation techniques can be found in both chapters 13 and 14.

At this point, the design is almost complete. You know why you are doing this study, you have a clear research question to guide you, you have identified your population of interest and research setting, and you have a reasonable sample of each. You also have put together a plan for data collection, which might include drafting an interview guide or making plans for observations. And so you know exactly what you will be doing for the next several months (or years!). To put the project into action, there are a few more things necessary before actually going into the field.

First, you will need to make sure you have any necessary supplies, including recording technology. These days, many researchers use their phones to record interviews. Second, you will need to draft a few documents for your participants. These include informed consent forms and recruiting materials, such as posters or email texts, that explain what this study is in clear language. Third, you will draft a research protocol to submit to your institutional review board (IRB) ; this research protocol will include the interview guide (if you are using one), the consent form template, and all examples of recruiting material. Depending on your institution and the details of your study design, it may take weeks or even, in some unfortunate cases, months before you secure IRB approval. Make sure you plan on this time in your project timeline. While you wait, you can continue to review the literature and possibly begin drafting a section on the literature review for your eventual presentation/publication. More on IRB procedures can be found in chapter 8 and more general ethical considerations in chapter 7.

Once you have approval, you can begin!

Research Design Checklist

Before data collection begins, do the following:

  • Write a paragraph explaining your aims and goals (personal/political, practical/strategic, professional/academic).
  • Define your research question; write two to three sentences that clarify the intent of the research and why this is an important question to answer.
  • Review the literature for similar studies that address your research question or similar research questions; think laterally about some literature that might be helpful or illuminating but is not exactly about the same topic.
  • Find a written study that inspires you—it may or may not be on the research question you have chosen.
  • Consider and choose a research tradition and set of data-collection techniques that (1) help answer your research question and (2) match your aims and goals.
  • Define your population of interest and your research setting.
  • Define the criteria for your sample (How many? Why these? How will you find them, gain access, and acquire consent?).
  • If you are conducting interviews, draft an interview guide.
  • If you are making observations, create a plan for observations (sites, times, recording, access).
  • Acquire any necessary technology (recording devices/software).
  • Draft consent forms that clearly identify the research focus and selection process.
  • Create recruiting materials (posters, email, texts).
  • Apply for IRB approval (proposal plus consent form plus recruiting materials).
  • Block out time for collecting data.

Notes

[1] At the end of the chapter, you will find a “Research Design Checklist” that summarizes the main recommendations made here.

[2] For example, if your focus is society and culture , you might collect data through observation or a case study. If your focus is individual lived experience , you are probably going to be interviewing some people. And if your focus is language and communication , you will probably be analyzing text (written or visual) ( Marshall and Rossman 2016:16 ).

[3] You may not have any “live” human subjects. There are qualitative research methods that do not require interactions with live human beings; see chapter 16 , “Archival and Historical Sources.” But for the most part, you are probably reading this textbook because you are interested in doing research with people. The rest of the chapter will assume this is the case.

One of the primary methodological traditions of inquiry in qualitative research, ethnography is the study of a group or group culture, largely through observational fieldwork supplemented by interviews. It is a form of fieldwork that may include participant-observation data collection. See chapter 14 for a discussion of deep ethnography. 

A methodological tradition of inquiry and research design that focuses on an individual case (e.g., setting, institution, or sometimes an individual) in order to explore its complexity, history, and interactive parts.  As an approach, it is particularly useful for obtaining a deep appreciation of an issue, event, or phenomenon of interest in its particular context.

The controlling force in research; can be understood as lying on a continuum from basic research (knowledge production) to action research (effecting change).

In its most basic sense, a theory is a story we tell about how the world works that can be tested with empirical evidence.  In qualitative research, we use the term in a variety of ways, many of which are different from how they are used by quantitative researchers.  Although some qualitative research can be described as “testing theory,” it is more common to “build theory” from the data using inductive reasoning , as done in Grounded Theory .  There are so-called “grand theories” that seek to integrate a whole series of findings and stories into an overarching paradigm about how the world works, and much smaller theories or concepts about particular processes and relationships.  Theory can even be used to explain particular methodological perspectives or approaches, as in Institutional Ethnography , which is both a way of doing research and a theory about how the world works.

Research that is interested in generating and testing hypotheses about how the world works.

A methodological tradition of inquiry and approach to analyzing qualitative data in which theories emerge from a rigorous and systematic process of induction.  This approach was pioneered by the sociologists Glaser and Strauss (1967).  The elements of theory generated from comparative analysis of data are, first, conceptual categories and their properties and, second, hypotheses or generalized relations among the categories and their properties – “The constant comparing of many groups draws the [researcher’s] attention to their many similarities and differences.  Considering these leads [the researcher] to generate abstract categories and their properties, which, since they emerge from the data, will clearly be important to a theory explaining the kind of behavior under observation.” (36).

An approach to research that is “multimethod in focus, involving an interpretative, naturalistic approach to its subject matter.  This means that qualitative researchers study things in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them.  Qualitative research involves the studied use and collection of a variety of empirical materials – case study, personal experience, introspective, life story, interview, observational, historical, interactional, and visual texts – that describe routine and problematic moments and meanings in individuals’ lives” (Denzin and Lincoln 2005:2). Contrast with quantitative research.

Research that contributes knowledge that will help people to understand the nature of a problem in order to intervene, thereby allowing human beings to more effectively control their environment.

Research that is designed to evaluate or test the effectiveness of specific solutions and programs addressing specific social problems.  There are two kinds: summative and formative .

Research in which an overall judgment about the effectiveness of a program or policy is made, often for the purpose of generalizing to other cases or programs.  Generally uses qualitative research as a supplement to primary quantitative data analyses.  Contrast formative evaluation research .

Research designed to improve a program or policy (to help “form” or shape its effectiveness); relies heavily on qualitative research methods.  Contrast summative evaluation research.

Research carried out at a particular organizational or community site with the intention of affecting change; often involves research subjects as participants of the study.  See also participatory action research .

Research in which both researchers and participants work together to understand a problematic situation and change it for the better.

The level of the focus of analysis (e.g., individual people, organizations, programs, neighborhoods).

The large group of interest to the researcher.  Although it will likely be impossible to design a study that incorporates or reaches all members of the population of interest, this should be clearly defined at the outset of a study so that a reasonable sample of the population can be taken.  For example, if one is studying working-class college students, the sample may include twenty such students attending a particular college, while the population is “working-class college students.”  In quantitative research, clearly defining the general population of interest is a necessary step in generalizing results from a sample.  In qualitative research, defining the population is conceptually important for clarity.

A fictional name assigned to give anonymity to a person, group, or place.  Pseudonyms are important ways of protecting the identity of research participants while still providing a “human element” in the presentation of qualitative data.  There are ethical considerations to be made in selecting pseudonyms; some researchers allow research participants to choose their own.

A requirement for research involving human participants; the documentation of informed consent.  In some cases, oral consent or assent may be sufficient, but the default standard is a single-page, easy-to-understand form that both the researcher and the participant sign and date.  Under federal guidelines, all researchers "shall seek such consent only under circumstances that provide the prospective subject or the representative sufficient opportunity to consider whether or not to participate and that minimize the possibility of coercion or undue influence. The information that is given to the subject or the representative shall be in language understandable to the subject or the representative.  No informed consent, whether oral or written, may include any exculpatory language through which the subject or the representative is made to waive or appear to waive any of the subject's rights or releases or appears to release the investigator, the sponsor, the institution, or its agents from liability for negligence" (21 CFR 50.20).  Your IRB office will be able to provide a template for use in your study.

An administrative body established to protect the rights and welfare of human research subjects recruited to participate in research activities conducted under the auspices of the institution with which it is affiliated. The IRB is charged with the responsibility of reviewing all research involving human participants. The IRB is concerned with protecting the welfare, rights, and privacy of human subjects. The IRB has the authority to approve, disapprove, monitor, and require modifications in all research activities that fall within its jurisdiction as specified by both the federal regulations and institutional policy.

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.


Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes. Revised on 20 March 2023.

A research design is a strategy for answering your research question  using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Step 1: Consider your aims and approach

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive , allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive , with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics .

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval ?

At each stage of the research design process, make sure that your choices are practically feasible.

Step 2: Choose a type of research design

Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation ).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

The table below shows some common types of qualitative design. They often have similar approaches in terms of data collection, but focus on different aspects when analysing the data.

Step 3: Identify your population and sampling method

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling . The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
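
The distinction can be made concrete with a short Python sketch. This is a minimal illustration, not part of the original text: the population of 500 students, the sample size of 50, and the fixed random seed are all invented for the example.

```python
import random

# Hypothetical population: 500 students identified by ID.
population = [f"student_{i}" for i in range(500)]

# Probability sampling (simple random sampling): every member has a known,
# equal chance of being selected.
random.seed(42)  # fixed seed so the illustration is reproducible
probability_sample = random.sample(population, k=50)

# Non-probability (convenience) sampling: take whoever is easiest to reach,
# here the first 50 on the list. If list order correlates with anything of
# interest (e.g., enrolment date), the sample is systematically biased.
convenience_sample = population[:50]

print(len(probability_sample))  # 50 distinct members, drawn without replacement
```

Because `random.sample` draws without replacement, the probability sample contains no duplicates; the bias risk in the convenience sample is exactly the limitation the text warns about.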

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Step 4: Choose your data collection methods

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

Step 5: Plan your data collection procedures

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations , which events or actions will you count?

If you’re using surveys , which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity has already been established.
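
A minimal Python sketch of this idea, under invented assumptions: "course satisfaction" (an unobservable concept) is operationalised as the mean of three 5-point Likert items (1 = strongly disagree, 5 = strongly agree). The construct, items, and scoring rule are all hypothetical.

```python
# Hypothetical Likert responses (1-5) from one participant to three items
# chosen to operationalise the fuzzy concept "course satisfaction".
likert_responses = {
    "I would recommend this course to a friend": 4,
    "The course met my expectations": 5,
    "I found the workload manageable": 3,
}

# The operational definition: satisfaction = mean of the item scores.
satisfaction_score = sum(likert_responses.values()) / len(likert_responses)

print(satisfaction_score)  # 4.0
```

The point is not the arithmetic but the mapping: an abstract concept only becomes measurable once you commit to specific indicators and a scoring rule.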

Reliability and validity

Reliability means your results can be consistently reproduced , while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.

Step 6: Decide on your data analysis strategies

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis . With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics , you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
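
As a concrete illustration, the three summaries above can be computed with Python's standard library. The test scores here are invented for the example:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical test scores for a sample of ten participants.
scores = [70, 75, 75, 80, 80, 80, 85, 85, 90, 95]

# Distribution: how often each score occurs.
distribution = Counter(scores)

# Central tendency: the mean (average) score.
average = mean(scores)

# Variability: the sample standard deviation.
spread = stdev(scores)

print(distribution[80])  # 3 -- the score 80 occurs three times
print(average)           # 81.5
print(round(spread, 2))  # 7.47
```

Note that `stdev` computes the sample standard deviation (dividing by n − 1); `statistics.pstdev` would give the population version.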

Using inferential statistics , you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs ) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
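
For example, the statistic at the heart of a correlation test, Pearson's r, can be computed directly from its definition. The study-hours and exam-score data below are hypothetical:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance and variances, computed from deviations about each mean.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

hours_studied = [1, 2, 3, 4, 5]
exam_scores = [52, 55, 61, 70, 72]

print(pearson_r(hours_studied, exam_scores))  # close to 1: strong positive association
```

An r near +1 or −1 indicates a strong linear association and an r near 0 indicates little or none; whether an observed r is statistically distinguishable from 0 is what the inferential test itself evaluates.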

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis .

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.

Frequently asked questions

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts, and meanings, use qualitative methods .
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Cite this Scribbr article

McCombes, S. (2023, March 20). Research Design | Step-by-Step Guide with Examples. Scribbr. Retrieved 29 April 2024, from https://www.scribbr.co.uk/research-methods/research-design/


Qualitative Design Research Methods

  • Michael Domínguez, San Diego State University
  • https://doi.org/10.1093/acrefore/9780190264093.013.170
  • Published online: 19 December 2017

Emerging in the learning sciences field in the early 1990s, qualitative design-based research (DBR) is a relatively new methodological approach to social science and education research. As its name implies, DBR is focused on the design of educational innovations, and the testing of these innovations in the complex and interconnected venue of naturalistic settings. As such, DBR is an explicitly interventionist approach to conducting research, situating the researcher as a part of the complex ecology in which learning and educational innovation takes place.

With this in mind, DBR is distinct from more traditional methodologies, including laboratory experiments, ethnographic research, and large-scale implementation. Rather, the goal of DBR is not to prove the merits of any particular intervention, or to reflect passively on a context in which learning occurs, but to examine the practical application of theories of learning themselves in specific, situated contexts. By designing purposeful, naturalistic, and sustainable educational ecologies, researchers can test, extend, or modify their theories and innovations based on their pragmatic viability. This process offers the prospect of generating theory-developing, contextualized knowledge claims that can complement the claims produced by other forms of research.

Because of this interventionist, naturalistic stance, DBR has also been the subject of ongoing debate concerning the rigor of its methodology. In many ways, these debates obscure the varied ways DBR has been practiced, the varied types of questions being asked, and the theoretical breadth of researchers who practice DBR. With this in mind, DBR research may involve a diverse range of methods as researchers from a variety of intellectual traditions within the learning sciences and education research design pragmatic innovations based on their theories of learning, and document these complex ecologies using the methodologies and tools most applicable to their questions, focuses, and academic communities.

DBR has gained increasing interest in recent years. While it remains a popular methodology for developmental and cognitive learning scientists seeking to explore theory in naturalistic settings, it has also grown in importance to cultural psychology and cultural studies researchers as a methodological approach that aligns in important ways with the participatory commitments of liberatory research. As such, internal tension within the DBR field has also emerged. Yet, though approaches vary, and have distinct genealogies and commitments, DBR might be seen as the broad methodological genre in which Change Laboratory, design-based implementation research (DBIR), social design-based experiments (SDBE), participatory design research (PDR), and research-practice partnerships might be categorized. These critically oriented iterations of DBR have important implications for educational research and educational innovation in historically marginalized settings and the Global South.

  • design-based research
  • learning sciences
  • social-design experiment
  • qualitative research
  • research methods

Educational research, perhaps more than many other disciplines, is a situated field of study. Learning happens around us every day, at all times, in both formal and informal settings. Our worlds are replete with complex, dynamic, diverse communities, contexts, and institutions, many of which are actively seeking guidance and support in the endless quest for educational innovation. Educational researchers—as a source of potential expertise—are necessarily implicated in this complexity, linked to the communities and institutions through their very presence in spaces of learning, poised to contribute with possible solutions, yet often positioned as separate from the activities they observe, creating dilemmas of responsibility and engagement.

So what are educational scholars and researchers to do? These tensions invite a unique methodological challenge for the contextually invested researcher, begging them to not just produce knowledge about learning, but to participate in the ecology, collaborating on innovations in the complex contexts in which learning is taking place. In short, for many educational researchers, our backgrounds as educators, our connections to community partners, and our sociopolitical commitments to the process of educational innovation push us to ensure that our work is generative, and that our theories and ideas—our expertise—about learning and education are made pragmatic, actionable, and sustainable. We want to test what we know outside of laboratories, designing, supporting, and guiding educational innovation to see if our theories of learning are accurate, and useful to the challenges faced in schools and communities where learning is messy, collaborative, and contested. Through such a process, we learn, and can modify our theories to better serve the real needs of communities. It is from this impulse that qualitative design-based research (DBR) emerged as a new methodological paradigm for education research.

Qualitative design-based research will be examined here, documenting its origins, the major tenets of the genre, implementation considerations, and methodological issues, as well as variance within the paradigm. As a relatively new methodology, much tension remains over what constitutes DBR, what design should mean, and for whom. These tensions and questions, as well as broad perspectives and emergent iterations of the methodology, will be discussed, along with considerations for researchers looking toward the future of this paradigm.

The Origins of Design-Based Research

Qualitative design-based research (DBR) first emerged in the learning sciences field among a group of scholars in the early 1990s, with the first articulation of DBR as a distinct methodological construct appearing in the work of Ann Brown ( 1992 ) and Allan Collins ( 1992 ). For learning scientists in the 1970s and 1980s, the traditional methodologies of laboratory experiments, ethnographies, and large-scale educational interventions were the only methods available. During these decades, a growing community of learning science and educational researchers (e.g., Bereiter & Scardamalia, 1989 ; Brown, Campione, Webber, & McGilley, 1992 ; Cobb & Steffe, 1983 ; Cole, 1995 ; Scardamalia & Bereiter, 1991 ; Schoenfeld, 1982 , 1985 ; Scribner & Cole, 1978 ) interested in educational innovation and classroom interventions in situated contexts began to find the prevailing methodologies insufficient for the types of learning they wished to document, the roles they wished to play in research, and the kinds of knowledge claims they wished to explore. The laboratory, or laboratory-like settings, where research on learning was at the time happening, was divorced from the complexity of real life, and necessarily limiting. Alternatively, most ethnographic research, while more attuned to capturing these complexities and dynamics, regularly assumed a passive stance 1 and avoided interceding in the learning process, or allowing researchers to see what possibility for innovation existed from enacting nascent learning theories. Finally, large-scale interventions could test innovations in practice but lost sight of the nuance of development and implementation in local contexts (Brown, 1992 ; Collins, Joseph, & Bielaczyc, 2004 ).

Dissatisfied with these options, and recognizing that new methods were required in order to study and understand learning in the messiness of socially, culturally, and historically situated settings, Brown (1992) proposed an alternative: Why not involve ourselves in the messiness of the process, taking an active, grounded role in disseminating our theories and expertise by becoming designers and implementers of educational innovations? Rather than observing from afar, DBR researchers could trace their own iterative processes of design, implementation, tinkering, redesign, and evaluation as they unfolded in shared work with teachers, students, learners, and other partners in lived contexts. This premise, initially articulated as “design experiments” (Brown, 1992), would be variously discussed over the next decade as “design research” (Edelson, 2002), “developmental research” (Gravemeijer, 1994), and “design-based research” (Design-Based Research Collective, 2003), all of which reflect the original, interventionist, design-oriented concept. The latter term, “design-based research” (DBR), is used here, recognizing this as the prevailing terminology used to refer to this research approach at present. 2

Regardless of the evolving moniker, the prospects of such a methodology were extremely attractive to researchers. Learning scientists acutely aware of various aspects of situated context, and interested in studying the applied outcomes of learning theories—a task of inquiry into situated learning for which canonical methods were rather insufficient—found DBR a welcome development (Bell, 2004 ). As Barab and Squire ( 2004 ) explain: “learning scientists . . . found that they must develop technological tools, curriculum, and especially theories that help them systematically understand and predict how learning occurs” (p. 2), and DBR methodologies allowed them to do this in proactive, hands-on ways. Thus, rather than emerging as a strict alternative to more traditional methodologies, DBR was proposed to fill a niche that other methodologies were ill-equipped to cover.

Effectively, while its development is indeed linked to an inherent critique of previous research paradigms, neither Brown nor Collins saw DBR in opposition to other forms of research. Rather, by providing a bridge from the laboratory to the real world, where learning theories and proposed innovations could interact and be implemented in the complexity of lived socio-ecological contexts (Hoadley, 2004 ), new possibilities emerged. Learning researchers might “trace the evolution of learning in complex, messy classrooms and schools, test and build theories of teaching and learning, and produce instructional tools that survive the challenges of everyday practice” (Shavelson, Phillips, Towne, & Feuer, 2003 , p. 25). Thus, DBR could complement the findings of laboratory, ethnographic, and large-scale studies, answering important questions about the implementation, sustainability, limitations, and usefulness of theories, interventions, and learning when introduced as innovative designs into situated contexts of learning. Moreover, while studies involving these traditional methodologies often concluded by pointing toward implications—insights subsequent studies would need to take up—DBR allowed researchers to address implications iteratively and directly. No subsequent research was necessary, as emerging implications could be reflexively explored in the context of the initial design, offering considerable insight into how research is translated into theory and practice.

Since its emergence in 1992 , DBR as a methodological approach to educational and learning research has quickly grown and evolved, used by researchers from a variety of intellectual traditions in the learning sciences, including developmental and cognitive psychology (e.g., Brown & Campione, 1996 , 1998 ; diSessa & Minstrell, 1998 ), cultural psychology (e.g., Cole, 1996 , 2007 ; Newman, Griffin, & Cole, 1989 ; Gutiérrez, Bien, Selland, & Pierce, 2011 ), cultural anthropology (e.g., Barab, Kinster, Moore, Cunningham, & the ILF Design Team, 2001 ; Polman, 2000 ; Stevens, 2000 ; Suchman, 1995 ), and cultural-historical activity theory (e.g., Engeström, 2011 ; Espinoza, 2009 ; Espinoza & Vossoughi, 2014 ; Gutiérrez, 2008 ; Sannino, 2011 ). Given this plurality of epistemological and theoretical fields that employ DBR, it might best be understood as a broad methodology of educational research, realized in many different, contested, heterogeneous, and distinct iterations, and engaging a variety of qualitative tools and methods (Bell, 2004 ). Despite tensions among these iterations, and substantial and important variances in the ways they employ design-as-research in community settings, there are several common, methodological threads that unite the broad array of research that might be classified as DBR under a shared, though pluralistic, paradigmatic umbrella.

The Tenets of Design-Based Research

Why design-based research?

As we turn to the core tenets of the design-based research (DBR) paradigm, it is worth considering an obvious question: Why use DBR as a methodology for educational research? To answer this, it is helpful to reflect on the original intentions for DBR, particularly, that it is not simply the study of a particular, isolated intervention. Rather, DBR methodologies were conceived of as the complete, iterative process of designing, modifying, and assessing the impact of an educational innovation in a contextual, situated learning environment (Barab & Kirshner, 2001 ; Brown, 1992 ; Cole & Engeström, 2007 ). The design process itself—inclusive of the theory of learning employed, the relationships among participants, contextual factors and constraints, the pedagogical approach, any particular intervention, as well as any changes made to various aspects of this broad design as it proceeds—is what is under study.

Considering this, DBR offers a compelling framework for the researcher interested in having an active and collaborative hand in designing for educational innovation, and interested in creating knowledge about how particular theories of learning, pedagogical or learning practices, or social arrangements function in a context of learning. It is a methodology that can put the researcher in the position of engineer, actively experimenting with aspects of learning and sociopolitical ecologies to arrive at new knowledge and productive outcomes, as Cobb, Confrey, diSessa, Lehrer, and Schauble (2003) explain:

Prototypically, design experiments entail both “engineering” particular forms of learning and systematically studying those forms of learning within the context defined by the means of supporting them. This designed context is subject to test and revision, and the successive iterations that result play a role similar to that of systematic variation in experiment. (p. 9)

This being said, how directive the engineering role the researcher takes on varies considerably among iterations of DBR. Indeed, recent approaches have argued strongly for researchers to take on more egalitarian positionalities with respect to the community partners with whom they work (e.g., Zavala, 2016 ), acting as collaborative designers, rather than authoritative engineers.

Method and Methodology in Design-Based Research

Now, having established why we might use DBR, a recurring question that has faced the DBR paradigm is whether DBR is a methodology at all. Given the variety of intellectual and ontological traditions that employ it, and thus the pluralism of methods used in DBR to enact the “engineering” role (whatever shape that may take) that the researcher assumes, it has been argued that DBR is not, in actuality, a methodology at all (Kelly, 2004). The proliferation and diversity of approaches, methods, and types of analysis purporting to be DBR have been described as a lack of coherence that shows there is no “argumentative grammar” or methodology present in DBR (Kelly, 2004).

Now, the conclusions one will eventually draw in this debate will depend on one’s orientations and commitments, but it is useful to note that these demands for “coherence” emerge from previous paradigms in which methodology was largely marked by a shared, coherent toolkit for data collection and data analysis. These previous paradigmatic rules make for an odd fit when considering DBR. Yet, even if we proceed—within the qualitative tradition from which DBR emerges—defining methodology as an approach to research that is shaped by the ontological and epistemological commitments of the particular researcher, and methods as the tools for research, data collection, and analysis that are chosen by the researcher with respect to said commitments (Gutiérrez, Engeström, & Sannino, 2016 ), then a compelling case for DBR as a methodology can be made (Bell, 2004 ).

Effectively, despite the considerable variation in how DBR has been and is employed, and tensions within the DBR field, we might point to considerable, shared epistemic common ground among DBR researchers, all of whom are invested in an approach to research that involves engaging actively and iteratively in the design and exploration of learning theory in situated, natural contexts. This common epistemic ground, even in the face of pluralistic ideologies and choices of methods, invites in a new type of methodological coherence, marked by “intersubjectivity without agreement” (Matusov, 1996 ), that links DBR from traditional developmental and cognitive psychology models of DBR (e.g., Brown, 1992 ; Brown & Campione, 1998 ; Collins, 1992 ), to more recent critical and sociocultural manifestations (e.g., Bang & Vossoughi, 2016 ; Engeström, 2011 ; Gutiérrez, 2016 ), and everything in between.

Put in other terms, even as DBR researchers may choose heterogeneous methods for data collection, data analysis, and reporting results complementary to the ideological and sociopolitical commitments of the particular researcher and the types of research questions that are under examination (Bell, 2004), a shared epistemic commitment gives the methodology shape. Indeed, the common commitment toward design innovation emerges clearly across examples of DBR methodological studies ranging in method from ethnographic analyses (Salvador, Bell, & Anderson, 1999) to studies of critical discourse within a design (Kärkkäinen, 1999), to focused examinations of metacognition of individual learners (White & Frederiksen, 1998), and beyond. Rather than indicating a lack of methodology, or methodological weakness, the use of varying qualitative methods for framing data collection and retrospective analyses within DBR, and the tensions within the epistemic common ground itself, simply reflect the scope of its utility. Learning in context is complex, contested, and messy, and the plurality of methods present across DBR allows researchers to dynamically respond to context as needed, employing the tools that fit best to consider the questions that are present, or may arise.

All this being the case, it is useful to look toward the coherent elements—the “argumentative grammar” of DBR, if you will—that can be identified across the varied iterations of DBR. Understanding these shared features, in the context and terms of the methodology itself, helps us to appreciate what is involved in developing robust and thorough DBR research, and how DBR seeks to make strong, meaningful claims around the types of research questions it takes up.

Coherent Features of Design-Based Research

Several scholars have provided comprehensive overviews and listings of what they see as the cross-cutting features of DBR, both in the context of more traditional models of DBR (e.g., Cobb et al., 2003 ; Design-Based Research Collective, 2003 ), and in regards to newer iterations (e.g., Gutiérrez & Jurow, 2016 ; Bang & Vossoughi, 2016 ). Rather than try to offer an overview of each of these increasingly pluralistic classifications, the intent here is to attend to three broad elements that are shared across articulations of DBR and reflect the essential elements that constitute the methodological approach DBR offers to educational researchers.

Design research is concerned with the development, testing, and evolution of learning theory in situated contexts

This first element is perhaps most central to what DBR of all types is, anchored in what Brown ( 1992 ) was initially most interested in: testing the pragmatic validity of theories of learning by designing interventions that engaged with, or proposed, entire, naturalistic, ecologies of learning. Put another way, while DBR studies may have various units of analysis, focuses, and variables, and may organize learning in many different ways, it is the theoretically informed design for educational innovation that is most centrally under evaluation. DBR actively and centrally exists as a paradigm that is engaged in the development of theory, not just the evaluation of aspects of its usage (Bell, 2004 ; Design-Based Research Collective, 2003 ; Lesh & Kelly, 2000 ; van den Akker, 1999 ).

Effectively, where DBR is taking place, theory as a lived possibility is under examination. Specifically, in most DBR, this means a focus on “intermediate-level” theories of learning, rather than “grand” ones. In essence, DBR does not contend directly with “grand” learning theories (such as developmental or sociocultural theory writ large) (diSessa, 1991 ). Rather, DBR seeks to offer constructive insights by directly engaging with particular learning processes that flow from these theories on a “grounded,” “intermediate” level. This is not, however, to say DBR is limited in what knowledge it can produce; rather, tinkering in this “intermediate” realm can produce knowledge that informs the “grand” theory (Gravemeijer, 1994 ). For example, while cognitive and motivational psychology provide “grand” theoretical frames, interest-driven learning (IDL) is an “intermediate” theory that flows from these and can be explored in DBR to both inform the development of IDL designs in practice and inform cognitive and motivational psychology more broadly (Joseph, 2004 ).

Crucially, however, DBR entails putting the theory in question under intense scrutiny, or, “into harm’s way” (Cobb et al., 2003). This element is especially central to DBR, and one that distinguishes it from the proliferation of educational-reform and educational-entrepreneurship efforts that similarly take up the discourse of “design” and “innovation.” Not only is the reflexive, often participatory element of DBR—that is, questioning and modifying the design to suit the learning needs of the context and partners—absent from such efforts, but the theory driving these efforts is never in question, and in many cases, may be actively obscured. Indeed, it is more common to see educational-entrepreneur design innovations seek to modify a context—such as the way charter schools engage in selective pupil recruitment and intensive disciplinary practices (e.g., Carnoy et al., 2005; Ravitch, 2010; Saltman, 2007)—rather than modify their design itself, and thus allow for humility in their theory. Such “innovations” and “design” efforts are distinct from DBR, which must, in the spirit of scientific inquiry, be willing to see the learning theory flail and struggle, be modified, and evolve.

This growth and evolution of theory and knowledge is of course central to DBR as a rigorous research paradigm, moving it beyond simply the design of local educational programs, interventions, or innovations. As Barab and Squire (2004) explain:

Design-based research requires more than simply showing a particular design works but demands that the researcher (move beyond a particular design exemplar to) generate evidence-based claims about learning that address contemporary theoretical issues and further the theoretical knowledge of the field. (pp. 5–6)

DBR as a research paradigm offers a design process through which theories of learning can be tested and modified; by allowing them to operate with humility in situated conditions, new insights and knowledge, even new theories, may emerge that inform the field, as well as the efforts and directions of other types of research inquiry. These productive, theory-developing outcomes, or “ontological innovations” (diSessa & Cobb, 2004), represent the culmination of an effective program of DBR—the production of new ways to understand, conceptualize, and enact learning as a lived, contextual process.

Design research works to understand learning processes, and the design that supports them in situated contexts

As a research methodology that operates by tinkering with “grounded” learning theories, DBR is itself grounded, and seeks to develop its knowledge claims and designs in naturalistic, situated contexts (Brown, 1992 ). This is, again, a distinguishing element of DBR—setting it apart from laboratory research efforts involving design and interventions in closed, controlled environments. Rather than attempting to focus on singular variables, and isolate these from others, DBR is concerned with the multitude of variables that naturally occur across entire learning ecologies, and present themselves in distinct ways across multiple planes of possible examination (Rogoff, 1995 ; Collins, Joseph, & Bielaczyc, 2004 ). Certainly, specific variables may be identified as dependent, focal units of analysis, but identifying (while not controlling for) the variables beyond these, and analyzing their impact on the design and learning outcomes, is an equally important process in DBR (Collins et al., 2004 ; Barab & Kirshner, 2001 ). In practice, this of course varies across iterations in its depth and breadth. Traditional models of developmental or cognitive DBR may look to account for the complexity and nuance of a setting’s social, developmental, institutional, and intellectual characteristics (e.g., Brown, 1992 ; Cobb et al., 2003 ), while more recent, critical iterations will give increased attention to how historicity, power, intersubjectivity, and culture, among other things, influence and shape a setting, and the learning that occurs within it (e.g., Gutiérrez, 2016 ; Vakil, de Royston, Nasir, & Kirshner, 2016 ).

Beyond these variations, what counts as “design” in DBR varies widely, and so too will what counts as a naturalistic setting. It has been well documented that learning occurs all the time, every day, and in every space imaginable, both formal and informal (Leander, Phillips, & Taylor, 2010), and in ways that span strictly defined setting boundaries (Engeström, Engeström, & Kärkkäinen, 1995). DBR may take place in any number of contexts, based on the types of questions asked, and the learning theories and processes that a researcher may be interested in exploring. DBR may involve one-to-one tutoring and learning settings, single classrooms, community spaces, entire institutions, or even holistically designed ecologies (Design-Based Research Collective, 2003; Engeström, 2008; Virkkunen & Newnham, 2013). In all these cases, even the most completely designed experimental ecology, the setting remains naturalistic and situated because DBR actively embraces the uncontrollable variables that participants bring with them to the learning process from their situated worlds, lives, and experiences—no effort is made to control for these complicated influences of life, only to understand how they operate in a given ecology as innovation is attempted. Thus, the extent of the design reflects a broader range of qualitative and theoretical study, rather than an attempt to control or isolate some particular learning process from outside influence.

While there is much variety in what design may entail, where DBR takes place, what types of learning ecologies are under examination, and what methods are used, situated ecologies are always the setting of this work. In this way, conscious of naturalistic variables, and the influences that culture, historicity, participation, and context have on learning, researchers can use DBR to build on prior research, and extend knowledge around the learning that occurs in the complexity of situated contexts and lived practices (Collins et al., 2004 ).

Design-based research is iterative; it changes, grows, and evolves to meet the needs and emergent questions of the context, and this tinkering process is part of the research

The final shared element undergirding models of DBR is that it is an iterative, active, and interventionist process, interested in and focused on producing educational innovation by actually and actively putting design innovations into practice (Brown, 1992; Collins, 1992; Gutiérrez, 2008). Given this interventionist, active stance, tinkering with the design and the theory of learning informing the design is as much a part of the research process as the outcome of the intervention or innovation itself—we learn what impacts learning as much as, if not more than, we learn what was learned. In this sense, DBR involves a focus on analyzing the theory-driven design itself, and its implementation, as an object of study (Edelson, 2002; Penuel, Fishman, Cheng, & Sabelli, 2011), and is ultimately interested in the improvement of the design—of how it unfolds, how it shifts, how it is modified, and made to function productively for participants in their contexts and given their needs (Kirshner & Polman, 2013).

While DBR is iterative and contextual as a foundational methodological principle, what this means varies across conceptions of DBR. For instance, in more traditional models, Brown and Campione ( 1996 ) pointed out the dangers of “lethal mutation” in which a design, introduced into a context, may become so warped by the influence, pressures, incomplete implementation, or misunderstanding of participants in the local context, that it no longer reflects or tests the theory under study. In short, a theory-driven intervention may be put in place, and then subsumed to such a degree by participants based on their understanding and needs, that it remains the original innovative design in name alone. The assertion here is that in these cases, the research ceases to be DBR in the sense that the design is no longer central, actively shaping learning. We cannot, they argue, analyze a design—and the theory it was meant to reflect—as an object of study when it has been “mutated,” and it is merely a banner under which participants are enacting their idiosyncratic, pragmatic needs.

While the ways in which settings and individuals might disrupt designs intended to produce robust learning certainly present a tension to be cautious of in DBR, it is also worth noting that in many critical approaches to DBR, such mutations—whether “lethal” to the original design or not—are seen as compelling and important moments. Here, where collaboration and community input are more central to the design process, iterative is understood differently. Thus, a “mutation” becomes a point where reflexivity, tension, and contradiction might open the door for change, for new designs, for reconsiderations of researcher and collaborative partner positionalities, or for ethnographic exploration into how a context takes up, shapes, and ultimately engages innovations in a particular sociocultural setting. In short, accounting for and documenting changes in design is a vital part of the DBR process, allowing researchers to respond to context in a variety of ways, always striving for their theories and designs to act with humility, and in the interest of usefulness.

With this in mind, the iterative nature of DBR means that the relationships researchers have with other design partners (educators and learners) in the ecology are incredibly important, and vital to consider (Bang et al., 2016 ; Engeström, 2007 ; Engeström, Sannino, & Virkkunen, 2014 ). Different iterations of DBR might occur in ways in which the researcher is more or less intimately involved in the design and implementation process, both in terms of actual presence and intellectual ownership of the design. Regarding the former, in some cases, a researcher may hand a design off to others to implement, periodically studying and modifying it, while in other contexts or designs, the researcher may be actively involved, tinkering in every detail of the implementation and enactment of the design. With regard to the latter, DBR might similarly range from a somewhat prescribed model, in which the researcher is responsible for the original design, and any modifications that may occur based on their analyses, without significant input from participants (e.g., Collins et al., 2004 ), to incredibly participatory models, in which all parties (researchers, educators, learners) are part of each step of the design-creation, modification, and research process (e.g., Bang, Faber, Gurneau, Marin, & Soto, 2016 ; Kirshner, 2015 ).

Considering the wide range of ideological approaches and models for DBR, we might acknowledge that DBR can be gainfully conducted through many iterations of “openness” to the design process. However, the strength of the research—focused on analyzing the design itself as a unit of study reflective of learning theory—will be bolstered by thoughtfully accounting for how involved the researcher will be, and how open to participation the modification process is. These answers should match the types of questions, and conceptual or ideological framing, with which researchers approach DBR, allowing them to tinker with the process of learning as they build on prior research to extend knowledge and test theory (Barab & Kirshner, 2001 ), while thoughtfully documenting these changes in the design as they go.

Implementation and Research Design

As with the overarching principles of design-based research (DBR), even amid the pluralism of conceptual frameworks of DBR researchers, it is possible, and useful, to trace the shared contours in how DBR research design is implemented. Though texts provide particular road maps for undertaking various iterations of DBR consistent with the specific goals, types of questions, and ideological orientations of these scholarly communities (e.g., Cole & Engeström, 2007 ; Collins, Joseph, & Bielaczyc, 2004 ; Fishman, Penuel, Allen, Cheng, & Sabelli, 2013 ; Gutiérrez & Jurow, 2016 ; Virkkunen & Newnham, 2013 ), certain elements, realized differently, can be found across all of these models, and may be encapsulated in five broad methodological phases.

Considering the Design Focus

DBR begins by considering what the focus of the design, the situated context, and the units of analysis for research will be. Prospective DBR researchers will need to consider broader research on the “grand” theory of learning with which they work, determine what theoretical questions they have or identify “intermediate” aspects of the theories that might be studied and strengthened by a design process in situated contexts, and decide what planes of analysis (Rogoff, 1995) will be most suitable for examination. This process allows for the identification of the critical theoretical elements of a design, and the articulation of initial research questions.

Given the conceptual framework, theoretical and research questions, and sociopolitical interests at play, researchers may undertake this, and subsequent steps in the process, on their own, or in close collaboration with the communities and individuals in the situated contexts in which the design will unfold. As such, across iterations of DBR, and with respect to the ways DBR researchers choose to engage with communities, the origin of the design will vary, and might begin in some cases with theoretical questions, or arise in others as a problem of practice (Coburn & Penuel, 2016 ), though as has been noted, in either case, theory and practice are necessarily linked in the research.

Creating and Implementing a Designed Innovation

From the consideration and identification of the critical elements, planned units of analysis, and research questions that will drive a design, researchers can then actively create (either on their own or in conjunction with potential design partners) a designed intervention reflecting these critical elements, and the overarching theory.

Here, the DBR researcher should consider what partners exist in the process and what ownership exists around these partnerships, determine exactly what the pragmatic features of the intervention/design will be and who will be responsible for them, and consider when checkpoints for modification and evaluation will be undertaken, and by whom. Additionally, researchers should at this stage consider questions of timeline and of recruiting participants, as well as what research materials will be needed to adequately document the design, its implementation, and its outcomes, and how and where collected data will be stored.

Once a design (the planned, theory-informed innovative intervention) has been produced, the DBR researcher and partners can begin the implementation process, putting the design into place and beginning data collection and documentation.

Assessing the Impact of the Design on the Learning Ecology

Chronologically, the next two methodological steps happen recursively in the iterative process of DBR. The researcher must assess the impact of the design, and then, make modifications as necessary, before continuing to assess the impact of these modifications. In short, these next two steps are a cycle that continues across the life and length of the research design.

Once a design has been created and implemented, the researcher begins to observe and document the learning, the ecology, and the design itself. Guided by and in conversation with the theory and critical elements, the researcher should periodically engage in ongoing data analysis, assessing the success of the design, and of learning, paying equal attention to the design itself, and how its implementation is working in the situated ecology.

Within the realm of qualitative research, measuring or assessing variables of learning and assessing the design may look vastly different, require vastly different data-collection and data-analysis tools, and involve vastly different research methods among different researchers.

Modifying the Design

Modification, based on ongoing assessment of the design, is what makes DBR iterative, helping the researcher extend the field’s knowledge about the theory, design, learning, and the context under examination.

Modification of the design can take many forms, from complete changes in approach or curriculum, to introducing an additional tool or mediating artifact into a learning ecology. Moreover, how modification unfolds involves careful reflection from the researcher and any co-designing participants, deciding whether modification will be an ongoing, reflexive, tinkering process, or if it will occur only at predefined checkpoints, after formal evaluation and assessment. Questions of ownership, issues of resource availability, technical support, feasibility, and communication are all central to the work of design modification, and answers will vary given the research questions, design parameters, and researchers’ epistemic commitments.

Each moment of modification indicates a new phase in a DBR project, and a new round of assessing—through data analysis—the impact of the design on the learning ecology, either to guide continued or further modification, report the results of the design, or in some cases, both.

Reporting the Results of the Design

The final step in DBR methodology is to report on the results of the designed intervention, how it contributed to understandings of theory, and how it impacted the local learning ecology or context. The format, genre, and final data analysis methods used in reporting data and research results will vary across iterations of DBR. However, it is largely understood that to avoid methodological confusion, DBR researchers should clearly situate themselves in the DBR paradigm by clearly describing and detailing the design itself; articulating the theory, central elements, and units of analysis under scrutiny, what modifications occurred and what precipitated these changes, and what local effects were observed; and exploring any potential contributions to learning theory, while accounting for the context and their interventionist role and positionality in the design. As such, careful documentation of pragmatic and design decisions for retrospective data analysis, as well as research findings, should be done at each stage of this implementation process.

Methodological Issues in the Design-Based Research Paradigm

Because of its pluralistic nature, its interventionist, nontraditional stance, and the fact that it remains in its conceptual infancy, design-based research (DBR) is replete with ongoing methodological questions and challenges, both from external and internal sources. While many more may exist, several of the most pressing will be addressed here: those the prospective DBR researcher may encounter, or want to consider in understanding the paradigm and beginning a research design.

Challenges to Rigor and Validity

Perhaps the place to begin this reflection on tensions in the DBR paradigm is the recurrent and ongoing challenge to the rigor and validity of DBR, which has asked: Is DBR research at all? Given the interventionist and activist way in which DBR invites the researcher to participate, and the shift in orientation from long-accepted research paradigms, such critiques are hardly surprising, and fall in line with broader challenges to the rigor and objectivity of qualitative social science research in general. Historically, such complaints about DBR are linked to decades of critique of any research that does not adhere to the post-positivist approach set out as the U.S. Department of Education began to prioritize laboratory and large-scale randomized control-trial experimentation as the “gold standard” of research design (e.g., Mosteller & Boruch, 2002).

From the outset, DBR, as an interventionist, local, situated, non-laboratory methodology, was bound to run afoul of such conservative trends. While some researchers involved in (particularly traditional developmental and cognitive) DBR have found broader acceptance within these constraints, the rigor of DBR remains contested. It has been suggested that DBR is under-theorized and over-methodologized, a haphazard way for researchers to do activist work without engaging in the development of robust knowledge claims about learning (Dede, 2004), and an approach lacking in coherence that sheltered interventionist projects of little impact on learning theory and allowed researchers to make subjective, pet claims through selective analysis of large bodies of collected data (Kelly, 2003, 2004).

These critiques, however, impose an external set of criteria on DBR, desiring it to fit into the molds of rigor and coherence as defined by canonical methodologies. Bell (2004) and Bang and Vossoughi (2016) have made compelling cases for the wide variety of methods and approaches present in DBR not as a fracturing, but as a generative proliferation of different iterations that can offer powerful insights around the different types of questions that exist about learning in the infinitely diverse settings in which it occurs. Essentially, researchers have argued that within the DBR paradigm, and indeed within educational research more generally, the practical impact of research on learning, context, and practices should be a necessary component of rigor (Gutiérrez & Penuel, 2014), and the pluralism of methods and approaches available in DBR ensures that the practical impacts and needs of the varied contexts in which the research takes place will always drive the design and research tools.

These moves are emblematic of the way in which DBR is innovating and pushing on paradigms of rigor in educational research altogether, reflecting how DBR fills a complementary niche with respect to other methodologies and attends to elements and challenges of learning in lived, real environments that other types of research have consistently and historically missed. Beyond this, Brown (1992) was conscious of the concerns around data collection, validity, rigor, and objectivity from the outset, identifying as the Bartlett Effect the dilemma that a design is likely to generate an incredible amount of data, only a small fraction of which can be reported and shared, potentially leading to selective data analysis and use (Brown, 1992). Since that time, DBR researchers have been aware of this challenge, actively seeking ways to mitigate this threat to validity by making data sets broadly available; documenting their design, tinkering, and modification processes; clearly situating and describing disconfirming evidence and their own position in the research; and otherwise presenting the broad scope of human and learning activity that occurs within designs in large learning ecologies as comprehensively as possible.

Ultimately, however, these responses are likely to always be insufficient as evidence of rigor to some, for the root dilemma is around what “counts” as education science. While researchers interested and engaged in DBR ought rightly to continue to push themselves to ensure the methodological rigor of their work and chosen methods, it is also worth noting that DBR should seek to hold itself to its own criteria of assessment. This reflects broader trends in qualitative educational research that push back on narrow constructions of what “counts” as science, recognizing the ways in which new methodologies and approaches to research can help us examine aspects of learning, culture, and equity that have continued to be blind spots for traditional education research; invite new voices and perspectives into the process of achieving rigor and validity (Erickson & Gutiérrez, 2002); bolster objectivity by bringing it into conversation with the positionality of the researcher (Harding, 1993); and, perhaps most important, engage in axiological innovation (Bang, Faber, Gurneau, Marin, & Soto, 2016), or the exploration of and design for what is “good, right, true, and beautiful . . . in cultural ecologies” (p. 2).

Questions of Generalizability and Usefulness

The generalizability of research results in DBR has been an ongoing and contentious issue in the development of the paradigm. Indeed, by the standards of canonical methods (e.g., laboratory experimentation, ethnography), these local, situated interventions should lack generalizability. While there is reason to discuss and question the merit of generalizability as a goal of qualitative research at all, researchers in the DBR paradigm have long been conscious of this issue. The question of generalizability in DBR, and how the paradigm has responded to it, can be understood in two ways.

First, by distinguishing questions specific to a particular design from questions about the generalizability of the underlying theory. Cole’s 5th Dimension work (Cole & Underwood, 2013), and its nationwide network of linked, theoretically similar sites operating with vastly different designs, is a powerful example of this approach to generalizability. Rather than focus on a single, unitary, potentially generalizable design, the project is more interested in the variability and sustainability of designs across local contexts (e.g., Cole, 1995; Gutiérrez, Bien, Selland, & Pierce, 2011; Jurow, Tracy, Hotchkiss, & Kirshner, 2012). Through attention to sustainable, locally effective innovations, conscious of the wide variation in culture and context that accompanies any and all learning processes, 5th Dimension sites each derive their idiosyncratic structures from sociocultural theory, sharing some elements but varying others, while seeking their own “ontological innovations” based on the affordances of their contexts. This pattern reflects a key element of much of the DBR paradigm: questions of generalizability in DBR may be about the generalizability of the theory of learning, and the variability of learning and design in distinct contexts, rather than the particular design itself.

A second means of addressing generalizability in DBR has been to embrace the pragmatic impacts of designing innovations. This response stems from Messick’s (1992) and Schoenfeld’s (1992) arguments, early in the development of DBR, that the consequentialness and validity of DBR efforts as potentially generalizable research depend on the “usefulness” of the theories and designs that emerge. Effectively, because DBR is the examination of situated theory, a design must be able to show pragmatic impact—it must succeed at showing the theory to be useful. If there is evidence of usefulness both to the context in which it takes place and to the field of educational research more broadly, then the DBR researcher can stake some broader knowledge claims that might be generalizable. As a result, the DBR paradigm tends to “treat changes in [local] contexts as necessary evidence for the viability of a theory” (Barab & Squire, 2004, p. 6). This of course does not mean that DBR is only interested in successful efforts; a design that fails or struggles can provide important information and knowledge to the field. Ultimately, though, DBR tends to privilege work that proves the usefulness of designs, whose pragmatic or theoretical findings can then be generalized within the learning sciences and education research fields.

With this said, the question of usefulness is not always straightforward, and is hardly unitary. While many DBR efforts—particularly those situated in developmental and cognitive learning science traditions—are interested in the generalizability of their useful educational designs (Barab & Squire, 2004; Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; Joseph, 2004; Steffe & Thompson, 2000), not all are. Critical DBR researchers have noted that if usefulness remains situated in the extant sociopolitical and sociocultural power structures—dominant conceptual and popular definitions of what useful educational outcomes are—the result will be a bar for research merit that inexorably bends toward the positivist spectrum (Booker & Goldman, 2016; Dominguez, 2015; Zavala, 2016). This could likely result in excluding the non-normative interventions and innovations that are vital for historically marginalized communities, but which might have vastly different-looking outcomes that are nonetheless useful in the sociopolitical contexts in which they occur. Alternative framings push on and extend this idea of usefulness, seeking to involve the perspectives and agency of situated community partners and their practices in what “counts” as generative and rigorous research outcomes (Gutiérrez & Penuel, 2014). An example in this regard is the idea of consequential knowledge (Hall & Jurow, 2015; Jurow & Shea, 2015), which suggests that outcomes that are consequential will be taken up by participants in and across their networks, and over time—thus a goal of consequential knowledge certainly meets the standard of being useful, but it also implicates the needs and agency of communities in determining the success and merit of a design or research endeavor in important ways that strict usefulness may miss.

Thus, the bar of usefulness that characterizes the DBR paradigm should not be approached without critical reflection. Certainly, designs that accomplish little for local contexts should be subject to intense questioning and critique, but the sociopolitical and systemic factors that might influence what “counts” as useful, in local contexts and in education science more generally, should be kept firmly in mind when designing, choosing methods, and evaluating impacts (Zavala, 2016). Researchers should think deeply about their goals, whether they are reaching for generalizability at all, and in what ways they are constructing contextual definitions of success, and should be clear about these ideologically influenced answers in their work, such that the generalizability and usefulness of designs can be adjudicated in conversation with the intentions and conceptual framework of the research and researcher.

Ethical Concerns of Sustainability, Participation, and Telos

While there are many external challenges to the rigor and validity of DBR, another set of tensions comes from within the DBR paradigm itself. These internal critiques concern research ethics rather than rigor or validity; they are not unrelated to the earlier question of the contested definition of usefulness, and they grow from ideological concerns with how an intentional, interventionist stance is taken up in research as it interacts with situated communities.

Given that the nature of DBR is to design and implement some form of educational innovation, the DBR researcher will in some way be engaging with an individual or community, becoming part of a situated learning ecology, complete with a sociopolitical and cultural history. As with any research that involves providing an intervention or support, the question of what happens when the research ends is as much an ethical as a methodological one. Concerns then arise given how traditional models of DBR seem intensely focused on creating and implementing a “complete” cycle of design, but give little attention to what happens to the community and context afterward (Engeström, 2011). In contrast to this privileging of “completeness,” sociocultural and critical approaches to DBR have suggested that if research is actually happening in naturalistic, situated contexts that authentically recognize and allow social and cultural dimensions to function (i.e., avoid laboratory-type controls to mitigate independent variables), there can never be such a thing as “complete,” for the design will, and should, live on as part of the ecology of the space (Cole, 2007; Engeström, 2000). Essentially, these internal critiques push DBR to consider sustainability, and sustainable scale, as concerns equally important as the completeness of an innovation. Not only are ethical questions involved, but accounting for the unbounded and ongoing nature of learning as a social and cultural activity can help strengthen the viability of knowledge claims made, and what degree of generalizability is reasonably justified.

Related to this question of sustainability are internal concerns regarding the nature and ethics of participation in DBR: whether partners in a design are being adequately invited to engage in the design and modification processes that will unfold in their situated contexts and lived communities (Bang et al., 2016; Engeström, 2011). DBR has actively sought to examine the multiple planes of analysis at work in a learning ecology, but has rarely attended to the subject-subject dynamics (Bang et al., 2016), or “relational equity” (DiGiacomo & Gutiérrez, 2015), that exist between researchers and participants as a point of focus. Participatory design research (PDR) models (Bang & Vossoughi, 2016) have recently emerged as a way to better attend to these important dimensions of collective participation (Engeström, 2007), power (Vakil et al., 2016), positionality (Kirshner, 2015), and relational agency (Edwards, 2007, 2009; Sannino & Engeström, 2016) as they unfold in DBR.

Both of these ethical questions—around sustainability and participation—reflect challenges to what we might call the telos—or direction—that DBR takes to innovation and research. These are questions related to whose voices are privileged, in what ways, for what purposes, and toward what ends. While DBR, like many other forms of educational research, has involved work with historically marginalized communities, it has, like many other forms of educational research, not always done so in humanizing ways. Put another way, there are ethical and political questions surrounding whether the designs, goals, and standards of usefulness we apply to DBR efforts should be purposefully activist, and have explicitly liberatory ends. To this point, critical and decolonial perspectives have pushed on the DBR paradigm, suggesting that DBR should situate itself as a space of liberatory innovation and potential, in which communities and participants can become designers and innovators of their own futures (Gutiérrez, 2005). This perspective is reflected in the social design experiment (SDE) approach to DBR (Gutiérrez, 2005, 2008; Gutiérrez & Vossoughi, 2010; Gutiérrez, 2016; Gutiérrez & Jurow, 2016), which begins in participatory fashion, engaging a community in identifying its own challenges and desires, and reflecting on the historicity of learning practices, before proleptic design efforts are undertaken that ensure that research is done with, not on, communities of color (Arzubiaga, Artiles, King, & Harris-Murri, 2008), and is intentionally focused on liberatory goals.

Global Perspectives and Unique Iterations

While design-based research (DBR) has been a methodology principally associated with educational research in the United States, its development is hardly limited to the U.S. context. Rather, while DBR emerged in U.S. settings, similar methods of situated, interventionist research focused on design and innovation were emerging in parallel in European contexts (e.g., Gravemeijer, 1994), most significantly in the work of Vygotskian scholars both in Europe and the United States (Cole, 1995; Cole & Engeström, 1993, 2007; Engeström, 1987).

Particularly, where DBR began in the epistemic and ontological terrain of developmental and cognitive psychology, this vein of design-based research began deeply grounded in cultural-historical activity theory (CHAT). This ontological and epistemic grounding meant that the approach to design was more intensively conscious of context, historicity, hybridity, and relational factors, and was framed around understanding learning as a complex, collective activity system that, through design, could be modified and transformed (Cole & Engeström, 2007). Two models of DBR emerged in this context abroad. The formative intervention (Engeström, 2011; Engeström, Sannino, & Virkkunen, 2014) relies heavily on Vygotskian double stimulation to approach learning in nonlinear, unbounded ways, accounting for the role of learner, educator, and researcher in a collective process, shifting, evolving, and tinkering with the design as the context needs and demands. The Change Laboratory (Engeström, 2008; Virkkunen & Newnham, 2013) similarly relies on the principle of double stimulation, while presenting a holistic way to approach transforming—or changing—entire learning activity systems in fundamental ways through designs that encourage collective “expansive learning” (Engeström, 2001), through which participants can produce wholly new activity systems as the object of learning itself.

Elsewhere in the United States, still parallel to the developmental- or cognitive-oriented DBR work that was occurring, American researchers employing CHAT began to leverage the tools and aims of expansive learning in conversation with the tensions and complexity of the U.S. context (Cole, 1995; Gutiérrez, 2005; Gutiérrez & Rogoff, 2003). Like the CHAT design research of the European context, there was a focus on activity systems, historicity, nonlinear and unbounded learning, and collective learning processes and outcomes. Rather than a simple replication, however, these researchers put further attention on questions of equity, diversity, and justice in this work, as Gutiérrez, Engeström, and Sannino (2016) note:

The American contribution to a cultural historical activity theoretic perspective has been its attention to diversity, including how we theorize, examine, and represent individuals and their communities. (p. 276)

Effectively, CHAT scholars in parts of the United States brought critical and decolonial perspectives to bear on their design-focused research, focusing explicitly on the complex cultural, racial, and ethnic terrain in which they worked, and ensuring that diversity, equity, justice, and non-dominant perspectives would become central principles to the types of design research conducted. The result was the emergence of the aforementioned social design experiment (e.g., Gutiérrez, 2005, 2016) and participatory design research (Bang & Vossoughi, 2016) models, which attend intentionally to historicity and relational equity, tailor their methods to the liberation of historically marginalized communities, aim intentionally for liberatory outcomes as key elements of their design processes, and seek to produce outcomes in which communities of learners become designers of new community futures (Gutiérrez, 2016). While these approaches emerged in the United States, their origins reflect ontological and ideological perspectives quite distinct from more traditional learning science models of DBR, and from dominant U.S. ontologies in general. Indeed, these iterations of DBR are linked genealogically to the ontologies, ideologies, and concerns of peoples in the Global South, offering some promise for the method in those regions, though DBR has yet to broadly take hold among researchers beyond the United States and Europe.

There is, of course, much more nuance to these models, and each of these models (formative interventions, Change Laboratories, social design experiments, and participatory design research) might itself merit independent exploration and review well beyond the scope here. Indeed, there is some question as to whether all adherents of these CHAT design-based methodologies, with their unique genealogies and histories, would even consider themselves under the umbrella of DBR. Yet, despite significant ontological divergences, these iterations share many of the same foundational tenets of the traditional models (though realized differently), and it is reasonable to argue that they do indeed share the same, broad methodological paradigm (DBR), or at the very least, are so intimately related that any discussion of DBR, particularly one with a global view, should consider the contributions CHAT iterations have made to the DBR methodology in the course of their somewhat distinct, but parallel, development.

Possibilities and Potentials for Design-Based Research

Since its emergence in 1992, the DBR methodology for educational research has continued to grow in popularity, ubiquity, and significance. Its use has begun to expand beyond the confines of the learning sciences, taken up by researchers in a variety of disciplines, and across a breadth of theoretical and intellectual traditions. While still not as widely recognized as more traditional and well-established research methodologies, DBR as a methodology for rigorous research is unquestionably here to stay.

With this in mind, the field ought to still be cautious of the ways in which the discourse of design is used. Not all design is DBR, and preserving the integrity, rigor, and research ethics of the paradigm (on its own terms) will continue to require thoughtful reflection as its pluralistic parameters come into clearer focus. Yet the proliferation of methods in the DBR paradigm should be seen as a positive. There are far too many theories of learning and ideological perspectives that have meaningful contributions to make to our knowledge of the world, communities, and learning to limit ourselves to a unitary approach to DBR, or set of methods. The paradigm has shown itself to have some core methodological principles, but there is no reason not to expect these to grow, expand, and evolve over time.

In an increasingly globalized, culturally diverse, and dynamic world, there is tremendous potential for innovation couched in this proliferation of DBR. Particularly in historically marginalized communities and across the Global South, we will need to know how learning theories can be lived out in productive ways in communities that have been understudied, and under-engaged. The DBR paradigm generally, and critical and CHAT iterations particularly, can fill an important need for participatory, theory-developing research in these contexts that simultaneously creates lived impacts. Participatory design research (PDR), social design experiments (SDE), and Change Laboratory models of DBR should be of particular interest and attention moving forward, as current trends toward culturally sustaining pedagogies and learning will need to be explored in depth and in close collaboration with communities, as participatory design partners, in the press toward liberatory educational innovations.

Bibliography

The following special issues of journals are recommended as starting points for engaging more deeply with current and past trends in design-based research.

  • Bang, M. , & Vossoughi, S. (Eds.). (2016). Participatory design research and educational justice: Studying learning and relations within social change making [Special issue]. Cognition and Instruction , 34 (3).
  • Barab, S. (Ed.). (2004). Design-based research [Special issue]. Journal of the Learning Sciences , 13 (1).
  • Cole, M. , & The Distributed Literacy Consortium. (2006). The Fifth Dimension: An after-school program built on diversity . New York, NY: Russell Sage Foundation.
  • Kelly, A. E. (Ed.). (2003). Special issue on the role of design in educational research [Special issue]. Educational Researcher , 32 (1).
  • Arzubiaga, A. , Artiles, A. , King, K. , & Harris-Murri, N. (2008). Beyond research on cultural minorities: Challenges and implications of research as situated cultural practice. Exceptional Children , 74 (3), 309–327.
  • Bang, M. , Faber, L. , Gurneau, J. , Marin, A. , & Soto, C. (2016). Community-based design research: Learning across generations and strategic transformations of institutional relations toward axiological innovations. Mind, Culture, and Activity , 23 (1), 28–41.
  • Bang, M. , & Vossoughi, S. (2016). Participatory design research and educational justice: Studying learning and relations within social change making. Cognition and Instruction , 34 (3), 173–193.
  • Barab, S. , Kinster, J. G. , Moore, J. , Cunningham, D. , & The ILF Design Team. (2001). Designing and building an online community: The struggle to support sociability in the Inquiry Learning Forum. Educational Technology Research and Development , 49 (4), 71–96.
  • Barab, S. , & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences , 13 (1), 1–14.
  • Barab, S. A. , & Kirshner, D. (2001). Methodologies for capturing learner practices occurring as part of dynamic learning environments. Journal of the Learning Sciences , 10 (1–2), 5–15.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist , 39 (4), 243–253.
  • Bereiter, C. , & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 361–392). Hillsdale, NJ: Lawrence Erlbaum.
  • Booker, A. , & Goldman, S. (2016). Participatory design research as a practice for systemic repair: Doing hand-in-hand math research with families. Cognition and Instruction , 34 (3), 222–235.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences , 2 (2), 141–178.
  • Brown, A. , & Campione, J. C. (1996). Psychological theory and the design of innovative learning environments: On procedures, principles, and systems. In L. Schauble & R. Glaser (Eds.), Innovations in learning: New environments for education (pp. 289–325). Mahwah, NJ: Lawrence Erlbaum.
  • Brown, A. L. , & Campione, J. C. (1998). Designing a community of young learners: Theoretical and practical lessons. In N. M. Lambert & B. L. McCombs (Eds.), How students learn: Reforming schools through learner-centered education (pp. 153–186). Washington, DC: American Psychological Association.
  • Brown, A. , Campione, J. , Webber, L. , & McGilley, K. (1992). Interactive learning environments—A new look at learning and assessment. In B. R. Gifford & M. C. O’Connor (Eds.), Future assessment: Changing views of aptitude, achievement, and instruction (pp. 121–211). Boston, MA: Academic Press.
  • Carnoy, M. , Jacobsen, R. , Mishel, L. , & Rothstein, R. (2005). The charter school dust-up: Examining the evidence on enrollment and achievement . Washington, DC: Economic Policy Institute.
  • Carspecken, P. (1996). Critical ethnography in educational research . New York, NY: Routledge.
  • Cobb, P. , Confrey, J. , diSessa, A. , Lehrer, R. , & Schauble, L. (2003). Design experiments in educational research. Educational Researcher , 32 (1), 9–13.
  • Cobb, P. , & Steffe, L. P. (1983). The constructivist researcher as teacher and model builder. Journal for Research in Mathematics Education , 14 , 83–94.
  • Coburn, C. , & Penuel, W. (2016). Research-practice partnerships in education: Outcomes, dynamics, and open questions. Educational Researcher , 45 (1), 48–54.
  • Cole, M. (1995). From Moscow to the Fifth Dimension: An exploration in romantic science. In M. Cole & J. Wertsch (Eds.), Contemporary implications of Vygotsky and Luria (pp. 1–38). Worcester, MA: Clark University Press.
  • Cole, M. (1996). Cultural psychology: A once and future discipline . Cambridge, MA: Harvard University Press.
  • Cole, M. (2007). Sustaining model systems of educational activity: Designing for the long haul. In J. Campione , K. Metz , & A. S. Palinscar (Eds.), Children’s learning in and out of school: Essays in honor of Ann Brown (pp. 71–89). New York, NY: Routledge.
  • Cole, M. , & Engeström, Y. (1993). A cultural historical approach to distributed cognition. In G. Saloman (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 1–46). Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Engeström, Y. (2007). Cultural-historical approaches to designing for development. In J. Valsiner & A. Rosa (Eds.), The Cambridge handbook of sociocultural psychology , Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Underwood, C. (2013). The evolution of the 5th Dimension. In The Story of the Laboratory of Comparative Human Cognition: A polyphonic autobiography . https://lchcautobio.ucsd.edu/polyphonic-autobiography/section-5/chapter-12-the-later-life-of-the-5th-dimension-and-its-direct-progeny/ .
  • Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology (pp. 15–22). New York, NY: Springer-Verlag.
  • Collins, A. , Joseph, D. , & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences , 13 (1), 15–42.
  • Dede, C. (2004). If design-based research is the answer, what is the question? A commentary on Collins, Joseph, and Bielaczyc; DiSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS special issue on design-based research. Journal of the Learning Sciences , 13 (1), 105–114.
  • Design-Based Research Collective . (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher , 32 (1), 5–8.
  • DiGiacomo, D. , & Gutiérrez, K. D. (2015). Relational equity as a design tool within making and tinkering activities. Mind, Culture, and Activity , 22 (3), 1–15.
  • diSessa, A. A. (1991). Local sciences: Viewing the design of human-computer systems as cognitive science. In J. M. Carroll (Ed.), Designing interaction: Psychology at the human-computer interface (pp. 162–202). Cambridge, U.K.: Cambridge University Press.
  • diSessa, A. A. , & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. Journal of the Learning Sciences , 13 (1), 77–103.
  • diSessa, A. A. , & Minstrell, J. (1998). Cultivating conceptual change with benchmark lessons. In J. G. Greeno & S. Goldman (Eds.), Thinking practices (pp. 155–187). Mahwah, NJ: Lawrence Erlbaum.
  • Dominguez, M. (2015). Decolonizing teacher education: Explorations of expansive learning and culturally sustaining pedagogy in a social design experiment (Doctoral dissertation). University of Colorado, Boulder.
  • Edelson, D. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences , 11 (1), 105–121.
  • Edwards, A. (2007). Relational agency in professional practice: A CHAT analysis. Actio: An International Journal of Human Activity Theory , 1 , 1–17.
  • Edwards, A. (2009). Agency and activity theory: From the systemic to the relational. In A. Sannino , H. Daniels , & K. Gutiérrez (Eds.), Learning and expanding with activity theory (pp. 197–211). Cambridge, U.K.: Cambridge University Press.
  • Engeström, Y. (1987). Learning by expanding . Helsinki, Finland: University of Helsinki, Department of Education.
  • Engeström, Y. (2000). Can people learn to master their future? Journal of the Learning Sciences , 9 , 525–534.
  • Engeström, Y. (2001). Expansive learning at work: Toward an activity theoretical reconceptualization. Journal of Education and Work , 14 (1), 133–156.
  • Engeström, Y. (2007). Enriching the theory of expansive learning: Lessons from journeys toward co-configuration. Mind, Culture, and Activity , 14 (1–2), 23–39.
  • Engeström, Y. (2008). Putting Vygotsky to work: The Change Laboratory as an application of double stimulation. In H. Daniels , M. Cole , & J. Wertsch (Eds.), Cambridge companion to Vygotsky (pp. 363–382). New York, NY: Cambridge University Press.
  • Engeström, Y. (2011). From design experiments to formative interventions. Theory & Psychology , 21 (5), 598–628.
  • Engeström, Y. , Engeström, R. , & Kärkkäinen, M. (1995). Polycontextuality and boundary crossing in expert cognition: Learning and problem solving in complex work activities. Learning and Instruction , 5 (4), 319–336.
  • Engeström, Y. , & Sannino, A. (2010). Studies of expansive learning: Foundations, findings and future challenges. Educational Research Review , 5 (1), 1–24.
  • Engeström, Y. , & Sannino, A. (2011). Discursive manifestations of contradictions in organizational change efforts: A methodological framework. Journal of Organizational Change Management , 24 (3), 368–387.
  • Engeström, Y. , Sannino, A. , & Virkkunen, J. (2014). On the methodological demands of formative interventions. Mind, Culture, and Activity , 21 (2), 118–128.
  • Erickson, F. , & Gutiérrez, K. (2002). Culture, rigor, and science in educational research. Educational Researcher , 31 (8), 21–24.
  • Espinoza, M. (2009). A case study of the production of educational sanctuary in one migrant classroom. Pedagogies: An International Journal , 4 (1), 44–62.
  • Espinoza, M. L. , & Vossoughi, S. (2014). Perceiving learning anew: Social interaction, dignity, and educational rights. Harvard Educational Review , 84 (3), 285–313.
  • Fine, M. (1994). Dis-tance and other stances: Negotiations of power inside feminist research. In A. Gitlin (Ed.), Power and method (pp. 13–25). New York, NY: Routledge.
  • Fishman, B. , Penuel, W. , Allen, A. , Cheng, B. , & Sabelli, N. (2013). Design-based implementation research: An emerging model for transforming the relationship of research and practice. National Society for the Study of Education , 112 (2), 136–156.
  • Gravemeijer, K. (1994). Educational development and developmental research in mathematics education. Journal for Research in Mathematics Education , 25 (5), 443–471.
  • Gutiérrez, K. (2005). Intersubjectivity and grammar in the third space . Scribner Award Lecture.
  • Gutiérrez, K. (2008). Developing a sociocritical literacy in the third space. Reading Research Quarterly , 43 (2), 148–164.
  • Gutiérrez, K. (2016). Designing resilient ecologies: Social design experiments and a new social imagination. Educational Researcher , 45 (3), 187–196.
  • Gutiérrez, K. , Bien, A. , Selland, M. , & Pierce, D. M. (2011). Polylingual and polycultural learning ecologies: Mediating emergent academic literacies for dual language learners. Journal of Early Childhood Literacy , 11 (2), 232–261.
  • Gutiérrez, K. , Engeström, Y. , & Sannino, A. (2016). Expanding educational research and interventionist methodologies. Cognition and Instruction , 34 (2), 275–284.
  • Gutiérrez, K. , & Jurow, A. S. (2016). Social design experiments: Toward equity by design. Journal of Learning Sciences , 25 (4), 565–598.
  • Gutiérrez, K. , & Penuel, W. R. (2014). Relevance to practice as a criterion for rigor. Educational Researcher , 43 (1), 19–23.
  • Gutiérrez, K. , & Rogoff, B. (2003). Cultural ways of learning: Individual traits or repertoires of practice. Educational Researcher , 32 (5), 19–25.
  • Gutiérrez, K. , & Vossoughi, S. (2010). Lifting off the ground to return anew: Mediated praxis, transformative learning, and social design experiments. Journal of Teacher Education , 61 (1–2), 100–117.
  • Hall, R. , & Jurow, A. S. (2015). Changing concepts in activity: Descriptive and design studies of consequential learning in conceptual practices. Educational Psychologist , 50 (3), 173–189.
  • Harding, S. (1993). Rethinking standpoint epistemology: What is “strong objectivity”? In L. Alcoff & E. Potter (Eds.), Feminist epistemologies (pp. 49–82). New York, NY: Routledge.
  • Hoadley, C. (2002). Creating context: Design-based research in creating and understanding CSCL. In G. Stahl (Ed.), Computer support for collaborative learning 2002 (pp. 453–462). Mahwah, NJ: Lawrence Erlbaum.
  • Hoadley, C. (2004). Methodological alignment in design-based research. Educational Psychologist , 39 (4), 203–212.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist , 39 (4), 235–242.
  • Jurow, A. S. , & Shea, M. V. (2015). Learning in equity-oriented scale-making projects. Journal of the Learning Sciences , 24 (2), 286–307.
  • Jurow, S. , Tracy, R. , Hotchkiss, J. , & Kirshner, B. (2012). Designing for the future: How the learning sciences can inform the trajectories of preservice teachers. Journal of Teacher Education , 63 (2), 147–160.
  • Kärkkäinen, M. (1999). Teams as breakers of traditional work practices: A longitudinal study of planning and implementing curriculum units in elementary school teacher teams . Helsinki, Finland: University of Helsinki, Department of Education.
  • Kelly, A. (2004). Design research in education: Yes, but is it methodological? Journal of the Learning Sciences , 13 (1), 115–128.
  • Kelly, A. E. , & Sloane, F. C. (2003). Educational research and the problems of practice. Irish Educational Studies , 22 , 29–40.
  • Kirshner, B. (2015). Youth activism in an era of education inequality . New York: New York University Press.
  • Kirshner, B. , & Polman, J. L. (2013). Adaptation by design: A context-sensitive, dialogic approach to interventions. National Society for the Study of Education Yearbook , 112 (2), 215–236.
  • Leander, K. M. , Phillips, N. C. , & Taylor, K. H. (2010). The changing social spaces of learning: Mapping new mobilities. Review of Research in Education , 34 , 329–394.
  • Lesh, R. A. , & Kelly, A. E. (2000). Multi-tiered teaching experiments. In A. E. Kelly & R. A. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 197–230). Mahwah, NJ: Lawrence Erlbaum.
  • Matusov, E. (1996). Intersubjectivity without agreement. Mind, Culture, and Activity , 3 (1), 29–45.
  • Messick, S. (1992). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher , 23 (2), 13–23.
  • Mosteller, F. , & Boruch, R. F. (Eds.). (2002). Evidence matters: Randomized trials in education research . Washington, DC: Brookings Institution Press.
  • Newman, D. , Griffin, P. , & Cole, M. (1989). The construction zone: Working for cognitive change in school . London, U.K.: Cambridge University Press.
  • Penuel, W. R. , Fishman, B. J. , Cheng, B. H. , & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher , 40 (7), 331–337.
  • Polman, J. L. (2000). Designing project-based science: Connecting learners through guided inquiry . New York, NY: Teachers College Press.
  • Ravitch, D. (2010). The death and life of the great American school system: How testing and choice are undermining education . New York, NY: Basic Books.
  • Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context . New York, NY: Oxford University Press.
  • Rogoff, B. (1995). Observing sociocultural activity on three planes: Participatory appropriation, guided participation, and apprenticeship. In J. V. Wertsch , P. D. Rio , & A. Alvarez (Eds.), Sociocultural studies of mind (pp. 139–164). Cambridge U.K.: Cambridge University Press.
  • Saltman, K. J. (2007). Capitalizing on disaster: Taking and breaking public schools . Boulder, CO: Paradigm.
  • Salvador, T. , Bell, G. , & Anderson, K. (1999). Design ethnography. Design Management Journal , 10 (4), 35–41.
  • Sannino, A. (2011). Activity theory as an activist and interventionist theory. Theory & Psychology , 21 (5), 571–597.
  • Sannino, A. , & Engeström, Y. (2016). Relational agency, double stimulation and the object of activity: An intervention study in a primary school. In A. Edwards (Ed.), Working relationally in and across practices: Cultural-historical approaches to collaboration (pp. 58–77). Cambridge, U.K.: Cambridge University Press.
  • Scardamalia, M. , & Bereiter, C. (1991). Higher levels of agency for children in knowledge building: A challenge for the design of new knowledge media. Journal of the Learning Sciences , 1 , 37–68.
  • Schoenfeld, A. H. (1982). Measures of problem solving performance and of problem solving instruction. Journal for Research in Mathematics Education , 13 , 31–49.
  • Schoenfeld, A. H. (1985). Mathematical problem solving . Orlando, FL: Academic Press.
  • Schoenfeld, A. H. (1992). On paradigms and methods: What do you do when the ones you know don’t do what you want them to? Issues in the analysis of data in the form of videotapes. Journal of the Learning Sciences , 2 (2), 179–214.
  • Scribner, S. , & Cole, M. (1978). Literacy without schooling: Testing for intellectual effects. Harvard Educational Review , 48 (4), 448–461.
  • Shavelson, R. J. , Phillips, D. C. , Towne, L. , & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher , 32 (1), 25–28.
  • Steffe, L. P. , & Thompson, P. W. (2000). Teaching experiment methodology: Underlying principles and essential elements. In A. Kelly & R. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 267–307). Mahwah, NJ: Erlbaum.
  • Stevens, R. (2000). Divisions of labor in school and in the workplace: Comparing computer and paper-supported activities across settings. Journal of the Learning Sciences , 9 (4), 373–401.
  • Suchman, L. (1995). Making work visible. Communications of the ACM , 38 (9), 57–64.
  • Vakil, S. , de Royston, M. M. , Nasir, N. , & Kirshner, B. (2016). Rethinking race and power in design-based research: Reflections from the field. Cognition and Instruction , 34 (3), 194–209.
  • van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker , R. M. Branch , K. Gustafson , N. Nieveen , & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 1–14). Boston, MA: Kluwer Academic.
  • Virkkunen, J. , & Newnham, D. (2013). The Change Laboratory: A tool for collaborative development of work and education . Rotterdam, The Netherlands: Sense.
  • White, B. Y. , & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction , 16 , 3–118.
  • Zavala, M. (2016). Design, participation, and social change: What design in grassroots spaces can teach learning scientists. Cognition and Instruction , 34 (3), 236–249.

1. The reader should note the emergence of critical ethnography (e.g., Carspecken, 1996 ; Fine, 1994 ), and other more participatory models of ethnography that deviated from this traditional paradigm during this same time period. These new forms of ethnography comprised part of the genealogy of the more critical approaches to DBR, described later in this article.

2. The reader will also note that the adjective “qualitative” largely drops away from the acronym “DBR.” This is largely because, as described, DBR, as an exploration of naturalistic ecologies with multitudes of variables, and social and learning dynamics, necessarily demands a move beyond what can be captured by quantitative measurement alone. The qualitative nature of the research is thus implied and embedded as part of what makes DBR a unique and distinct methodology.

Related Articles

  • Qualitative Data Analysis
  • The Entanglements of Ethnography and Participatory Action Research (PAR) in Educational Research in North America
  • Writing Educational Ethnography
  • Qualitative Data Analysis and the Use of Theory
  • Comparative Case Study Research
  • Use of Qualitative Methods in Evaluation Studies
  • Writing Qualitative Dissertations
  • Ethnography in Early Childhood Education
  • A History of Qualitative Research in Education in China
  • Qualitative Research in the Field of Popular Education
  • Qualitative Methodological Considerations for Studying Undocumented Students in the United States
  • Culturally Responsive Evaluation as a Form of Critical Qualitative Inquiry
  • Participatory Action Research in Education
  • Complexity Theory as a Guide to Qualitative Methodology in Teacher Education
  • Observing Schools and Classrooms

Printed from Oxford Research Encyclopedias, Education. Under the terms of the licence agreement, an individual user may print out a single article for personal use (for details see Privacy Policy and Legal Notice).

date: 01 May 2024




Research Design: Qualitative, Quantitative, and Mixed Methods Approaches

  • John W. Creswell - Department of Family Medicine, University of Michigan
  • J. David Creswell - Carnegie Mellon University, USA

Supplements

“A long time ago, I participated in one of Dr. Creswell’s workshops on mixed methods research.... I am still learning from Dr. Creswell. I appreciate how he takes complex topics and makes them accessible to everyone. But I must caution my students that Dr. Creswell’s easygoing cadence and elegant descriptions sometimes mask the depth of the material. This reminds me of why he is such a highly respected researcher and teacher.”

“I always have enjoyed using Creswell's books (as a student and as an instructor) because the writing is straightforward.”

“This book is based around dissertation chapters, and that’s why I love using it in my class. Practical, concise, and to the point!”

“This book is easy to use. The information and additional charts are also helpful.”

Clear material, student support website, and faculty resources.

The book provides a comprehensive overview and does well at demystifying the research philosophy. I have recommended it to my level 7 students for their dissertation project.

This book will be added to next academic year's reading list.


Excellent introduction for research methods.

Creswell has always had excellent textbooks. Sixth Edition is no exception!

  • Fully updated for the 7th edition of the Publication Manual of the American Psychological Association.
  • More inclusive and supportive language throughout helps readers better see themselves in the research process.
  • Learning Objectives provide additional structure and clarity to the reading process.
  • The latest information on participatory research, evaluating literature for quality, using software to design literature maps, and additional statistical software types is newly included in this edition.
  • Chapter 4: Writing Strategies and Ethical Considerations now includes information on indigenous populations and data collection after IRB review.
  • An updated Chapter 8: Quantitative Methods now includes more foundational details, such as Type 1 and Type 2 errors and discussions of advantages and disadvantages of quantitative designs.
  • A restructured and revised Chapter 10: Mixed Methods Procedures brings state-of-the-art thinking to this increasingly popular approach.
  • Chapters 8, 9, and 10 now have parallel structures so readers can better compare and contrast each approach.
  • Reworked end-of-chapter exercises offer a more straightforward path to application for students.
  • New research examples throughout the text offer students contemporary studies for evaluation.
  • Current references and additional readings are included in this new edition.
  • Compares qualitative, quantitative, and mixed methods research in one book for unparalleled coverage.
  • Highly interdisciplinary examples make this book widely appealing to a broad range of courses and disciplines.
  • Ethical coverage throughout consistently reminds students to use good judgment and to be fair and unbiased in their research.
  • Writing exercises conclude each chapter so that readers can practice the principles learned in the chapter; if the reader completes all of the exercises, they will have a written plan for their scholarly study.
  • Numbered points provide checklists of each step in a process.
  • Annotated passages help reinforce the reader's comprehension of key research ideas.

Sample Materials & Chapters

Chapter 1: The Selection of a Research Approach

Chapter 2: Review of the Literature



Qualitative Research – Methods, Analysis Types and Guide


Qualitative Research

Qualitative research is a type of research methodology that focuses on exploring and understanding people’s beliefs, attitudes, behaviors, and experiences through the collection and analysis of non-numerical data. It seeks to answer research questions through the examination of subjective data, such as interviews, focus groups, observations, and textual analysis.

Qualitative research aims to uncover the meaning and significance of social phenomena, and it typically involves a more flexible and iterative approach to data collection and analysis compared to quantitative research. Qualitative research is often used in fields such as sociology, anthropology, psychology, and education.

Qualitative Research Methods


Qualitative Research Methods are as follows:

One-to-One Interviews

This method involves conducting an interview with a single participant to gain a detailed understanding of their experiences, attitudes, and beliefs. One-to-one interviews can be conducted in person, over the phone, or through video conferencing. The interviewer typically uses open-ended questions to encourage the participant to share their thoughts and feelings. One-to-one interviews are useful for gaining detailed insights into individual experiences.

Focus Groups

This method involves bringing together a group of people to discuss a specific topic in a structured setting. The focus group is led by a moderator who guides the discussion and encourages participants to share their thoughts and opinions. Focus groups are useful for generating ideas and insights, exploring social norms and attitudes, and understanding group dynamics.

Ethnographic Studies

This method involves immersing oneself in a culture or community to gain a deep understanding of its norms, beliefs, and practices. Ethnographic studies typically involve long-term fieldwork and observation, as well as interviews and document analysis. Ethnographic studies are useful for understanding the cultural context of social phenomena and for gaining a holistic understanding of complex social processes.

Text Analysis

This method involves analyzing written or spoken language to identify patterns and themes. Text analysis can be quantitative or qualitative. Qualitative text analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Text analysis is useful for understanding media messages, public discourse, and cultural trends.

Case Studies

This method involves an in-depth examination of a single person, group, or event to gain an understanding of complex phenomena. Case studies typically involve a combination of data collection methods, such as interviews, observations, and document analysis, to provide a comprehensive understanding of the case. Case studies are useful for exploring unique or rare cases, and for generating hypotheses for further research.

Process of Observation

This method involves systematically observing and recording behaviors and interactions in natural settings. The observer may take notes, use audio or video recordings, or use other methods to document what they see. This kind of observation is useful for understanding social interactions, cultural practices, and the context in which behaviors occur.

Record Keeping

This method involves keeping detailed records of observations, interviews, and other data collected during the research process. Record keeping is essential for ensuring the accuracy and reliability of the data, and for providing a basis for analysis and interpretation.

Surveys

This method involves collecting data from a large sample of participants through a structured questionnaire. Surveys can be conducted in person, over the phone, through mail, or online. Surveys are useful for collecting data on attitudes, beliefs, and behaviors, and for identifying patterns and trends in a population.

Qualitative data analysis is the process of turning unstructured data into meaningful insights. It involves extracting and organizing information from sources such as interviews, focus groups, and surveys. The goal is to understand people’s attitudes, behaviors, and motivations.

Qualitative Research Analysis Methods

Qualitative Research analysis methods involve a systematic approach to interpreting and making sense of the data collected in qualitative research. Here are some common qualitative data analysis methods:

Thematic Analysis

This method involves identifying patterns or themes in the data that are relevant to the research question. The researcher reviews the data, identifies keywords or phrases, and groups them into categories or themes. Thematic analysis is useful for identifying patterns across multiple data sources and for generating new insights into the research topic.
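
Thematic analysis is fundamentally interpretive, but the bookkeeping side of it — tagging excerpts with candidate themes — can be illustrated with a short sketch. The theme names, keyword lists, and excerpts below are invented for illustration; in practice, codes emerge from repeated reading of the data, not from a fixed keyword list.

```python
from collections import defaultdict

# Hypothetical keyword lists a researcher might jot down while reviewing data.
THEMES = {
    "belonging": ["lonely", "fit in", "belong", "isolated"],
    "finances": ["tuition", "loans", "afford", "work"],
}

def tag_excerpts(excerpts):
    """Group interview excerpts under any theme whose keywords they contain."""
    themed = defaultdict(list)
    for excerpt in excerpts:
        lowered = excerpt.lower()
        for theme, keywords in THEMES.items():
            if any(kw in lowered for kw in keywords):
                themed[theme].append(excerpt)
    return dict(themed)

excerpts = [
    "I felt isolated during my first semester.",
    "I had to work two jobs to afford tuition.",
]
print(tag_excerpts(excerpts))
```

Real thematic analysis is usually done by hand or in QDA software; a toy like this only shows the grouping step, not the interpretive work of deciding what counts as a theme.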

Content Analysis

This method involves analyzing the content of written or spoken language to identify key themes or concepts. Content analysis can be quantitative or qualitative. Qualitative content analysis involves close reading and interpretation of texts to identify recurring themes, concepts, and patterns. Content analysis is useful for identifying patterns in media messages, public discourse, and cultural trends.
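
The quantitative side of content analysis often comes down to counting how frequently terms from a coding scheme appear across a set of texts. A minimal sketch, with invented documents and terms:

```python
import re
from collections import Counter

def term_frequencies(documents, terms):
    """Count how often each term of interest appears across a set of texts.

    `terms` stands in for a researcher's coding scheme; the tokenizer here
    is deliberately simple (lowercase words only).
    """
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z']+", doc.lower())
        for term in terms:
            counts[term] += tokens.count(term)
    return counts

docs = ["Community matters. Community support helps.", "Support networks matter."]
print(term_frequencies(docs, ["community", "support"]))
```

Qualitative content analysis would go further, reading each occurrence in context rather than stopping at the counts.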

Discourse Analysis

This method involves analyzing language to understand how it constructs meaning and shapes social interactions. Discourse analysis can involve a variety of methods, such as conversation analysis, critical discourse analysis, and narrative analysis. Discourse analysis is useful for understanding how language shapes social interactions, cultural norms, and power relationships.

Grounded Theory Analysis

This method involves developing a theory or explanation based on the data collected. Grounded theory analysis starts with the data and uses an iterative process of coding and analysis to identify patterns and themes in the data. The theory or explanation that emerges is grounded in the data, rather than preconceived hypotheses. Grounded theory analysis is useful for understanding complex social phenomena and for generating new theoretical insights.

Narrative Analysis

This method involves analyzing the stories or narratives that participants share to gain insights into their experiences, attitudes, and beliefs. Narrative analysis can involve a variety of methods, such as structural analysis, thematic analysis, and discourse analysis. Narrative analysis is useful for understanding how individuals construct their identities, make sense of their experiences, and communicate their values and beliefs.

Phenomenological Analysis

This method involves analyzing how individuals make sense of their experiences and the meanings they attach to them. Phenomenological analysis typically involves in-depth interviews with participants to explore their experiences in detail. Phenomenological analysis is useful for understanding subjective experiences and for developing a rich understanding of human consciousness.

Comparative Analysis

This method involves comparing and contrasting data across different cases or groups to identify similarities and differences. Comparative analysis can be used to identify patterns or themes that are common across multiple cases, as well as to identify unique or distinctive features of individual cases. Comparative analysis is useful for understanding how social phenomena vary across different contexts and groups.
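
As a minimal illustration of the bookkeeping behind comparative analysis, the sketch below contrasts theme counts from two groups (the counts are invented). The interpretive work — explaining *why* the groups differ — is, of course, the actual analysis.

```python
# Invented theme counts for two hypothetical participant groups.
group_a = {"belonging": 9, "finances": 4}
group_b = {"belonging": 3, "finances": 8}

def compare(a, b):
    """Report per-theme count differences between two groups."""
    themes = sorted(set(a) | set(b))
    return {t: a.get(t, 0) - b.get(t, 0) for t in themes}

print(compare(group_a, group_b))  # positive = more common in group A
```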

Applications of Qualitative Research

Qualitative research has many applications across different fields and industries. Here are some examples of how qualitative research is used:

  • Market Research: Qualitative research is often used in market research to understand consumer attitudes, behaviors, and preferences. Researchers conduct focus groups and one-on-one interviews with consumers to gather insights into their experiences and perceptions of products and services.
  • Health Care: Qualitative research is used in health care to explore patient experiences and perspectives on health and illness. Researchers conduct in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education: Qualitative research is used in education to understand student experiences and to develop effective teaching strategies. Researchers conduct classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work: Qualitative research is used in social work to explore social problems and to develop interventions to address them. Researchers conduct in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology: Qualitative research is used in anthropology to understand different cultures and societies. Researchers conduct ethnographic studies and observe and interview members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology: Qualitative research is used in psychology to understand human behavior and mental processes. Researchers conduct in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy: Qualitative research is used in public policy to explore public attitudes and to inform policy decisions. Researchers conduct focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

How to Conduct Qualitative Research

Here are some general steps for conducting qualitative research:

  • Identify your research question: Qualitative research starts with a research question or set of questions that you want to explore. This question should be focused and specific, but also broad enough to allow for exploration and discovery.
  • Select your research design: There are different types of qualitative research designs, including ethnography, case study, grounded theory, and phenomenology. You should select a design that aligns with your research question and that will allow you to gather the data you need to answer your research question.
  • Recruit participants: Once you have your research question and design, you need to recruit participants. The number of participants you need will depend on your research design and the scope of your research. You can recruit participants through advertisements, social media, or through personal networks.
  • Collect data: There are different methods for collecting qualitative data, including interviews, focus groups, observation, and document analysis. You should select the method or methods that align with your research design and that will allow you to gather the data you need to answer your research question.
  • Analyze data: Once you have collected your data, you need to analyze it. This involves reviewing your data, identifying patterns and themes, and developing codes to organize your data. You can use different software programs to help you analyze your data, or you can do it manually.
  • Interpret data: Once you have analyzed your data, you need to interpret it. This involves making sense of the patterns and themes you have identified, and developing insights and conclusions that answer your research question. You should be guided by your research question and use your data to support your conclusions.
  • Communicate results: Once you have interpreted your data, you need to communicate your results. This can be done through academic papers, presentations, or reports. You should be clear and concise in your communication, and use examples and quotes from your data to support your findings.
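
The coding described in the "analyze data" step can be sketched programmatically. The codebook and coded segments below are invented for illustration; in practice this work is usually done by hand or in QDA software such as NVivo or ATLAS.ti, but the underlying tally is the same.

```python
from collections import Counter

# Hypothetical codebook a researcher might develop during analysis.
codebook = {"FIN": "financial strain", "SUP": "social support"}

# Coded transcript segments as (participant, code applied). Invented data.
coded_segments = [
    ("P1", "FIN"), ("P1", "SUP"), ("P2", "FIN"), ("P2", "FIN"),
]

def code_matrix(segments):
    """Tally how often each code was applied to each participant."""
    matrix = {}
    for participant, code in segments:
        matrix.setdefault(participant, Counter())[code] += 1
    return matrix

for participant, counts in sorted(code_matrix(coded_segments).items()):
    labels = {codebook[c]: n for c, n in counts.items()}
    print(participant, labels)
```

A matrix like this is a common intermediate artifact: it summarizes where codes cluster, which then guides the interpretive write-up.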

Examples of Qualitative Research

Here are some real-time examples of qualitative research:

  • Customer Feedback: A company may conduct qualitative research to understand the feedback and experiences of its customers. This may involve conducting focus groups or one-on-one interviews with customers to gather insights into their attitudes, behaviors, and preferences.
  • Healthcare: A healthcare provider may conduct qualitative research to explore patient experiences and perspectives on health and illness. This may involve conducting in-depth interviews with patients and their families to gather information on their experiences with different health care providers and treatments.
  • Education: An educational institution may conduct qualitative research to understand student experiences and to develop effective teaching strategies. This may involve conducting classroom observations and interviews with students and teachers to gather insights into classroom dynamics and instructional practices.
  • Social Work: A social worker may conduct qualitative research to explore social problems and to develop interventions to address them. This may involve conducting in-depth interviews with individuals and families to understand their experiences with poverty, discrimination, and other social problems.
  • Anthropology: An anthropologist may conduct qualitative research to understand different cultures and societies. This may involve conducting ethnographic studies and observing and interviewing members of different cultural groups to gain insights into their beliefs, practices, and social structures.
  • Psychology: A psychologist may conduct qualitative research to understand human behavior and mental processes. This may involve conducting in-depth interviews with individuals to explore their thoughts, feelings, and experiences.
  • Public Policy: A government agency or non-profit organization may conduct qualitative research to explore public attitudes and to inform policy decisions. This may involve conducting focus groups and one-on-one interviews with members of the public to gather insights into their perspectives on different policy issues.

Purpose of Qualitative Research

The purpose of qualitative research is to explore and understand the subjective experiences, behaviors, and perspectives of individuals or groups in a particular context. Unlike quantitative research, which focuses on numerical data and statistical analysis, qualitative research aims to provide in-depth, descriptive information that can help researchers develop insights and theories about complex social phenomena.

Qualitative research can serve multiple purposes, including:

  • Exploring new or emerging phenomena: Qualitative research can be useful for exploring new or emerging phenomena, such as new technologies or social trends. This type of research can help researchers develop a deeper understanding of these phenomena and identify potential areas for further study.
  • Understanding complex social phenomena: Qualitative research can be useful for exploring complex social phenomena, such as cultural beliefs, social norms, or political processes. This type of research can help researchers develop a more nuanced understanding of these phenomena and identify factors that may influence them.
  • Generating new theories or hypotheses: Qualitative research can be useful for generating new theories or hypotheses about social phenomena. By gathering rich, detailed data about individuals’ experiences and perspectives, researchers can develop insights that may challenge existing theories or lead to new lines of inquiry.
  • Providing context for quantitative data: Qualitative research can be useful for providing context for quantitative data. By gathering qualitative data alongside quantitative data, researchers can develop a more complete understanding of complex social phenomena and identify potential explanations for quantitative findings.

When to use Qualitative Research

Here are some situations where qualitative research may be appropriate:

  • Exploring a new area: If little is known about a particular topic, qualitative research can help to identify key issues, generate hypotheses, and develop new theories.
  • Understanding complex phenomena: Qualitative research can be used to investigate complex social, cultural, or organizational phenomena that are difficult to measure quantitatively.
  • Investigating subjective experiences: Qualitative research is particularly useful for investigating the subjective experiences of individuals or groups, such as their attitudes, beliefs, values, or emotions.
  • Conducting formative research: Qualitative research can be used in the early stages of a research project to develop research questions, identify potential research participants, and refine research methods.
  • Evaluating interventions or programs: Qualitative research can be used to evaluate the effectiveness of interventions or programs by collecting data on participants’ experiences, attitudes, and behaviors.

Characteristics of Qualitative Research

Qualitative research is characterized by several key features, including:

  • Focus on subjective experience: Qualitative research is concerned with understanding the subjective experiences, beliefs, and perspectives of individuals or groups in a particular context. Researchers aim to explore the meanings that people attach to their experiences and to understand the social and cultural factors that shape these meanings.
  • Use of open-ended questions: Qualitative research relies on open-ended questions that allow participants to provide detailed, in-depth responses. Researchers seek to elicit rich, descriptive data that can provide insights into participants’ experiences and perspectives.
  • Sampling based on purpose and diversity: Qualitative research often involves purposive sampling, in which participants are selected based on specific criteria related to the research question. Researchers may also seek to include participants with diverse experiences and perspectives to capture a range of viewpoints.
  • Data collection through multiple methods: Qualitative research typically involves the use of multiple data collection methods, such as in-depth interviews, focus groups, and observation. This allows researchers to gather rich, detailed data from multiple sources, which can provide a more complete picture of participants’ experiences and perspectives.
  • Inductive data analysis: Qualitative research relies on inductive data analysis, in which researchers develop theories and insights based on the data rather than testing pre-existing hypotheses. Researchers use coding and thematic analysis to identify patterns and themes in the data and to develop theories and explanations based on these patterns.
  • Emphasis on researcher reflexivity: Qualitative research recognizes the importance of the researcher’s role in shaping the research process and outcomes. Researchers are encouraged to reflect on their own biases and assumptions and to be transparent about their role in the research process.
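
The coding and thematic analysis described above can be sketched in code. The following is a minimal, hypothetical illustration (the excerpts and code labels are invented, and real coding is an interpretive act performed by the researcher, not by software): short interview excerpts are tagged with researcher-assigned codes, the codes are tallied across the corpus, and excerpts are grouped under each code so candidate themes can be reviewed alongside their supporting quotes.

```python
from collections import Counter, defaultdict

# Hypothetical interview excerpts, each tagged by the researcher with one
# or more codes during a first coding pass.
coded_excerpts = [
    ("I never felt like I belonged on campus", ["belonging", "isolation"]),
    ("My advisor assumed I knew how financial aid worked", ["institutional_knowledge"]),
    ("I worked thirty hours a week on top of classes", ["work", "time_pressure"]),
    ("Nobody in my family had been to college", ["first_generation", "institutional_knowledge"]),
]

# Tally how often each code appears across the corpus.
code_counts = Counter(code for _, codes in coded_excerpts for code in codes)

# Group excerpts under each code so candidate themes can be reviewed
# together with the quotes that support them.
excerpts_by_code = defaultdict(list)
for excerpt, codes in coded_excerpts:
    for code in codes:
        excerpts_by_code[code].append(excerpt)

for code, n in code_counts.most_common():
    print(f"{code}: {n} excerpt(s)")
```

In practice this bookkeeping is what qualitative data analysis software automates; the analytic judgment about which codes cohere into a theme remains with the researcher.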

Advantages of Qualitative Research

Qualitative research offers several advantages over other research methods, including:

  • Depth and detail: Qualitative research allows researchers to gather rich, detailed data that provides a deeper understanding of complex social phenomena. Through in-depth interviews, focus groups, and observation, researchers can gather detailed information about participants’ experiences and perspectives that may be missed by other research methods.
  • Flexibility: Qualitative research is a flexible approach that allows researchers to adapt their methods to the research question and context. Researchers can adjust their research methods in real time to gather more information or explore unexpected findings.
  • Contextual understanding: Qualitative research is well-suited to exploring the social and cultural context in which individuals or groups are situated. Researchers can gather information about cultural norms, social structures, and historical events that may influence participants’ experiences and perspectives.
  • Participant perspective: Qualitative research prioritizes the perspective of participants, allowing researchers to explore subjective experiences and understand the meanings that participants attach to their experiences.
  • Theory development: Qualitative research can contribute to the development of new theories and insights about complex social phenomena. By gathering rich, detailed data and using inductive data analysis, researchers can develop new theories and explanations that may challenge existing understandings.
  • Validity: Qualitative research can offer high validity by using multiple data collection methods, purposive and diverse sampling, and researcher reflexivity. This can help ensure that findings are credible and trustworthy.

Limitations of Qualitative Research

Qualitative research also has some limitations, including:

  • Subjectivity: Qualitative research relies on the subjective interpretation of researchers, which can introduce bias into the research process. The researcher’s perspective, beliefs, and experiences can influence the way data is collected, analyzed, and interpreted.
  • Limited generalizability: Qualitative research typically involves small, purposive samples that may not be representative of larger populations. This limits the generalizability of findings to other contexts or populations.
  • Time-consuming: Qualitative research can be a time-consuming process, requiring significant resources for data collection, analysis, and interpretation.
  • Resource-intensive: Qualitative research may require more resources than other research methods, including specialized training for researchers, specialized software for data analysis, and transcription services.
  • Limited reliability: Qualitative research may be less reliable than quantitative research, as it relies on the subjective interpretation of researchers. This can make it difficult to replicate findings or compare results across different studies.
  • Ethics and confidentiality: Qualitative research involves collecting sensitive information from participants, which raises ethical concerns about confidentiality and informed consent. Researchers must take care to protect the privacy and confidentiality of participants and obtain informed consent.

Qualitative study design: Methodologies

  • Phenomenology
  • Grounded theory
  • Ethnography
  • Narrative inquiry
  • Action research
  • Case Studies
  • Field research
  • Focus groups
  • Observation
  • Surveys & questionnaires

A methodology is the system of methods used in a discipline area and the justification for using a particular method in research.

Designing a Research Proposal in Qualitative Research

Md. Ismail Hossain, Nafiul Mehedi & Iftakhar Ahmad (first published 27 October 2022)

The chapter discusses designing a research proposal in qualitative research. The main objective is to outline the major components of a qualitative research proposal with example(s) so that the students and novice scholars easily get an understanding of a qualitative proposal. The chapter highlights the major components of a qualitative research proposal and discusses the steps involved in designing a proposal. In each step, an example is given with some essential tips. Following these steps and tips, a novice researcher can easily prepare a qualitative research proposal. Readers, especially undergraduate and master’s students, might use this as a guideline while preparing a thesis proposal. After reading this chapter, they can easily prepare a qualitative proposal.

Source: Hossain, M. I., Mehedi, N., & Ahmad, I. (2022). Designing a research proposal in qualitative research. In M. R. Islam, N. A. Khan, & R. Baikady (Eds.), Principles of Social Research Methodology. Springer, Singapore. https://doi.org/10.1007/978-981-19-5441-2_18

Research Design: Qualitative, Quantitative and Mixed Methods Approaches 4th Edition

SAGE Publications, Inc., 4th edition (January 1, 2014), 273 pages. ISBN-13: 978-1452226101.

About the author

John W. Creswell, PhD, is a Professor of Family Medicine and Senior Research Scientist of the Michigan Mixed Methods Program. He has authored numerous articles and 34 books on mixed methods research, qualitative research, and research design. While at the University of Nebraska–Lincoln, he held the Clifton Endowed Professor Chair, served as Director of the Mixed Methods Research Office, co-founded SAGE’s Journal of Mixed Methods Research, and was an Adjunct Professor of Family Medicine at the University of Michigan and a consultant to the Veterans Administration Health Services Research Center in Ann Arbor, Michigan. He was a Senior Fulbright Scholar to South Africa in 2008 and to Thailand in 2012. In 2011, he co-led a National Institutes of Health working group on the “best practices of mixed methods research in the health sciences,” served as a Visiting Professor at Harvard’s School of Public Health, and received an honorary doctorate from the University of Pretoria, South Africa. In 2014, he was the founding President of the Mixed Methods International Research Association. In 2015, he joined the staff of Family Medicine at the University of Michigan to co-direct the Michigan Mixed Methods Program. In 2017, he coauthored the American Psychological Association “standards” on qualitative and mixed methods research. The fourth edition of his book Qualitative Inquiry & Research Design won the 2018 McGuffey Longevity Award from the U.S. Textbook & Academic Authors Association. During the COVID-19 pandemic, he gave virtual keynote presentations to many countries from his office in Osaka, Japan. Updates on his work can be found on his website at johnwcreswell.com.

Writing Survey Questions

Perhaps the most important part of the survey process is the creation of questions that accurately measure the opinions, experiences and behaviors of the public. Accurate random sampling will be wasted if the information gathered is built on a shaky foundation of ambiguous or biased questions. Creating good measures involves both writing good questions and organizing them to form the questionnaire.

Questionnaire design is a multistage process that requires attention to many details at once. Designing the questionnaire is complicated because surveys can ask about topics in varying degrees of detail, questions can be asked in different ways, and questions asked earlier in a survey may influence how people respond to later questions. Researchers are also often interested in measuring change over time and therefore must be attentive to how opinions or behaviors have been measured in prior surveys.

Surveyors may conduct pilot tests or focus groups in the early stages of questionnaire development in order to better understand how people think about an issue or comprehend a question. Pretesting a survey is an essential step in the questionnaire design process to evaluate how people respond to the overall questionnaire and specific questions, especially when questions are being introduced for the first time.

For many years, surveyors approached questionnaire design as an art, but substantial research over the past forty years has demonstrated that there is a lot of science involved in crafting a good survey questionnaire. Here, we discuss the pitfalls and best practices of designing questionnaires.

Question development

There are several steps involved in developing a survey questionnaire. The first is identifying what topics will be covered in the survey. For Pew Research Center surveys, this involves thinking about what is happening in our nation and the world and what will be relevant to the public, policymakers and the media. We also track opinion on a variety of issues over time so we often ensure that we update these trends on a regular basis to better understand whether people’s opinions are changing.

At Pew Research Center, questionnaire development is a collaborative and iterative process where staff meet to discuss drafts of the questionnaire several times over the course of its development. We frequently test new survey questions ahead of time through qualitative research methods such as  focus groups , cognitive interviews, pretesting (often using an  online, opt-in sample ), or a combination of these approaches. Researchers use insights from this testing to refine questions before they are asked in a production survey, such as on the ATP.

Measuring change over time

Many surveyors want to track changes over time in people’s attitudes, opinions and behaviors. To measure change, questions are asked at two or more points in time. A cross-sectional design surveys different people in the same population at multiple points in time. A panel, such as the ATP, surveys the same people over time. However, it is common for the set of people in survey panels to change over time as new panelists are added and some prior panelists drop out. Many of the questions in Pew Research Center surveys have been asked in prior polls. Asking the same questions at different points in time allows us to report on changes in the overall views of the general public (or a subset of the public, such as registered voters, men or Black Americans), or what we call “trending the data”.

When measuring change over time, it is important to use the same question wording and to be sensitive to where the question is asked in the questionnaire to maintain a similar context as when the question was asked previously (see  question wording  and  question order  for further information). All of our survey reports include a topline questionnaire that provides the exact question wording and sequencing, along with results from the current survey and previous surveys in which we asked the question.

The Center’s transition from conducting U.S. surveys by live telephone interviewing to an online panel (around 2014 to 2020) complicated some opinion trends, but not others. Opinion trends that ask about sensitive topics (e.g., personal finances or attending religious services ) or that elicited volunteered answers (e.g., “neither” or “don’t know”) over the phone tended to show larger differences than other trends when shifting from phone polls to the online ATP. The Center adopted several strategies for coping with changes to data trends that may be related to this change in methodology. If there is evidence suggesting that a change in a trend stems from switching from phone to online measurement, Center reports flag that possibility for readers to try to head off confusion or erroneous conclusions.
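
As a toy illustration of “trending the data,” the sketch below computes the share of respondents choosing one answer in each of three survey waves. The question, years, and counts are invented for illustration; they are not Pew Research Center data.

```python
# Hypothetical topline counts for the same question asked in three waves.
waves = {
    "2016": {"favor": 480, "oppose": 410, "dont_know": 110},
    "2019": {"favor": 530, "oppose": 380, "dont_know": 90},
    "2022": {"favor": 610, "oppose": 330, "dont_know": 60},
}

def pct_favor(counts):
    """Percentage of all respondents in a wave answering 'favor'."""
    total = sum(counts.values())
    return round(100 * counts["favor"] / total)

# The trend line a topline report would show for this question.
trend = {year: pct_favor(counts) for year, counts in waves.items()}
print(trend)  # {'2016': 48, '2019': 53, '2022': 61}
```

Comparing these percentages across waves is only meaningful when wording, question order, and survey mode are held constant, which is exactly the point the surrounding text makes.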

Open- and closed-ended questions

One of the most significant decisions that can affect how people answer questions is whether the question is posed as an open-ended question, where respondents provide a response in their own words, or a closed-ended question, where they are asked to choose from a list of answer choices.

For example, in a poll conducted after the 2008 presidential election, people responded very differently to two versions of the question: “What one issue mattered most to you in deciding how you voted for president?” One was closed-ended and the other open-ended. In the closed-ended version, respondents were provided five options and could volunteer an option not on the list.

When explicitly offered the economy as a response, more than half of respondents (58%) chose this answer; only 35% of those who responded to the open-ended version volunteered the economy. Moreover, among those asked the closed-ended version, fewer than one-in-ten (8%) provided a response other than the five they were read. By contrast, fully 43% of those asked the open-ended version provided a response not listed in the closed-ended version of the question. All of the other issues were chosen at least slightly more often when explicitly offered in the closed-ended version than in the open-ended version. (Also see  “High Marks for the Campaign, a High Bar for Obama”  for more information.)

Researchers will sometimes conduct a pilot study using open-ended questions to discover which answers are most common. They will then develop closed-ended questions based off that pilot study that include the most common responses as answer choices. In this way, the questions may better reflect what the public is thinking, how they view a particular issue, or bring certain issues to light that the researchers may not have been aware of.
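
That pilot-to-questionnaire step can be sketched as follows. The open-ended responses and the cutoff of four listed choices are invented for illustration:

```python
from collections import Counter

# Hypothetical open-ended pilot responses, already normalized to short labels.
pilot_responses = [
    "the economy", "health care", "the economy", "immigration",
    "the economy", "education", "health care", "the economy",
    "taxes", "health care",
]

# The most common pilot answers become the closed-ended answer choices,
# with a catch-all option for everything else.
top_k = 4
answer_choices = [resp for resp, _ in Counter(pilot_responses).most_common(top_k)]
answer_choices.append("other (please specify)")
print(answer_choices)
```

The catch-all “other” option preserves the ability to capture responses the pilot study missed, mirroring the volunteered-option practice described in the election example above.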

When asking closed-ended questions, the choice of options provided, how each option is described, the number of response options offered, and the order in which options are read can all influence how people respond. One example of the impact of how categories are defined can be found in a Pew Research Center poll conducted in January 2002. When half of the sample was asked whether it was “more important for President Bush to focus on domestic policy or foreign policy,” 52% chose domestic policy while only 34% said foreign policy. When the category “foreign policy” was narrowed to a specific aspect – “the war on terrorism” – far more people chose it; only 33% chose domestic policy while 52% chose the war on terrorism.

In most circumstances, the number of answer choices should be kept to a relatively small number – just four or perhaps five at most – especially in telephone surveys. Psychological research indicates that people have a hard time keeping more than this number of choices in mind at one time. When the question is asking about an objective fact and/or demographics, such as the religious affiliation of the respondent, more categories can be used. In fact, they are encouraged to ensure inclusivity. For example, Pew Research Center’s standard religion questions include more than 12 different categories, beginning with the most common affiliations (Protestant and Catholic). Most respondents have no trouble with this question because they can expect to see their religious group within that list in a self-administered survey.

In addition to the number and choice of response options offered, the order of answer categories can influence how people respond to closed-ended questions. Research suggests that in telephone surveys respondents more frequently choose items heard later in a list (a “recency effect”), and in self-administered surveys, they tend to choose items at the top of the list (a “primacy effect”).

Because of concerns about the effects of category order on responses to closed-ended questions, many sets of response options in Pew Research Center’s surveys are programmed to be randomized so that the options are not presented in the same order to each respondent. Rotating or randomizing means that questions, or items in a list, are not asked in the same order for every respondent. Answers to questions are sometimes affected by questions that precede them; by presenting questions in a different order to each respondent, we ensure that each question is asked in the same context as every other question the same number of times (e.g., first, last or any position in between). This does not eliminate the potential impact of previous questions on the current question, but it does ensure that this bias is spread randomly across all of the questions or items in the list. For instance, in the example discussed above about what issue mattered most in people’s vote, the order of the five issues in the closed-ended version of the question was randomized so that no one issue appeared early or late in the list for all respondents.
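As a rough illustration of the per-respondent randomization described above, the sketch below shuffles a list of answer options independently for each respondent. The issue list and function name are illustrative, not taken from any actual Pew questionnaire software:

```python
import random

def randomized_options(options, rng=random):
    """Return an independently shuffled copy of the answer options
    for one respondent, leaving the master list untouched."""
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled

# Illustrative issue list (echoing the closed-ended example above).
issues = ["the economy", "the war in Iraq", "health care",
          "terrorism", "energy policy"]

# Each respondent sees the same five issues in an independent random
# order, so no issue is systematically early or late in the list.
for respondent_id in range(3):
    print(respondent_id, randomized_options(issues))
```

Because each respondent gets an independent shuffle, any primacy or recency bias is spread randomly across the options rather than concentrated on particular ones.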

Questions with ordinal response categories – those with an underlying order (e.g., excellent, good, only fair, poor OR very favorable, mostly favorable, mostly unfavorable, very unfavorable) – are generally not randomized because the order of the categories conveys important information to help respondents answer the question. Generally, these types of scales should be presented in order so respondents can easily place their responses along the continuum, but the order can be reversed for some respondents. For example, in one of Pew Research Center’s questions about abortion, half of the sample is asked whether abortion should be “legal in all cases, legal in most cases, illegal in most cases, illegal in all cases,” while the other half of the sample is asked the same question with the response categories read in reverse order, starting with “illegal in all cases.” Again, reversing the order does not eliminate the recency effect but distributes it randomly across the population.

Question wording

The choice of words and phrases in a question is critical in expressing the meaning and intent of the question to the respondent and ensuring that all respondents interpret the question the same way. Even small wording differences can substantially affect the answers people provide.


An example of a wording difference that had a significant impact on responses comes from a January 2003 Pew Research Center survey. When people were asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule,” 68% said they favored military action while 25% said they opposed it. However, when asked whether they would “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule even if it meant that U.S. forces might suffer thousands of casualties,” responses were dramatically different; only 43% said they favored military action, while 48% said they opposed it. The introduction of U.S. casualties altered the context of the question and influenced whether people favored or opposed military action in Iraq.

There has been a substantial amount of research to gauge the impact of different ways of asking questions and how to minimize differences in the way respondents interpret what is being asked. The issues related to question wording are more numerous than can be treated adequately in this short space, but below are a few of the important things to consider:

First, it is important to ask questions that are clear and specific and that each respondent will be able to answer. If a question is open-ended, it should be evident to respondents that they can answer in their own words and what type of response they should provide (an issue or problem, a month, number of days, etc.). Closed-ended questions should include all reasonable responses (i.e., the list of options is exhaustive), and the response categories should not overlap (i.e., response options should be mutually exclusive). Further, it is important to discern when it is best to use forced-choice closed-ended questions (often denoted with a radio button in online surveys) versus “select-all-that-apply” lists (or check-all boxes). A 2019 Center study found that forced-choice questions tend to yield more accurate responses, especially for sensitive questions. Based on that research, the Center generally avoids using select-all-that-apply questions.

It is also important to ask only one question at a time. Questions that ask respondents to evaluate more than one concept (known as double-barreled questions) – such as “How much confidence do you have in President Obama to handle domestic and foreign policy?” – are difficult for respondents to answer and often lead to responses that are difficult to interpret. In this example, it would be more effective to ask two separate questions, one about domestic policy and another about foreign policy.

In general, questions that use simple and concrete language are more easily understood by respondents. It is especially important to consider the education level of the survey population when thinking about how easy it will be for respondents to interpret and answer a question. Double negatives (e.g., do you favor or oppose  not  allowing gays and lesbians to legally marry) or unfamiliar abbreviations or jargon (e.g., ANWR instead of Arctic National Wildlife Refuge) can result in respondent confusion and should be avoided.

Similarly, it is important to consider whether certain words may be viewed as biased or potentially offensive to some respondents, as well as the emotional reaction that some words may provoke. For example, in a 2005 Pew Research Center survey, 51% of respondents said they favored “making it legal for doctors to give terminally ill patients the means to end their lives,” but only 44% said they favored “making it legal for doctors to assist terminally ill patients in committing suicide.” Although both versions of the question are asking about the same thing, the reaction of respondents was different. In another example, respondents have reacted differently to questions using the word “welfare” as opposed to the more generic “assistance to the poor.” Several experiments have shown that there is much greater public support for expanding “assistance to the poor” than for expanding “welfare.”

We often write two versions of a question and ask half of the survey sample one version of the question and the other half the second version. Thus, we say we have two  forms  of the questionnaire. Respondents are assigned randomly to receive either form, so we can assume that the two groups of respondents are essentially identical. On questions where two versions are used, significant differences in the answers between the two forms tell us that the difference is a result of the way we worded the two versions.
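Because respondents are assigned to the two forms at random, a simple two-proportion z-test is one conventional way to judge whether a difference between forms is larger than chance would produce. The sketch below is a minimal, self-contained implementation; the counts are invented for illustration and do not come from any actual Pew survey:

```python
from math import sqrt, erf

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Two-sided z-test for the difference between two independent
    proportions, e.g. the share favoring a policy under form A vs. form B."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)          # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF, computed via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative counts only: 510 of 1,000 respondents favor under
# form A, versus 440 of 1,000 under form B.
z, p = two_proportion_z(510, 1000, 440, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value here would indicate that the wording difference, not sampling noise, is the likely source of the gap between the two forms.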


One of the most common formats used in survey questions is the “agree-disagree” format. In this type of question, respondents are asked whether they agree or disagree with a particular statement. Research has shown that, compared with the better educated and better informed, less educated and less informed respondents have a greater tendency to agree with such statements. This is sometimes called an “acquiescence bias” (since some kinds of respondents are more likely to acquiesce to the assertion than are others). This behavior is even more pronounced when there’s an interviewer present, rather than when the survey is self-administered. A better practice is to offer respondents a choice between alternative statements. A Pew Research Center experiment with one of its routinely asked values questions illustrates the difference that question format can make. Not only does the forced choice format yield a very different result overall from the agree-disagree format, but the pattern of answers between respondents with more or less formal education also tends to be very different.

One other challenge in developing questionnaires is what is called “social desirability bias.” People have a natural tendency to want to be accepted and liked, and this may lead people to provide inaccurate answers to questions that deal with sensitive subjects. Research has shown that respondents understate alcohol and drug use, tax evasion and racial bias. They also may overstate church attendance, charitable contributions and the likelihood that they will vote in an election. Researchers attempt to account for this potential bias in crafting questions about these topics. For instance, when Pew Research Center surveys ask about past voting behavior, it is important to note that circumstances may have prevented the respondent from voting: “In the 2012 presidential election between Barack Obama and Mitt Romney, did things come up that kept you from voting, or did you happen to vote?” The choice of response options can also make it easier for people to be honest. For example, a question about church attendance might include three of six response options that indicate infrequent attendance. Research has also shown that social desirability bias can be greater when an interviewer is present (e.g., telephone and face-to-face surveys) than when respondents complete the survey themselves (e.g., paper and web surveys).

Lastly, because slight modifications in question wording can affect responses, identical question wording should be used when the intention is to compare results to those from earlier surveys. Similarly, because question wording and responses can vary based on the mode used to survey respondents, researchers should carefully evaluate the likely effects on trend measurements if a different survey mode will be used to assess change in opinion over time.

Question order

Once the survey questions are developed, particular attention should be paid to how they are ordered in the questionnaire. Surveyors must be attentive to how questions early in a questionnaire may have unintended effects on how respondents answer subsequent questions. Researchers have demonstrated that the order in which questions are asked can influence how people respond; earlier questions can unintentionally provide context for the questions that follow (these effects are called “order effects”).

One kind of order effect can be seen in responses to open-ended questions. Pew Research Center surveys generally ask open-ended questions about national problems, opinions about leaders and similar topics near the beginning of the questionnaire. If closed-ended questions that relate to the topic are placed before the open-ended question, respondents are much more likely to mention concepts or considerations raised in those earlier questions when responding to the open-ended question.

For closed-ended opinion questions, there are two main types of order effects: contrast effects (where the order results in greater differences in responses) and assimilation effects (where responses are more similar as a result of their order).


An example of a contrast effect can be seen in a Pew Research Center poll conducted in October 2003, a dozen years before same-sex marriage was legalized in the U.S. That poll found that people were more likely to favor allowing gays and lesbians to enter into legal agreements that give them the same rights as married couples when this question was asked after one about whether they favored or opposed allowing gays and lesbians to marry (45% favored legal agreements when asked after the marriage question, but only 37% favored them without the immediately preceding question about same-sex marriage). Responses to the question about same-sex marriage, meanwhile, were not significantly affected by its placement before or after the legal agreements question.


Another experiment embedded in a December 2008 Pew Research Center poll also produced a contrast effect. When people were asked “All in all, are you satisfied or dissatisfied with the way things are going in this country today?” immediately after having been asked “Do you approve or disapprove of the way George W. Bush is handling his job as president?”, 88% said they were dissatisfied, compared with only 78% without the context of the prior question.

Responses to presidential approval remained relatively unchanged whether national satisfaction was asked before or after it. A similar finding occurred in December 2004 when both satisfaction and presidential approval were much higher (57% were dissatisfied when Bush approval was asked first vs. 51% when general satisfaction was asked first).

Several studies also have shown that asking a more specific question before a more general question (e.g., asking about happiness with one’s marriage before asking about one’s overall happiness) can result in a contrast effect. Although some exceptions have been found, people tend to avoid redundancy by excluding the more specific question from the general rating.

Assimilation effects occur when responses to two questions are more consistent or closer together because of their placement in the questionnaire. We found an example of an assimilation effect in a Pew Research Center poll conducted in November 2008 when we asked whether Republican leaders should work with Obama or stand up to him on important issues and whether Democratic leaders should work with Republican leaders or stand up to them on important issues. People were more likely to say that Republican leaders should work with Obama when the question was preceded by the one asking what Democratic leaders should do in working with Republican leaders (81% vs. 66%). However, when people were first asked about Republican leaders working with Obama, fewer said that Democratic leaders should work with Republican leaders (71% vs. 82%).

The order questions are asked is of particular importance when tracking trends over time. As a result, care should be taken to ensure that the context is similar each time a question is asked. Modifying the context of the question could call into question any observed changes over time (see  measuring change over time  for more information).

A questionnaire, like a conversation, should be grouped by topic and unfold in a logical order. It is often helpful to begin the survey with simple questions that respondents will find interesting and engaging. Throughout the survey, an effort should be made to keep the survey interesting and not overburden respondents with several difficult questions right after one another. Demographic questions such as income, education or age should not be asked near the beginning of a survey unless they are needed to determine eligibility for the survey or for routing respondents through particular sections of the questionnaire. Even then, it is best to precede such items with more interesting and engaging questions. One virtue of survey panels like the ATP is that demographic questions usually only need to be asked once a year, not in each survey.



ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

Copyright 2024 Pew Research Center


Open access · Published: 27 April 2024

Exploring health care providers’ engagement in prevention and management of multidrug resistant Tuberculosis and its factors in Hadiya Zone health care facilities: qualitative study

  • Bereket Aberham Lajore
  • Yitagesu Habtu Aweke
  • Samuel Yohannes Ayanto
  • Menen Ayele

BMC Health Services Research, volume 24, Article number: 542 (2024)


Engagement of healthcare providers is one of the World Health Organization strategies devised for the prevention of multidrug-resistant tuberculosis and the provision of patient-centered care for it. The present research question arose from gaps in the evidence on health professionals’ engagement, and the factors shaping it, in delivering multidrug-resistant tuberculosis services as set out in the protocol for the prevention and management of the disease.

The purpose of this study was to explore the level of health care providers’ engagement in multidrug resistant tuberculosis prevention and management and influencing factors in Hadiya Zone health facilities, Southern Ethiopia.

A descriptive phenomenological qualitative study design was employed between 2 May and 9 May 2019. We conducted key informant interviews and focus group discussions with purposely selected healthcare experts working as directly observed treatment, short course (DOTS) providers in multidrug-resistant tuberculosis treatment initiation centers, program managers, and focal persons. Verbatim transcripts were translated into English and exported to Open Code 4.02 for line-by-line coding and categorization of meanings into emergent themes. Thematic analysis was conducted based on predefined themes for multidrug-resistant tuberculosis prevention and management, and the core findings under each theme were supported by domain summaries in our final interpretation of the results. To maintain rigor, Lincoln and Guba’s parallel quality criteria of trustworthiness were applied, particularly credibility, dependability, transferability, confirmability and reflexivity.

A total of 26 service providers, program managers, and focal persons participated through four focus group discussions and five key informant interviews. The study explored factors affecting the engagement of health care providers in the prevention and management of multidrug-resistant tuberculosis under five emergent themes: patients’ underlying causes, perceived susceptibility, seeking support, professional incompetence, and poor linkage among health care facilities. Our findings also suggest that service providers require additional training, particularly in the programmatic management of drug-resistant tuberculosis.

The study explored five emergent themes: patients’ underlying causes, seeking support, perceived susceptibility, professional incompetence, and poor linkage among health facilities. Health care providers are expected to create community awareness, using social and behavioral change communication strategies, and to support those with multidrug-resistant tuberculosis so that fear of discrimination is reduced. Furthermore, program managers need to follow the World Health Organization recommendations for engaging healthcare professionals in the prevention and management of multidrug-resistant tuberculosis and to cascade training in the clinical and programmatic management of the disease to healthcare professionals.


Introduction

Multidrug-resistant tuberculosis (MDR-TB) is caused by strains of Mycobacterium tuberculosis that are resistant to at least rifampicin and isoniazid. The disease can spread through direct infection, or it can develop secondary to improper management of tuberculosis among drug-susceptible cases and the associated poor adherence [ 1 ].

Multidrug-resistant strains of Mycobacterium tuberculosis have recently emerged, which makes achieving the “End TB Strategy” more difficult [ 2 ]. Multidrug-resistant tuberculosis (MDR-TB) increasingly poses a serious threat to the global and Ethiopian public health sectors. Although a number of risk factors for MDR-TB have been identified through various research designs, the epidemiology of this disease is complex, contextual, and multifaceted [ 1 ]. Quantitative studies demonstrate that prior treatment history [ 3 , 4 , 5 , 6 , 7 ], interrupted drug supply [ 8 ], inappropriate treatments and poor patient compliance [ 3 , 7 , 9 ], poor-quality directly observed treatment, short course (DOTS), poor treatment adherence [ 10 ], age [ 5 ], and malnutrition [ 11 ] are factors associated with multidrug-resistant TB.

Globally, an estimated 20% of previously treated cases and 3.3% of new cases are thought to have MDR-TB; these levels have remained essentially unchanged in recent years. Worldwide, 160,684 cases of multidrug-resistant TB and rifampicin-resistant TB (MDR/RR-TB) were notified in 2017, and 139,114 cases were enrolled into treatment that year [ 12 ]. A systematic review in Ethiopia reported a 2% prevalence of MDR-TB [ 3 ], higher than the 1.5% observed in Sub-Saharan Africa [ 13 ]. According to the national drug-resistant tuberculosis (DR-TB) sentinel report, the prevalence of MDR-TB was 2.3% among newly diagnosed TB cases and 17.8% among previously treated TB cases. This suggests a rising trend in the prevalence of TB drug resistance compared with the results of the initial drug-resistant TB survey carried out in Ethiopia from 2003 to 2005 [ 14 ].

Ethiopia has put strategies in place that emphasize political commitment, case finding, appropriate treatment, a continuous supply of high-quality second-line anti-TB medications, and a recording system. Due to other competing health priorities, the nation is having difficulty accelerating the scale-up of detection, enrollment and treatment of drug-resistant TB patients [ 15 , 16 ]. To address these issues, the nation switched from a hospital-based to a clinic-based ambulatory model of care, which has allowed MDR-TB services to decentralize quickly and become more accessible. Accordingly, the nation has designated health facilities to act as treatment initiation centers (TIC), treatment follow-up centers (TFC), or both, for improved referral and communication [ 15 ].

One of the key components of the “End TB strategy” is engagement of health care professionals in the prevention and management of multidrug resistant tuberculosis [ 17 ]. Inadequate engagement of healthcare providers is one aspect of the healthcare system that negatively influences MDR-TB prevention and control efforts [ 17 ]. This may be manifested in a number of ways, including inadequate understanding of drug-resistant tuberculosis, improper case identification, failure to initiate treatment again, placement of the wrong regimens, improper management of side effects and poor infection prevention [ 1 ]. These contributing factors are currently being observed in Ethiopia [ 18 ], Nigeria [ 7 , 19 , 20 ] and other countries [ 21 , 22 ]. According to a study conducted in Ethiopia, MDR-TB was linked to drug side effects from first-line treatments, being not directly observed, stopping treatment for at least a day, and retreating with a category II regimen [ 17 ].

This may be the result of a synergy between previously investigated factors and other contextual factors that have not yet been fully explored, such as professional engagement, beliefs, and poor preventive practices. The engagement of health professionals in MDR-TB prevention and control is assessed using a number of composite indicators. Health professionals may interact primarily inside healthcare facilities, but they typically also play a significant role in connecting healthcare services with neighborhood-based activities [ 17 ]. One of the main research areas that has not been sufficiently addressed is evidence on the status of healthcare professionals’ engagement and the contextual factors in MDR-TB prevention and management.

It is increasingly urgent to identify additional and existing factors operating in a particular context that contribute to the development of the disease in light of the epidemic of drug resistance, including multi-drug resistance (MDR-TB) and extensively drug resistant TB (XDR-TB) in both new and previously treated cases of the disease [ 23 ]. In order to develop and implement control measures, it is therefore essential to operationally identify a number of contextual factors operating at the individual, community, and health system level.

Therefore, the overall purpose of this study was to explore the level of engagement of health care providers and contextual factors hindering/enabling the prevention and provision of patient-centered care for MDR-TB in health facilities, DOTS services centers and MDR-TB treatment initiation center [TIC], in Hadiya Zone, Southern Ethiopia.

Qualitative approach and research paradigm

A descriptive phenomenological qualitative study design was employed to explore factors influencing the engagement of health professionals in MDR-TB prevention and management, and a thematic technique was used for the analysis of the data.

Researchers’ characteristics and reflexivity

Three principal investigators conducted this study. Two of them held Master of Public Health degrees in Epidemiology and Reproductive Health and were PhD candidates; the third held a Bachelor’s degree in public health, with clinical experience in tuberculosis prevention and management, and an MPH in Biostatistics. The principal investigators have research experience, with articles published in reputable journals. There was no prior contact between researchers and participants before the study; during data collection the researchers built positive rapport with participants to foster open communication and trust, and they held no assumptions or presuppositions about the research topic or its results.

Context/ study setting and period

The study was conducted between 2 and 9 May 2019 in Hadiya Zone, where more than 1.7 million people reside. The Zone has 300 health posts, 63 health centers, 3 functional primary hospitals and 1 comprehensive specialized hospital, as well as more than 350 private clinics and 1 private hospital. All public health facilities and some private health facilities provide directly observed treatment, short course (DOTS) services for tuberculosis patients, and there are more than eight treatment initiation centers (TICs) for MDR-TB patients in the Zone. MDR-TB treatment initiation centers are specialized facilities that provide comprehensive care, diagnosis and treatment initiation, psychosocial support, and follow-up services to individuals with MDR-TB. The linkage between treatment initiation centers and other healthcare facilities lies in the coordination of care, referral pathways, and collaboration to ensure comprehensive and integrated care. Overall, healthcare providers play a crucial role in the management of MDR-TB by providing specialized care, ensuring treatment adherence, monitoring progress and outcomes, and supporting individuals in achieving successful treatment and improved health.

Units of study and sampling strategy

Our study participants were health care professionals working in MDR-TB TICs in both private and public health facilities and providing DOTS services, MDR-TB program leaders in treatment initiation centers, TB focal persons, disease prevention and health promotion focal persons, and project partners from district health offices. The study involved four focus group discussions (FGDs) and five key informant interviews (KIIs), with a total of 26 participants. An expert purposive sampling technique was employed, and the sample size was determined by saturation of ideas during the data collection process.

Data collection methods and instruments

Focus group discussions and face-to-face key informant interviews were employed to collect the data. We conducted a total of four FGDs and five key informant interviews with participants chosen from DOTS-providing health facilities, MDR-TB program leaders in treatment initiation centers, TB focal persons, project partners from district health offices, and disease prevention and health promotion focal persons. One FGD was conducted among health professionals from the public MDR-TB treatment initiation centers; the other three were conducted among disease prevention and health promotion focal persons, TB focal persons, and DOTS providers in public health centers.

An observation checklist was developed to assess the general infection prevention and control measures used by the healthcare facilities in the study area. We used an unstructured FGD guide, a key informant interview guide, the observation checklist, and audio recorders to collect primary data, which were gathered in the local language, Amharic. Prior to data collection, three people who were not among the principal investigators, each with at least a master’s degree in public health and prior experience with qualitative research, were trained by the principal investigators. The three alternated the roles of moderator, note taker, and audio-recorder operator. FGDs lasted from 58 to 82 minutes, and key informant interviews from 38 to 56 minutes.

Data processing and data analysis

Memos were written immediately after the interviews, followed by initial analysis. The principal investigators transcribed the audio recordings. The recordings and notes were refined, cleaned, and matched at the end of each data collection day to check for inconsistencies, correct errors, and modify procedures in response to evolving findings for subsequent data collection. Transcribed interviews, memos, and observation notes were translated into English and imported into Open Code 4.02 [2] for line-by-line coding and categorization of important codes (sub-theming). Pre-defined themes for MDR-TB prevention and control engagement were used to thematize the line-by-line codes, categories, and meanings using thematic analysis. Finally, the phenomenon under study was explained through the emerging categories and themes, and explanations within themes were substantiated by participants' direct quotations where necessary.

Trustworthiness

When expressions in the audio recordings seemed unclear during transcription, clarification was requested from study participants by phone or in face-to-face briefings. To ensure the credibility of the study, we used prolonged engagement, peer debriefing with colleagues of similar status during data analysis, and member checking, inviting available study participants to review the findings and confirm whether they reflected their views. Memos from interviews and observations were cross-checked during transcription to ensure the credibility of the data and to triangulate the investigators' categorizing and theming procedures. For transferability, clear outlines of the research design and processes were provided, along with a detailed description of the study context for reader judgment. Dependability was ensured through careful recording and transcription of verbal and non-verbal data, and scientific procedures were followed at all research stages to minimize personal bias. Confirmability was maintained by conducting data transcription, translation, and interpretation systematically, and the researchers strove to present a range of realities fairly and faithfully. Finally, an expert was invited to map a sample of codes and categories to the corresponding emergent categories and themes.

Demographic characteristics of study participants

Four focus group discussions and five key informant interviews were conducted successfully, with a total of 26 participants. Participants' ages ranged from 20 to 50 years, with a mean of 33.4 (SD 6.24) years, and they had five to ten years of professional experience with DOTS services (Table 1).

Emergent themes and subthemes

The study explored the factors influencing health care providers' engagement in MDR-TB prevention and management and uncovered five major themes: patients' underlying causes, seeking support, perceived susceptibility, health care providers' incompetence, and poor linkage between health facilities. Weak community TB prevention, health system support, and support from colleagues were subthemes of health professionals' search for help, whereas socioeconomic constraints, lack of awareness, and fear of discrimination were subthemes of patients' underlying causes (Fig. 1).

Fig. 1 Themes and subthemes that emerged from the analysis of health professionals’ engagement in MDR-TB prevention and management in Hadiya zone health facilities, 2019

The patient’s underlying causes

This theme captures why TB/MDR-TB treatment providers believe health professionals are unable to provide standard MDR-TB services. Its subthemes are lack of TB/MDR-TB awareness, fear of discrimination, and patients' socioeconomic constraints.

Socioeconomic constraints

In our study, the majority of health care professionals who provided directly observed treatment, short-course services cited patients' socioeconomic constraints as barriers to engaging according to standards and providing MDR-TB prevention and management services. More than half of the participants stated that patients' primary difficulties included lack of money to rent housing close to the treatment centers, inability to afford food and other expenses, and insufficient funds for transportation.

In addition, patients may bear responsibility for providing food and covering other costs for their families. Most health care professionals believed that these constraints contributed to patients' poor engagement in MDR-TB prevention and management. One focus group discussant described the scenario as follows:

“…. I have many conversations with my TB/MDR-TB patients. They fail to complete DOTS or treatment intensive care primarily as a result of the requirement of prolonged family separation. They might provide most of the family needs, including food and other expenses” (FGD-P01).

Lack of awareness about MDR-TB

This subtheme explains how MDR-TB patients' limited knowledge of the illness can make it more difficult for health professionals to provide DOTS or TIC services. The majority of DOTS providers stated that few TB or MDR-TB patients understood how MDR-TB spreads, how it is treated, and how much medication is required. Additionally, despite having been educated about the disease, the majority of patients did not want to limit contact with their families or caregivers. A health care provider stated,

“…. I provided health education for MDR-TB patients on how the disease is transmitted and how they should care for their family members. They don’t, however, give a damn about their families.” (FGD-P05).

Some healthcare professionals reported that some patients thought that MDR-TB could not be cured by modern medication. One medical professional described the circumstance as follows:

“…. I noticed an MDR-TB patient who was unwilling to be screened. He concluded that modern medication is not effective and he went to spiritual and traditional healers” (FGD-P02).

As a result, almost all participants agreed that patients' knowledge of TB and MDR-TB strongly influences providers' engagement in MDR-TB services. The majority suggested that, in order to improve treatment outcomes and preventive measures, the media, community leaders, health development armies, one-to-five networks, non-governmental organizations, treatment supporters, and other bodies with access to information need to invest considerable effort.

Fear of discrimination

According to our research, about a quarter of health care professionals recognized that patients' fear of discrimination prevents providers from offering MDR-TB patients the DOTS services they need, including counseling of index cases and tracing of contact histories.

According to eight of the twenty-six health care professionals, health extension workers (HEWs), health development army (HDA) members, and one-to-five network members failed to monitor and counsel index cases after their return home. The patients then resumed routine social and political activities with neighbors while hiding their disease status. A health care professional described this situation as follows:

“…. I understood from my MDR-TB patient’s words that he kept to himself and avoided social interaction. He made this decision as a result of stigmatization by locals, including health extension workers. As a result, the patient can’t attend social gatherings. …. In addition, medical professionals exclude MDR-TB patients due to fear of exposures. As a result, patients are unwilling to undergo early screening” (FGD-P04).

Professionals’ perceived risk of occupational exposure

This theme highlights the anxiety that health care workers experience over MDR-TB exposure when providing patient care. Our research shows that the majority of health professionals viewed participation as “taking coupons of death.” They believed that, regardless of how and where they engaged, the risk of exposure would remain the same in most health care facilities. According to our discussions and interviews, lack of facility readiness accounted for the largest share of providers' perceived risk of exposure and susceptibility.

According to the majority of FGD discussants and key informant interviewees, participants' self-assessments, and our own observations, most health care facilities offering DOTS for DS-TB and MDR-TB did not establish or uphold infection prevention standards in a way that could promote better engagement. Problems included poor maintenance of care facilities, lack of personal protective equipment, facility designs unsuitable for service provision, patients' lack of knowledge about how MDR-TB is transmitted, and a lack of dedication on the part of health care staff.

One of our key informant interviewees [District Disease Prevention Head] attributed health professionals' low engagement to fear arising from perceived susceptibility. He shared what he had learned from a community forum he moderated.

A community forum participant stated that “… There was a moment a health professional ran away from the TB unit when an MDR-TB patient arrived. At least they must provide the necessary service, even though they are not willing to demonstrate a respectful, compassionate, or caring attitude to MDR-TB patients” (KII-P01). Besides, one of the FGD discussants described the circumstance as follows:

“…. Emm…. Because most health facilities or MDR-TB TICs are not standardized, I am concerned about the risk of transmission. They are crammed together, poor ventilation is evident, and their configuration is improper. Other medical services are causing the TICs to become overcrowded. Most patients and some medical professionals are unconcerned with disease prevention” (FGD-P19).

Participants' general fear of susceptibility may be a normal psychological reaction and may even motivate preventive action. However, almost all participants held that the main source of their fear was the improper application of programmatic MDR-TB management and treatment standards and infection prevention protocols in health care facilities.

Health care providers’ incompetence

This theme illustrates how professionalism and dedication affect participation in MDR-TB prevention and management. Health professionals' practice of DS-TB prevention and management was also considered, because it is a major factor in the development of MDR-TB. The theme includes participants' perspectives on other health care workers involved in and connected to MDR-TB care.

Nearly all participants were aware of the causes and danger signs of MDR-TB, and the definitions most of them gave fit the current guidelines. However, focus group and key informant interview participants raised shortcomings in MDR-TB service delivery practice and attitudes. We examined gaps in health care professionals' knowledge, their use of the national recommendations for programmatic management and prevention of MDR-TB, their infection prevention practices, their participation in community MDR-TB screenings, and their collaboration with other health care professionals.

More than half of the participants voiced concerns about their attitudes and skill sets in using the MDR-TB prevention and management guideline. Asked about his prior experience, one focus group participant said:

“…. Ok, let me tell you my experience. I was new before I attended a training on MDR-TB. I was unfamiliar with the MDR-TB definition given in the recommendations. When I was hired, the health center’s director assigned me to the TB unit. I faced difficulties until I received training” (FGD-P24). Furthermore, one of the key informant interview participants shared a story: “…. In my experience, the majority of newly graduated health professionals lack the required skills. I propose that pre-service education curricula include TB/MDR-TB prevention and management guideline trainings” (KII-P01).

The majority of participants mentioned skill gaps observed among health extension workers and laboratory technicians in most health care facilities. Some of the in-depth interview and FGD participants described the gaps as follows:

“…. According to repeated quality assurance feedback, there are many discordant cases in our district. Laboratory technicians who receive a discrepant result are not given training” (KII-P01, District TB Focal Person). This was echoed by an FGD discussant: “According to the quality assurance system, laboratory technicians lack skill and inconsistent results are typical, necessitating training for newly joining laboratory technicians” (FGD-P20).

Through our discussions, we explored DOTS providers' adherence to the current TB/MDR-TB guideline. In doing so, the majority of participants pointed out ineffective anti-TB management and follow-up care. One participant recalled her practical experience as follows:

“…. In my experience, the majority of health professionals fail to inform patients about the drug’s side effects, follow-up procedures, and other techniques for managing the burden of treatment. Only the anti-TB drug is provided, and the patient is left alone. The national treatment recommendation is not properly implemented by them” (FGD-P04).

Many barriers were cited as hindering the competencies needed for better engagement of health professionals. A shortage of training was one of the major reasons mentioned by many study participants. One discussant from a private health facility described the problem as follows:

“…. We are incompetent, in my opinion, considering that we don’t attend update trainings. Many patients who were diagnosed negative at private medical facilities turned out to be positive, and vice versa, which is risky for drug resistance” (FGD-P14). This was supported by a participant in our in-depth interviews: “…. We [program managers] are running short of training for our health care providers at different health centers. Four out of every five health care professionals who work in various health centers are unaware of the new TB/MDR-TB guideline” (KII-P02).

Seeking support

This theme focuses on the significance and effects of workplace support on engagement in MDR-TB prevention and control, and on the enabling and impeding elements of health professionals' engagement. Three elements make up the theme: support from colleagues (other health professionals) in the workplace, support from community TB prevention actors, and support from the health system.

Support from community TB prevention actors

This subtheme covers the assistance provided to study participants by key parties such as community leaders, the health development army, and other stakeholders involved in community-based TB case notification, treatment adherence support, and improvement of patient outcomes.

Many study participants reported that health extension workers participated poorly in MDR-TB- and TB-related community-based activities such as contact tracing, defaulter tracing, community forums, health promotion, and treatment support. One study participant described the gap as follows:

“…. I understood that people in the community were unaware of MDR-TB. The majority of health extension workers do not prioritize raising community awareness of MDR-TB” (FGD-P13). A district disease prevention head supported this view: “…. There is no active system for contact tracing. Health educators send us information if they find suspected cases. However, some patients might not show up as expected. We have data on three family members who tested positive for MDR-TB” (KII-P3).

Support from a health system

This subtheme focuses on the support that DOTS providers require from the current health system for better engagement. All study participants identified at least two needs that the health system must meet for them to participate effectively in MDR-TB prevention, treatment, and management, and all agreed that health system problems negatively affected their engagement in the prevention, diagnosis, treatment, and management of MDR-TB in almost all health care facilities. Barriers to be addressed include poor infrastructure, inadequate resources (supplies, equipment, guidelines, and other logistics), insufficient capacity building (training), weak supportive supervision, lack of public-private partnerships, and failure to assign motivated and trained health professionals. One participant described the supply and logistics problems as follows:

“…. The health center I worked in is listed as a DOTS provider. However, it lacks constant electricity, a working microscope, lab supplies, medications, etc., and we refer suspected cases to nearby health centers or district hospitals for AFB examination. Sometimes we use a single kit for many patients and wait three or more weeks for the medication supply, and patients stop a course of therapy, which might induce drug resistance” (FGD-PI04). This was augmented by an FGD participant who works at a treatment initiation center: “…. We face a critical shortage of supplies, and hospital administrators don’t care about funding essential supplies for patient care. For instance, this hospital (the hospital in which this FGD was conducted) can easily handle N-95 masks. Why then can’t they (hospital administrators working in some TICs) do it?” (FGD-P18).

Regarding in-service training on MDR-TB, almost all participants pointed out a shortage of on-the-job training mechanisms. One of our FGD participants said:

“…. I missed the new training on MDR-TB programmatic management guidelines. I’ve heard that new updates are available. I still work using the old standard” (FGD-PI05). A health professional working in a private clinic highlighted the severity of the training shortage: “…. We have not participated in TB/MDR-TB guidelines training. You know, most for-profit health care facilities do not provide any training for their staff. I’m not sure if I’m following the (TB/MDR-TB) guideline” (FGD-P14). One of our key informant interview participants, an MDR-TB center focal person, suggested the need for training: “…. I’ve received training on the MDR-TB services and public-private partnership strategy. It was crucial, in my opinion, for better engagement. It is provided for our staff [MDR-TB center focal person]. However, this has not yet been expanded to other health facilities” (KII-P04).

Concerning infrastructure, transportation problems were among the obstacles most frequently mentioned by participants as hindering engagement in TB/MDR-TB services. This factor negatively affected both health professionals and patients. One discussant said:

“…. I face obstacles such as transport cost to perform effective TB/MDR-TB outreach activities like health education, tracing family contacts and defaulters, and community mobilization. Rural kebeles are far apart from each other. How can I support 6 rural kebeles?” (FGD-P01). A participant who supervises MDR-TB treatment centers as a program partner seconded this idea: “…. I suggest the government must establish a system to support health professionals working in remote health care facilities in addition to MDR-TB centers. I guess there are more than 30 government health centers and additional private clinics. We can’t reach them all due to transportation challenges” (KII-P05). A district disease prevention head added: “…. Our laboratory technicians take samples from MDR-TB suspects to the post office, and then the post office sends them to the MDR-TB site. Sometimes, feedback may not arrive timely. There is no system to cover transportation costs. That makes case detection challenging” (FGD-P02).

Support from colleagues

Study participants stressed the importance of having coworkers with whom they could connect. However, eight participants reported being discriminated against by their workmates for various reasons, including coworkers' fear of exposure to infection and the perception that health professionals working in the TB/MDR-TB unit receive more training opportunities and other incentives. One focus group discussant said:

“…. My colleagues [health professionals working outside MDR-TB TICs] stigmatize us only due to our work assignment in the MDR-TB clinic. I remember that one of my friends who borrowed my headscarf preferred to throw it through a window rather than handing it back safely. Look how scared other health professionals are of working in the MDR-TB unit. This makes me very upset. I ask myself, why have I received such training on MDR-TB?” (FGD-P04).

Some participants also perceived that health professionals working in the MDR-TB/TB unit were treated as the only experts responsible for MDR-TB care and treatment, because other health professionals regard the training as an incentive tied to working in such units. One FGD discussant described:

“… Health professionals who work in other service units are not willing to provide DOTS if the TB focal person or previously trained staff are not available. Patients wait for a longer time” (FGD-P11).

Health facilities’ poor linkage

This theme describes how the various health care facilities, private and public, such as health posts, health centers, and hospitals, and the health care professionals working at different levels of the health system, are linked and communicate with one another in relation to TB/MDR-TB services.

Many study participants noted a lack of coordination between higher referral hospitals, TB clinics, health posts, and health centers. Additionally, the majority of the assigned health care professionals had trouble communicating with patients and their coworkers. A focus group discussant supported this idea:

“…. There is a lack of communication between us [DOTS providers at treatment initiation centers] and health posts, health centers, and private clinics. We are expected to support about 30 public health facilities. It’s of too much number, you know. They are out of our reach. We only took action when a problem arose” (FGD-P16).

A significant number of participants raised the problem of poor communication between health facilities and treatment initiation centers. One interviewee [program manager] said:

“…. I see that one of our challenges is the weak referral connections between treatment initiation centers and health centers. As a result, improper sample transfer to GeneXpert sites and irregular postal delivery are frequent. Our DOTS staff at the MDR-TB center, DOTS staff at the health centers, and health extension workers are not well connected to one another. Many patients I encountered came to this center [MDR-TB center] after bypassing both the health post and the health center. Poor linkage and communication, in my opinion, could be one of the causes. The same holds true for public and private medical facilities” (KII-P02).

Discussion

Engagement of individual health care providers is one of the key interventions for achieving universal access to drug-resistant tuberculosis care and services [17]. Providers' engagement in detecting, treating, and caring for multidrug-resistant tuberculosis (MDR-TB) may be influenced by various intrinsic (individual provider) and extrinsic (peer, health system, political, and other) factors [15]. Our study explored the engagement of individual DOTS providers and the factors influencing their engagement in MDR-TB prevention and management services, addressed through the five emergent themes and subthemes specified in our results section.

The findings showed that patients' socioeconomic constraints were important challenges influencing health professionals' engagement and their provision of MDR-TB prevention and management services. Although the approaches differ, studies in Ethiopia [24], South Africa [25], and India [26, 27] have documented that such factors influence providers' engagement in the prevention and management of multidrug-resistant tuberculosis. Alleviating these factors demands effort from patients, stakeholders working on TB, other sectors, and the health system, so that providers can deliver services more effectively in their day-to-day activities and be more responsive to the other key factors.

We explored participants' experiences of how patients' awareness of drug-sensitive or multidrug-resistant tuberculosis influenced their engagement. Participants encountered numerous gaps that restricted their interactions with TB/MDR-TB patients. Although our study design and purposes differ, other studies [28, 29, 30] have indicated that patients' awareness influences providers' decisions about MDR-TB services and the care that MDR-TB patients receive. To our knowledge, it had not previously been documented whether patients' fear of discrimination directly reduces providers' engagement. Patients' awareness creation is therefore an important responsibility for the community health development army, health extension workers, all other health care providers, and stakeholders seeking better MDR-TB services and patient outcomes.

Our study indicates that health care providers perceived that they would be exposed to MDR-TB while engaged. Some participants were most concerned about the disadvantages of engaging in MDR-TB patient care, predominantly psychological and physical pressure. In this context, participants emphasized that engagement in MDR-TB patient care meant “always being at risk” and expressed a negative attitude. This finding is similar to a cross-sectional study conducted in South Africa, in which the majority of health care providers believed their engagement in MDR-TB services would put their health at risk [21].

Moreover, the majority of health care providers' perceived fear of exposure stemmed mainly from poor infection prevention practices and the substandard organization of the work environment in most TB/MDR-TB units. This fear is essentially reasonable, and urgent intervention is needed to protect health care providers and to prevent further erosion of their effective engagement in MDR-TB patient care. On the other side of the coin, perceived risk of occupational exposure could motivate self-protection that helps combat the spread of infection.

In our study, health care providers' capability (competence) also affected their engagement in the prevention and management of MDR-TB. Participants frequently raised their own and other providers' skill gaps, negative attitudes toward the service unit in which they worked, ineffective use of the MDR-TB guideline, and poor infection prevention practices and commitment. In addition, many health professionals reported serious problems with case identification and screening, drug administration, and side effect management. These findings are supported by other studies in Ethiopia [7] and Nigeria [19, 20], implying an urgent need to train health care workers in the prevention and management of multidrug-resistant TB.

Moreover, our findings provide insights into the roles of community TB prevention actors, the current health system, and colleagues and other stakeholders in health care providers' engagement. Participants emphasized that support from community TB prevention actors is a key motivation for effective engagement in MDR-TB management and prevention. Evidence shows that community TB prevention is one of the prominent interventions study participants expect in DOTS provision, as the community is the closest source of information about patients [31, 32].

Similarly, all participants pointed out that support from the health system directly or indirectly influences their engagement in the prevention, diagnosis, treatment, and management of MDR-TB. Research indicates that health system support is an enabling factor in providers' decision making about TB/MDR-TB prevention and treatment [33], and this problem has been documented in a study from Ethiopia [22]. In addition, support from colleagues and other stakeholders was a felt need, consistent with the World Health Organization guideline, which holds that engagement in preventing MDR-TB and providing patient-centered care requires a collaborative endeavor among health care providers, patients, and other stakeholders [17].

Participants indicated poor linkage among and within DOTS providers working in health posts (extension workers), health centers, hospitals, and MDR-TB treatment initiation centers. This is consistent with research in South Africa showing that poor health care attitudes are linked to poor treatment adherence [34]. Our findings imply a need for further familiarization, especially with the clinical programmatic management of drug-resistant tuberculosis, and program managers need to follow the health professional engagement approaches recommended by the World Health Organization's End TB Strategy [17].

Limitations of the study

Some limitations must be explicitly acknowledged. First, participants from private health facilities were very few, which may have restricted the acquisition and incorporation of perspectives from private-sector health care providers. Second, providers' engagement was not assessed from the patient side, and the factors influencing engagement may differ from those reported by the providers. Third, power relationships, especially among focus group discussants in MDR-TB treatment initiation centers, might have discouraged open disclosure of some sensitive issues.

Conclusions

The study showed how health care providers' engagement in MDR-TB management and prevention was influenced. Patients' underlying causes, seeking support, perceived occupational exposure, health care providers' incompetence, and poor linkage between health facilities were identified in the analysis, with weak community TB prevention efforts, poor health system support, and limited support from colleagues among the specific factors influencing engagement in MDR-TB prevention and management. Measures are therefore needed to avert the observed obstacles to health professionals' engagement, along with further quantitative studies to determine the effects of the identified and potential factors on engagement.

Furthermore, our findings point to the need for additional training of service providers, particularly in the clinical programmatic management of drug-resistant tuberculosis. Program managers must also adhere to the World Health Organization's recommendations for health professional engagement, and higher officials in the health sector need to strengthen the linkage between health facilities and service providers at different levels. Health experts should also build community awareness, through social and behavioral change communication activities, to counter fear of discrimination and to provide support for those with MDR-TB.

Abbreviations

DOTS: Directly observed treatment, short-course

DS-TB: Drug-susceptible tuberculosis

MDG: Millennium Development Goals

MDR-TB: Multidrug-resistant tuberculosis

SDG: Sustainable Development Goals

TB: Tuberculosis

TIC: Treatment initiation center

WHO: World Health Organization

XDR-TB: Extensively drug-resistant tuberculosis


Acknowledgements

We would like to acknowledge the Hosanna College of Health Sciences Research and Community Service Directorate for providing the opportunity and the funding to conduct this research. Our appreciation also goes to the heads of the various health centers, hospitals, district health offices, and the Hadiya Zone Health Office for their unreserved cooperation throughout data collection.

The authors declare that this study received funding from Hosanna College of Health Sciences. The funder was not involved in the study design; the collection, analysis, or interpretation of data; the writing of this article; or the decision to submit it for publication.

Author information

Bereket Aberham Lajore & Menen Ayele

Present address: Hossana College of Health Sciences, Hosanna, SNNPR, Ethiopia

Yitagesu Habtu Aweke

Present address: College of Health Sciences, School of Public Health, Addis Ababa University, Addis Ababa, Ethiopia

Samuel Yohannes Ayanto

Present address: College of Health Sciences, Institute of Public Health, Department of Population and Family Health, Jimma University, Jimma, Ethiopia

Bereket Aberham Lajore, Yitagesu Habtu Aweke and Samuel Yohannes Ayanto contributed equally to this work.

Authors and Affiliations

Department of Family Health, Hossana College of Health Sciences, Hossana, Ethiopia

Bereket Aberham Lajore

Department of Health Informatics, Hossana College of Health Sciences, Hossana, Ethiopia

Department of Midwifery, Hossana College of Health Sciences, Hossana, Ethiopia

Department of Clinical Nursing, Hossana College of Health Sciences, Hossana, Ethiopia

Menen Ayele


Contributions

Bereket Aberham Lajore, Yitagesu Habtu Aweke, and Samuel Yohannes Ayanto conceived the idea, wrote the proposal, participated in data management, analyzed the data, and drafted and revised the paper. Menen Ayele revised and approved the proposal and revised the analysis and subsequent drafts of the paper. Yitagesu Habtu Aweke and Bereket Aberham Lajore wrote the main manuscript text and prepared all tables. All authors reviewed and approved the final manuscript.

Corresponding author

Correspondence to Bereket Aberham Lajore.

Ethics declarations

All methods were carried out, and all contents reported, in accordance with the standards for reporting qualitative research.

Ethics approval and consent to participate

Ethical approval was obtained from the Institutional Review Board (IRB) of Hossana College of Health Sciences, which reviewed the protocol for ethical issues and provided a formal letter of permission to the concerned bodies in the health system. Accordingly, permission to conduct this study was granted by the respective health facilities in Hadiya Zone. Confidentiality of the information was assured, and participants' autonomy not to participate or to opt out at any stage of the interview was respected. Finally, informed consent was obtained from the study participants after they had received detailed information.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Lajore, B.A., Aweke, Y.H., Ayanto, S.Y. et al. Exploring health care providers' engagement in prevention and management of multidrug resistant Tuberculosis and its factors in Hadiya Zone health care facilities: qualitative study. BMC Health Serv Res 24, 542 (2024). https://doi.org/10.1186/s12913-024-10911-6


Received: 27 February 2023

Accepted: 27 March 2024

Published: 27 April 2024

DOI: https://doi.org/10.1186/s12913-024-10911-6


Keywords: Healthcare providers

BMC Health Services Research

ISSN: 1472-6963



How Will You Know If Your Market Research Methods Are Outdated?

Forbes Agency Council


Founder and CEO of market research consultancy Alter Agents; believer that powerful insights can change businesses.

If you’re still focusing on the brand, and not the consumers you want to reach, that’s your first clue that you may need to reimagine your insights approach. My company's research has shown that just about half of consumers start their purchase journey feeling ambivalent about brand, and it is low on their list of considerations. They’re making decisions that are grounded in their personal needs at that moment, influenced by a wide variety of contextual circumstances. So why do researchers continue to ask so many questions about brand?

Part of it is just plain habit. We "grew up" asking questions that related back to the pillars of brand awareness, consideration and preference. While this model has been updated slightly to reflect the massive changes in the consumer landscape, all most have done is simply tart up old ways of doing things. There’s a desire to make sense of the shopping landscape by using neat, familiar categories, but as researchers, we really need to start from scratch with our approach.

It's Time To Shift Our Thinking

Nothing is the same as it used to be. For example, our surveys used to show that shoppers were focused on price, and somewhat on brand consideration. However, back then, shoppers didn't have as many choices as they do now: They went to the store, saw a set number of brands on the shelf and had to choose among them. When choices are restricted, our stories and our possibilities become quite narrow. But in today’s economy, things have changed. Unlimited options, a huge abundance of information, a growing number of purchase channels and even personal values now come into play.

Think about the massive direct-to-consumer market, which some say will make up 50% of all sales within the next three years. And that’s just one example. Social media platforms like Instagram, TikTok, YouTube and others now have e-commerce initiatives, some with native payment solutions. Predicting what brands and shopping channels will emerge over the next decade is impossible. But we know shoppers are becoming more accustomed to the new and unorthodox. We're seeing them adapt to new channels and technologies in real time. And this just touches the surface of all the changes occurring in the consumer and shopping ecosystem. Shoppers are becoming more and more promiscuous with all of these choices.


We need to start shifting our thinking as researchers by asking ourselves the question: Is my brand narcissistic? Many of us have a visceral negative reaction to the concept of narcissism—ever since we heard the myth of Narcissus peering into the pool of water, becoming entranced by his own reflection and ultimately perishing. But here, I'm not talking about anything quite so drastic as fatal self-love … or am I? Focusing on the brand in market research can be fatal to success.

Reimagining Your Approach To Insights

So how do you update your research methods for today’s consumer? Here are some good practices to implement:

• Prioritize the consumer: This is about how you ask questions and how you approach the research problem as a whole. If you run a tracker or consumer satisfaction surveys, take a look at whether your metrics are inherently narcissistic. Always ask yourself whether the data you seek is focused on your shoppers—their likes, wants, needs, barriers, motivations, circumstances and values—and less on your brand. Everything you do should feed into your understanding of consumer and shopper needs.

• Use a mix of research methods: Consumers are complex. The shopping landscape is complex. A single methodology, like a survey, is usually not enough to build understanding. Consider employing multimodal studies that blend quantitative data, neuroscience and qualitative interviews to tackle the questions you need answered. The more methods you employ, the better your understanding will be.

• Adapt research practices for the digital age: Just like we need to evolve away from narcissistic models, we must also embrace emerging technologies and online platforms to gather insights in new ways. For example, reevaluate the necessity of in-person research post-pandemic and explore innovative qualitative alternatives to maximize efficiency and expand sample sizes while preserving the depth of understanding.

When insights no longer positively affect business outcomes, our jobs as researchers become obsolete. Staying a step ahead of your consumer audience by seeking a deep understanding of them—not your brand—can give you powerful insights that engage stakeholders and raise the value of research programs across organizations. As my company's chief strategy officer and I wrote in our book, your insights can ensure that “the voice of the shopper is not the last place brands come to but the first.”

Forbes Agency Council is an invitation-only community for executives in successful public relations, media strategy, creative and advertising agencies. Do I qualify?

Rebecca Brooks

