Make the future

For more than 40 years, IDEO has helped the world's leading organizations make the future. Find out how design can set you apart, help you grow, and solve your toughest challenges.

Breakthrough Products

We design magical products, brands, services, and experiences that people truly value.

Strategic Futures

We make your strategy tangible by building visions your teams can see, feel, and believe in.

Creative Capabilities

We build critical skills that help you lead organizational transformation, build resilience, and create change.

From Barbie Playhouses to Sonic Jets


From New Venture to Population-Level Impact

An IDEO project grows into a healthcare provider fighting chronic disease.


Reinventing Supersonic Flight With Boom

Ultrafast air travel is coming back, with an all-new passenger experience.


Changing the Conversation on Menopause

How QVC is reframing midlife as a time for women to thrive.

From electrified fleets to plastic-free packaging


Activating Strategy

Human-centered approaches for bringing strategy to life


Realizing a Waste Free Future Together

A coalition of retail leaders join forces to address single-use plastic-bag waste.


Planetary Protection at a Human Scale

A conservation leader rewrites the role humans play in the narrative of climate change.

From catalytic mindsets to innovation labs


Leading for Creativity

Learn to scale creative problem solving from Tim Brown, Chair of IDEO.


Helping Students Navigate Campus Life at NYU

How an organizational culture change is helping newbies at NYU navigate campus life.


Building an Accelerator for Climate Leaders

An innovation accelerator bolsters the impact of climate startups.


Get updates from IDEO

Shape the Future with IDEO

Inspire, inform, and co-create with IDEO, the world’s leading design firm. We tackle challenges across the globe, spanning diverse topics and communities. From designing waste out of the food system to simplifying prescription home delivery, your voice, ideas, feedback, and partnership are central to having a lasting positive impact.


Share more information about yourself so we can tailor our recommendations. You'll receive customized invitations to apply to design research opportunities.

View Opportunities

Browse public opportunities you can apply to directly. You can also see examples of past opportunities to understand what our opportunities and application process really look like.

OPPORTUNITIES

Read through our frequently asked questions to understand the basics and particulars of participating in design research with IDEO, and get easy access to our team for support.


“Working with IDEO gave me the opportunity to think creatively about some of the everyday challenges I face as a clinician. Doing it collaboratively with their team of smart creative professionals was not only satisfying but really fun! It was exciting to be part of developing a solution for the problems my patients face.”

Frank Brodie, MD, MBA


Participate in an interview or small group conversation

Share your experience, opinions, or expertise about various topics to inspire design or provide feedback in a remote Zoom video call, on the phone, or in person.


Be an advisor or part of a co-creation or co-design council

Either individually or as part of a small group, provide feedback, expertise, or perspectives throughout the life of an IDEO project.


Complete independent design research activities

Contribute your ideas, expertise, or feedback via paid research surveys, usability tests, or written and multimedia diary studies.


Experience a live prototype of a product, service, or experience

Walk through and provide feedback on life-size mockups of designs like voting booths, airplane cabins, or school cafeterias.


Contribute to connecting with and recruiting from your community

Connect IDEO to your network, lead local initiatives, or help shape IDEO projects and partnerships.


Deeply collaborate or embed alongside IDEO designers

Work with an IDEO project team as a part-time or full-time designer or expert-in-residence to experience design thinking and push our way of working.


Want to browse projects you can apply for directly? Check out our public Opportunities  to see our calls for participation.

DESIGN RESEARCH QUESTIONS

Do you have feedback about our process? Questions about a project or initiative? Review our Frequently Asked Questions or send us an email.

OTHER INQUIRIES

Looking to partner with us, apply for a job, book a speaker, or find out more about our company or work? Learn about other ways to connect with IDEO.  


Research design for business.

To get the information you need to drive key business decisions and answer burning questions, you need a research methodology that works — and it all starts with research design. But what is it? In our ultimate guide to research design for businesses, we break down the process, including research methods, examples, and best-practice tips to help you get started.

If you have a business problem that you’re trying to solve — from product usage to customer engagement — doing research is a great way to understand what is going wrong.

Yet despite this, less than 40% of marketers use consumer research to drive decisions [1].

So why are businesses missing out on vital business insights that could help their bottom line?

One reason is that many simply don’t know which research method to use to correctly investigate their problem and uncover insights.

This is where our ultimate guide to research design can help. But first…

What is research design?

Research design is the overall strategy (or research methodology) used to carry out a study. It defines the framework and plan to tackle established problems and/or questions through the collection, interpretation, analysis, and discussion of data.

While there are several types of research design (more on that later), the research problem defines which should be used — not the other way around. In working this way, researchers can be certain that their methods match their aims — and that they’re capturing useful and actionable data.

For example, you might want to know why sales are falling for a specific product. You already have your context and other research questions to help uncover further insights. So, you start with your research problem (or problem statement) and choose an approach to get the information you need.


Key considerations before a research project

After you have your research problem and research questions to find out more information, you should always consider the following elements:

  • Do you want to use a qualitative or quantitative approach?
  • What type of research would you like to do (e.g., create a survey or conduct telephone interviews)?
  • How will you choose your population and sample size fairly?
  • How will you choose a method to collect the data for ease of operation? The research tool you use will determine the validity of your study.
  • How will you analyze data after collection to help the business concern?
  • How will you ensure your research is free from bias and remains neutral?
  • What’s your timeline?
  • In what setting will you conduct your research study?
  • Are there any challenges or objections to conducting your research — and if so, how can you address them?

Ultimately, the data received should be unambiguous, so that the analysts can find accurate and trustworthy insights to act upon. Neutrality is key!

Types of approaches in research design

There are two main approaches to research design that we’ll explore in more detail — quantitative and qualitative.

Qualitative research design

Qualitative research designs tend to be more flexible and inductive (building broad generalizations from specific observations), allowing you to adjust your approach based on the information you find throughout the research process. It looks at customer or prospect data (X data).

For example, if you want to generate new ideas for content campaigns, a qualitative approach would make the most sense. You can use this approach to find out more about what your audience would like to see, the particular challenges they are facing (from a business perspective), their overall experiences, and if any topics are under-researched.

To put it simply, qualitative research design looks at the whys and hows — as well as participants’ thoughts, feelings, and beliefs. It seeks to find reasons to explain decisions using the data captured.

However, as the data collected from qualitative research is typically written rather than numerical, it can be difficult to quantify information using statistical techniques.

When should you use qualitative research design?

It is best used when you want to conduct a detailed investigation of a topic to gain a holistic view. For example, to understand cultural differences in society, a qualitative research design would create a research plan that allows as many people from different cultures as possible to participate and provides space for elaboration and anecdotal evidence.

If you want to incorporate a qualitative research design, you may choose to use methods like semi-structured focus groups, surveys with open-ended questions, or in-depth interviews in person or on the phone.

Quantitative research design

Quantitative research design looks at data that helps answer the key questions beginning with ‘Who’, ‘How’, ‘How many’ and ‘What’. This can include business data, such as operational statistics and sales records, as well as quantifiable data on preferences.

Unlike qualitative research design, quantitative research design can be more controlled and fixed. It establishes variables, hypotheses, and correlations and tests participants against this knowledge. The aim is to explore the numerical data and understand its value against other sets of data, providing us with a data-driven way to measure the level of something.

When should you use quantitative research design?

If you want to quantify attitudes, opinions, behaviors, or any other defined variable (and generalize results from a large sample population), a quantitative approach is the way to go.

You could use quantitative research to validate findings from qualitative research. One provides depth and insight into the whys and hows, while the other delivers data to support them.

If you want to incorporate a quantitative research design, you may choose to use methods like secondary research collection or surveys with closed-ended questions.

Now that you know the differences between the two research approaches, we can go further and address their sub-categories.

Research methods: the subsets of qualitative and quantitative research

Depending on the aim/objective of your research, there are several research methods (for both qualitative and quantitative research) for you to choose from:

Types of quantitative research design

  • Descriptive – provides information on the current state of affairs, by observing participants in a natural situation
  • Experimental – provides causal relationship information between variables within a controlled situation
  • Quasi-experimental – attempts to build a cause and effect relationship between an independent variable and a dependent variable
  • Correlational – as the name suggests, correlational design allows the researcher to establish some kind of relation between two closely related topics or variables

Types of qualitative research design

  • Case studies – a detailed study of a specific subject (place, event, organization)
  • Ethnographic research – in-depth observational studies of people in their natural environment (this research aims to understand the cultures, challenges, motivations and settings of those involved)
  • Grounded theory – collecting rich data on a topic of interest and developing theories inductively
  • Phenomenology – investigating a phenomenon or event by describing and interpreting the shared experiences of participants
  • Narrative research – examining how stories are told to understand how participants perceive and make sense of their experiences

Other subsets of qualitative and quantitative research design

  • Exploratory – explores a new subject area by taking a holistic viewpoint and gathering foundational insights
  • Cross-sectional – provides a snapshot of a single moment in time to reflect the state of affairs at that point
  • Longitudinal – provides several snapshots of the same sample over a period to understand causal relationships
  • Mixed methods – provide a bespoke application of design subsets to create more precise and nuanced results
  • Observational – involves observing participants’ ongoing behavior in a natural situation

Let’s talk about these research methods in more detail.

Experimental

As a subset of quantitative research design types, experimental research design aims to control variables in an experiment to test a hypothesis. Researchers will alter one of the variables to see how it affects the others.

Experimental research design provides an understanding of the causal relationships between two variables – which variable impacts the other, to what extent it is affected, and how consistent the effect is if the experiment is repeated.

To incorporate experimental research design, researchers create an artificial environment to more easily control the variables affecting participants. This can include creating two groups of participants – one acting as a control group to provide normal data readings, and another that has a variable altered. Representative, randomly assigned groups make the resulting comparisons more reliable.

Figure: sample population split into intervention and control groups (image source: World Bank Blogs).
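To make the mechanics concrete, here is a minimal Python sketch (an illustration added here, not part of the original article) of how the outcome data from such a two-group experiment might be compared. The group sizes, outcome measure, and numbers are purely hypothetical.

```python
# Hypothetical two-group experiment: an intervention group and a control group
# measured on a numeric outcome (e.g., task completion time in seconds).
# All values are invented for illustration.
from scipy import stats

control = [41.2, 39.8, 44.1, 40.5, 42.3, 43.0, 38.9, 41.7]
intervention = [36.4, 35.1, 38.2, 34.9, 37.5, 36.8, 35.6, 37.1]

# Welch's t-test: is the difference between group means likely to be real,
# or could it plausibly have arisen by chance?
t_stat, p_value = stats.ttest_ind(intervention, control, equal_var=False)

print(f"mean (control):      {sum(control) / len(control):.2f}")
print(f"mean (intervention): {sum(intervention) / len(intervention):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value supports the causal story only because the groups were
# randomly and representatively assigned, as described above.
```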

Descriptive

Descriptive research design is a subset of quantitative research design (as listed above) and, unlike experimental research design, it provides descriptive insights by observing participants in an uncontrolled, geographically bound natural environment.

This type gives information on the current state of participants when faced with variables or changing circumstances. It helps answer who, what, when, where, and how questions on behavior, but it can’t provide a clear understanding of the why.

To incorporate a descriptive research design, researchers create situations where observation of participants can happen without notice. In capturing the information, researchers can analyze data to understand the different variables at play or find additional research areas to investigate.

Exploratory

Exploratory research design aims to investigate an area where little is known about the subject and there are no prior examples to draw insight from. Researchers want to gain insights into the foundational data (who, what, when, where, and how) and the deeper level data (the why).

Therefore, an exploratory research design is flexible and a subset of both quantitative and qualitative research design.

Like descriptive research design, this type of research method is used at the beginning stages of research to get a broader view, before proceeding with further research.

To incorporate exploratory research design, researchers will use several methods to gain the right data. These can include focus groups, surveys, interviews in person or on the phone, secondary desk research, controlled experiments, and observation in a natural setting.

Cross-sectional

Just like slicing through a tomato gives us a slice of the whole fruit, cross-sectional research design gives us a slice representing a specific point in time. Researchers can observe different groups at the same time to discover how participants' behavior differs between groups and how behaviors correlate. This is then used to form assumptions that can be further tested.

There are two types to consider. In descriptive cross-sectional research design, researchers do not get involved with or influence the participants through any controls, so this research design type is a subset of quantitative research design. Researchers will use methods that provide a descriptive (who, what, when, where, and how) understanding of the cross-section. This can be done by survey or observation, though researcher bias can creep in if the method is not designed to guard against it.

Analytical cross-sectional research design looks at the why behind the outcome found in the cross-section, aligning this as a subset of qualitative research design. This understanding can be gained through emailed surveys. To gain stronger insights, group sample selection can be altered from a random selection of participants to researchers selecting participants into groups based on their differences.

Since only one cross-section is taken, this can be a cheaper and quicker way to carry out research when resources are limited. Yet, unlike longitudinal research design, it cannot establish causal relationships, because data is not compared across time.

Longitudinal

Longitudinal research design takes multiple measures from the same participants or groups over an extended period. These repeated observations enable researchers to track variables, identify correlations and see if there are causal relationships that can confirm hypothesis predictions.

As the research design is focused on understanding the why behind the data, this is a subset of qualitative research design. However, the data collected at each point in time will also require analysis against the quantitative markers established through quantitative research design.

Researchers can incorporate longitudinal research design by using methods like panel studies for collecting primary data first-hand. The study can be retrospective (based on event data that has already occurred) or prospective (based on event data that is yet to happen).

While this can be the most useful method for getting the data you need to address your business concern, it can be time-consuming, and there can be issues with maintaining the integrity of the sample over time. Alternatively, you can use existing data sets to provide historical trends (which could be verified through a cross-sectional research design).
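As a rough illustration of what "several snapshots of the same sample" looks like in practice, here is a small Python sketch (not from the original article) using pandas; the participants, wave numbers, and satisfaction scores are invented assumptions.

```python
# Hypothetical longitudinal panel: the same participants measured at three waves.
import pandas as pd

panel = pd.DataFrame({
    "participant":  ["p1", "p1", "p1", "p2", "p2", "p2", "p3", "p3", "p3"],
    "wave":         [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "satisfaction": [6.0, 6.5, 7.1, 5.2, 5.0, 5.8, 7.4, 7.9, 8.2],
})

# Average trajectory across waves -- the repeated snapshots a longitudinal
# design provides.
print(panel.groupby("wave")["satisfaction"].mean())

# Within-person change from the first to the last wave, something a single
# cross-section cannot reveal.
change = (
    panel.sort_values("wave")
         .groupby("participant")["satisfaction"]
         .agg(lambda s: s.iloc[-1] - s.iloc[0])
)
print(change)
```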

Mixed methods

Mixed methods research aims to provide an advanced and bespoke response to solving your business problem. It combines the methods and subsets above to create a tailored approach that gives researchers flexibility and options for carrying out research.

The mixed-method research design gives a thorough holistic view of the layers of data through quantitative and qualitative subset design methods. The resulting data is strengthened by the application of context and scale (quantitative) in alignment with the meaning behind behavior (qualitative), giving a richer picture of participants.

Mixed method research design is useful for getting greater ‘texture’ to your data, resulting in precise and meaningful information for analysis. The disadvantages and boundaries of a single subset can be offset by the benefits of using another to complement the investigation.

This subset does place more responsibility on the researcher to apply the subset designs appropriately to gain the right information. The data is interpreted and assessed by the researcher for its validity to the end results, so there is potential for researcher bias if they miss out on vital information that skews results.

Figure: visual graphs of mixed methods (image source: Full Stack Researcher).

Find the research design method(s) that work for you

No matter what information you want to find out — there’s a research design method that’s right for you.

However, it’s up to you to determine which of the methods above are the most viable and can deliver the insight you need. Remember, each research method has its advantages and disadvantages.

It’s also important to bear in mind, at all times, the key considerations listed before your research project, in particular: are there any challenges or objections to conducting your research, and if so, how can you address them?

But if you’re unsure about where to begin, start by answering these questions with our decision tree:

Figure: research design decision tree (image source: ResearchGate).

If you need more help, why not try speaking to one of our Qualtrics team members?

Our team of experts can help you with all your market research needs — from designing your study and finding respondents, to fielding it and reporting on the results.

[1] https://www.thinkwithgoogle.com/consumer-insights/consumer-trends/marketing-consumer-research-statistics/


Grad Coach

Research Design 101

Everything You Need To Get Started (With Examples)

By: Derek Jansen (MBA) | Reviewers: Eunice Rautenbach (DTech) & Kerryn Warren (PhD) | April 2023

Research design for qualitative and quantitative studies

Navigating the world of research can be daunting, especially if you’re a first-time researcher. One concept you’re bound to run into fairly early in your research journey is that of “research design”. Here, we’ll guide you through the basics using practical examples, so that you can approach your research with confidence.

Overview: Research Design 101

  • What is research design?
  • Research design types for quantitative studies
  • Video explainer: quantitative research design
  • Research design types for qualitative studies
  • Video explainer: qualitative research design
  • How to choose a research design
  • Key takeaways

Research design refers to the overall plan, structure or strategy that guides a research project, from its conception to the final data analysis. A good research design serves as the blueprint for how you, as the researcher, will collect and analyse data while ensuring consistency, reliability and validity throughout your study.

Understanding different types of research designs is essential, as it helps ensure that your approach is suitable given your research aims, objectives and questions, as well as the resources you have available to you. Without a clear big-picture view of how you’ll design your research, you run the risk of making misaligned choices in terms of your methodology – especially your sampling, data collection and data analysis decisions.

The problem with defining research design…

One of the reasons students struggle with a clear definition of research design is because the term is used very loosely across the internet, and even within academia.

Some sources claim that the three research design types are qualitative, quantitative and mixed methods, which isn’t quite accurate (these just refer to the type of data that you’ll collect and analyse). Other sources state that research design refers to the sum of all your design choices, suggesting it’s more like a research methodology. Others run off on other less common tangents. No wonder there’s confusion!

In this article, we’ll clear up the confusion. We’ll explain the most common research design types for both qualitative and quantitative research projects, whether that is for a full dissertation or thesis, or a smaller research paper or article.


Research Design: Quantitative Studies

Quantitative research involves collecting and analysing data in a numerical form. Broadly speaking, there are four types of quantitative research designs: descriptive, correlational, experimental, and quasi-experimental.

Descriptive Research Design

As the name suggests, descriptive research design focuses on describing existing conditions, behaviours, or characteristics by systematically gathering information without manipulating any variables. In other words, there is no intervention on the researcher’s part – only data collection.

For example, if you’re studying smartphone addiction among adolescents in your community, you could deploy a survey to a sample of teens asking them to rate their agreement with certain statements that relate to smartphone addiction. The collected data would then provide insight regarding how widespread the issue may be – in other words, it would describe the situation.

The key defining attribute of this type of research design is that it purely describes the situation. In other words, descriptive research design does not explore potential relationships between different variables or the causes that may underlie those relationships. Therefore, descriptive research is useful for generating insight into a research problem by describing its characteristics. By doing so, it can provide valuable insights and is often used as a precursor to other research design types.
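To show what "purely describing the situation" can look like with numbers, here is a minimal Python sketch (an illustration added here, not part of the original article) that summarises made-up Likert-style survey responses of the kind the smartphone-addiction example might produce.

```python
# Hypothetical descriptive survey data: agreement ratings (1 = strongly
# disagree, 5 = strongly agree) with statements about smartphone use.
# Column names and values are invented for illustration.
import pandas as pd

responses = pd.DataFrame({
    "checks_phone_first_thing": [5, 4, 5, 3, 4, 5, 2, 4],
    "anxious_without_phone":    [4, 3, 5, 2, 4, 4, 1, 3],
    "loses_track_of_time":      [5, 5, 4, 3, 5, 4, 2, 4],
})

# A descriptive design stops at description: central tendency, spread, and the
# share of respondents who agree (rating of 4 or 5) with each statement.
summary = pd.DataFrame({
    "mean":    responses.mean(),
    "std":     responses.std(),
    "% agree": (responses >= 4).mean() * 100,
})
print(summary.round(2))
```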

Correlational Research Design

Correlational design is a popular choice for researchers aiming to identify and measure the relationship between two or more variables without manipulating them . In other words, this type of research design is useful when you want to know whether a change in one thing tends to be accompanied by a change in another thing.

For example, if you wanted to explore the relationship between exercise frequency and overall health, you could use a correlational design to help you achieve this. In this case, you might gather data on participants’ exercise habits, as well as records of their health indicators like blood pressure, heart rate, or body mass index. Thereafter, you’d use a statistical test to assess whether there’s a relationship between the two variables (exercise frequency and health).

As you can see, correlational research design is useful when you want to explore potential relationships between variables that cannot be manipulated or controlled for ethical, practical, or logistical reasons. It is particularly helpful in terms of developing predictions, and given that it doesn’t involve the manipulation of variables, it can be implemented at a larger scale more easily than experimental designs (which we’ll look at next).

That said, it’s important to keep in mind that correlational research design has limitations – most notably that it cannot be used to establish causality. In other words, correlation does not equal causation. To establish causality, you’ll need to move into the realm of experimental design, coming up next…
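As a concrete (and purely illustrative) example of the statistical test mentioned above, the following Python sketch computes a correlation between two measured variables; the variable names and values are assumptions, not data from the article.

```python
# Hypothetical correlational data: weekly exercise sessions and resting heart
# rate for the same ten participants. Nothing is manipulated -- both variables
# are simply measured and tested for association.
from scipy import stats

exercise_per_week  = [0, 1, 1, 2, 3, 3, 4, 5, 5, 6]
resting_heart_rate = [78, 76, 74, 72, 70, 71, 66, 64, 65, 60]

r, p_value = stats.pearsonr(exercise_per_week, resting_heart_rate)
print(f"r = {r:.2f}, p = {p_value:.4f}")
# A strong negative r would suggest that more exercise tends to accompany a
# lower resting heart rate -- but, as noted above, correlation on its own
# cannot establish causation.
```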


Experimental Research Design

Experimental research design is used to determine if there is a causal relationship between two or more variables. With this type of research design, you, as the researcher, manipulate one variable (the independent variable) while holding other potentially confounding variables constant, and then measure the outcome (the dependent variable). Doing so allows you to observe the effect of the former on the latter and draw conclusions about potential causality.

For example, if you wanted to measure if/how different types of fertiliser affect plant growth, you could set up several groups of plants, with each group receiving a different type of fertiliser, as well as one with no fertiliser at all. You could then measure how much each plant group grew (on average) over time and compare the results from the different groups to see which fertiliser was most effective.

Overall, experimental research design provides researchers with a powerful way to identify and measure causal relationships (and the direction of causality) between variables. However, developing a rigorous experimental design can be challenging as it’s not always easy to control all the variables in a study. This often results in smaller sample sizes , which can reduce the statistical power and generalisability of the results.

Moreover, experimental research design requires random assignment. This means that the researcher needs to assign participants to different groups or conditions in a way that each participant has an equal chance of being assigned to any group (note that this is not the same as random sampling). Doing so helps reduce the potential for bias and confounding variables. This need for random assignment can lead to ethics-related issues. For example, withholding a potentially beneficial medical treatment from a control group may be considered unethical in certain situations.
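Random assignment is easy to get wrong by hand, so here is a small Python sketch (illustrative only, with hypothetical participant IDs) of one way to give every participant an equal chance of landing in either condition.

```python
# Hypothetical random assignment of 20 participants to two conditions.
# Note: this is random *assignment* (who gets which condition), which is not
# the same as random *sampling* (who gets into the study at all).
import random

participants = [f"participant_{i:02d}" for i in range(1, 21)]
conditions = ["treatment", "control"]

random.seed(42)  # fixed seed only so the illustration is reproducible
random.shuffle(participants)

# Deal the shuffled participants alternately into the two conditions,
# giving equally sized, randomly composed groups.
assignment = {
    person: conditions[i % len(conditions)]
    for i, person in enumerate(participants)
}
for person, condition in sorted(assignment.items()):
    print(person, "->", condition)
```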

Quasi-Experimental Research Design

Quasi-experimental research design is used when the research aims involve identifying causal relations , but one cannot (or doesn’t want to) randomly assign participants to different groups (for practical or ethical reasons). Instead, with a quasi-experimental research design, the researcher relies on existing groups or pre-existing conditions to form groups for comparison.

For example, if you were studying the effects of a new teaching method on student achievement in a particular school district, you may be unable to randomly assign students to either group and instead have to choose classes or schools that already use different teaching methods. This way, you still achieve separate groups, without having to assign participants to specific groups yourself.

Naturally, quasi-experimental research designs have limitations when compared to experimental designs. Given that participant assignment is not random, it’s more difficult to confidently establish causality between variables, and, as a researcher, you have less control over other variables that may impact findings.

All that said, quasi-experimental designs can still be valuable in research contexts where random assignment is not possible and can often be undertaken on a much larger scale than experimental research, thus increasing the statistical power of the results. What’s important is that you, as the researcher, understand the limitations of the design and conduct your quasi-experiment as rigorously as possible, paying careful attention to any potential confounding variables .

The four most common quantitative research design types are descriptive, correlational, experimental and quasi-experimental.

Research Design: Qualitative Studies

There are many different research design types when it comes to qualitative studies, but here we’ll narrow our focus to explore the “Big 4”. Specifically, we’ll look at phenomenological design, grounded theory design, ethnographic design, and case study design.

Phenomenological Research Design

Phenomenological design involves exploring the meaning of lived experiences and how they are perceived by individuals. This type of research design seeks to understand people’s perspectives , emotions, and behaviours in specific situations. Here, the aim for researchers is to uncover the essence of human experience without making any assumptions or imposing preconceived ideas on their subjects.

For example, you could adopt a phenomenological design to study why cancer survivors have such varied perceptions of their lives after overcoming their disease. This could be achieved by interviewing survivors and then analysing the data using a qualitative analysis method such as thematic analysis to identify commonalities and differences.

Phenomenological research design typically involves in-depth interviews or open-ended questionnaires to collect rich, detailed data about participants’ subjective experiences. This richness is one of the key strengths of phenomenological research design but, naturally, it also has limitations. These include potential biases in data collection and interpretation and the lack of generalisability of findings to broader populations.

Grounded Theory Research Design

Grounded theory (also referred to as “GT”) aims to develop theories by continuously and iteratively analysing and comparing data collected from a relatively large number of participants in a study. It takes an inductive (bottom-up) approach, with a focus on letting the data “speak for itself”, without being influenced by preexisting theories or the researcher’s preconceptions.

As an example, let’s assume your research aims involved understanding how people cope with chronic pain from a specific medical condition, with a view to developing a theory around this. In this case, grounded theory design would allow you to explore this concept thoroughly without preconceptions about what coping mechanisms might exist. You may find that some patients prefer cognitive-behavioural therapy (CBT) while others prefer to rely on herbal remedies. Based on multiple, iterative rounds of analysis, you could then develop a theory in this regard, derived directly from the data (as opposed to other preexisting theories and models).

Grounded theory typically involves collecting data through interviews or observations and then analysing it to identify patterns and themes that emerge from the data. These emerging ideas are then validated by collecting more data until a saturation point is reached (i.e., no new information can be squeezed from the data). From that base, a theory can then be developed .

As you can see, grounded theory is ideally suited to studies where the research aims involve theory generation , especially in under-researched areas. Keep in mind though that this type of research design can be quite time-intensive , given the need for multiple rounds of data collection and analysis.


Ethnographic Research Design

Ethnographic design involves observing and studying a culture-sharing group of people in their natural setting to gain insight into their behaviours, beliefs, and values. The focus here is on observing participants in their natural environment (as opposed to a controlled environment). This typically involves the researcher spending an extended period of time with the participants in their environment, carefully observing and taking field notes .

All of this is not to say that ethnographic research design relies purely on observation. On the contrary, this design typically also involves in-depth interviews to explore participants’ views, beliefs, etc. However, unobtrusive observation is a core component of the ethnographic approach.

As an example, an ethnographer may study how different communities celebrate traditional festivals or how individuals from different generations interact with technology differently. This may involve a lengthy period of observation, combined with in-depth interviews to further explore specific areas of interest that emerge as a result of the observations that the researcher has made.

As you can probably imagine, ethnographic research design has the ability to provide rich, contextually embedded insights into the socio-cultural dynamics of human behaviour within a natural, uncontrived setting. Naturally, however, it does come with its own set of challenges, including researcher bias (since the researcher can become quite immersed in the group), participant confidentiality and, predictably, ethical complexities . All of these need to be carefully managed if you choose to adopt this type of research design.

Case Study Design

With case study research design, you, as the researcher, investigate a single individual (or a single group of individuals) to gain an in-depth understanding of their experiences, behaviours or outcomes. Unlike other research designs that are aimed at larger sample sizes, case studies offer a deep dive into the specific circumstances surrounding a person, group of people, event or phenomenon, generally within a bounded setting or context .

As an example, a case study design could be used to explore the factors influencing the success of a specific small business. This would involve diving deeply into the organisation to explore and understand what makes it tick – from marketing to HR to finance. In terms of data collection, this could include interviews with staff and management, review of policy documents and financial statements, surveying customers, etc.

While the above example is focused squarely on one organisation, it’s worth noting that case study research designs can have different variations, including single-case, multiple-case and longitudinal designs. As you can see in the example, a single-case design involves intensely examining a single entity to understand its unique characteristics and complexities. Conversely, in a multiple-case design, multiple cases are compared and contrasted to identify patterns and commonalities. Lastly, in a longitudinal case design, a single case or multiple cases are studied over an extended period of time to understand how factors develop over time.

As you can see, a case study research design is particularly useful where a deep and contextualised understanding of a specific phenomenon or issue is desired. However, this strength is also its weakness. In other words, you can’t generalise the findings from a case study to the broader population. So, keep this in mind if you’re considering going the case study route.

Case study design often involves investigating an individual to gain an in-depth understanding of their experiences, behaviours or outcomes.

How To Choose A Research Design

Having worked through all of these potential research designs, you’d be forgiven for feeling a little overwhelmed and wondering, “ But how do I decide which research design to use? ”. While we could write an entire post covering that alone, here are a few factors to consider that will help you choose a suitable research design for your study.

Data type: The first determining factor is naturally the type of data you plan to be collecting – i.e., qualitative or quantitative. This may sound obvious, but we have to be clear about this – don’t try to use a quantitative research design on qualitative data (or vice versa)!

Research aim(s) and question(s): As with all methodological decisions, your research aim and research questions will heavily influence your research design. For example, if your research aims involve developing a theory from qualitative data, grounded theory would be a strong option. Similarly, if your research aims involve identifying and measuring relationships between variables, one of the experimental designs would likely be a better option.

Time: It’s essential that you consider any time constraints you have, as this will impact the type of research design you can choose. For example, if you’ve only got a month to complete your project, a lengthy design such as ethnography wouldn’t be a good fit.

Resources: Take into account the resources realistically available to you, as these need to factor into your research design choice. For example, if you require highly specialised lab equipment to execute an experimental design, you need to be sure that you’ll have access to that before you make a decision.

Keep in mind that when it comes to research, it’s important to manage your risks and play as conservatively as possible. If your entire project relies on you achieving a huge sample, having access to niche equipment or holding interviews with very difficult-to-reach participants, you’re creating risks that could kill your project. So, be sure to think through your choices carefully and make sure that you have backup plans for any existential risks. Remember that a relatively simple methodology executed well will typically earn better marks than a highly complex methodology executed poorly.


Recap: Key Takeaways

We’ve covered a lot of ground here. Let’s recap by looking at the key takeaways:

  • Research design refers to the overall plan, structure or strategy that guides a research project, from its conception to the final analysis of data.
  • Research designs for quantitative studies include descriptive, correlational, experimental and quasi-experimental designs.
  • Research designs for qualitative studies include phenomenological, grounded theory, ethnographic and case study designs.
  • When choosing a research design, you need to consider a variety of factors, including the type of data you’ll be working with, your research aims and questions, your time and the resources available to you.

If you need a helping hand with your research design (or any other aspect of your research), check out our private coaching services .


The Four Types of Research Design — Everything You Need to Know

Jenny Romanchuk

Updated: December 11, 2023

Published: January 18, 2023

When you conduct research, you need to have a clear idea of what you want to achieve and how to accomplish it. A good research design enables you to collect accurate and reliable data to draw valid conclusions.


In this blog post, we'll outline the key features of the four common types of research design with real-life examples from UnderArmor, Carmex, and more. Then, you can easily choose the right approach for your project.

Table of Contents

  • What is research design?
  • The four types of research design
  • Research design examples

Research design is the process of planning and executing a study to answer specific questions. This process allows you to test hypotheses in the business or scientific fields.

Research design involves choosing the right methodology, selecting the most appropriate data collection methods, and devising a plan (or framework) for analyzing the data. In short, a good research design helps us to structure our research.

Marketers use different types of research design when conducting research .

There are four common types of research design — descriptive, correlational, experimental, and diagnostic designs. Let’s take a look at each in more detail.

Researchers use different designs to accomplish different research objectives. Here, we'll discuss how to choose the right type, the benefits of each, and use cases.

Research can also be classified as quantitative or qualitative at a higher level. Some experiments exhibit both qualitative and quantitative characteristics.


Experimental

An experimental design is used when the researcher wants to examine how variables interact with each other. The researcher manipulates one variable (the independent variable) and observes the effect on another variable (the dependent variable).

In other words, the researcher wants to test a causal relationship between two or more variables.

In marketing, an example of experimental research would be comparing the effects of a television commercial versus an online advertisement conducted in a controlled environment (e.g. a lab). The objective of the research is to test which advertisement gets more attention among people of different age groups, gender, etc.

Another example is a study of the effect of music on productivity. A researcher assigns participants to one of two groups — those who listen to music while working and those who don't — and measures their productivity.

The main benefit of an experimental design is that it allows the researcher to draw causal relationships between variables.

One limitation: This research requires a great deal of control over the environment and participants, making it difficult to replicate in the real world. In addition, it’s quite costly.

Best for: Testing a cause-and-effect relationship (i.e., the effect of an independent variable on a dependent variable).

Correlational

A correlational design examines the relationship between two or more variables without intervening in the process.

Correlational design allows the analyst to observe natural relationships between variables. This results in data being more reflective of real-world situations.

For example, marketers can use correlational design to examine the relationship between brand loyalty and customer satisfaction. In particular, the researcher would look for patterns or trends in the data to see if there is a relationship between these two entities.

Similarly, you can study the relationship between physical activity and mental health. The analyst here would ask participants to complete surveys about their physical activity levels and mental health status. Data would show how the two variables are related.

Best for: Understanding the extent to which two or more variables are associated with each other in the real world.

Descriptive

Descriptive research refers to a systematic process of observing and describing what a subject does without influencing them.

Methods include surveys, interviews, case studies, and observations. Descriptive research aims to gather an in-depth understanding of a phenomenon and answers when/what/where.

SaaS companies use descriptive design to understand how customers interact with specific features. Findings can be used to spot patterns and roadblocks.

For instance, product managers can use screen recordings by Hotjar to observe in-app user behavior. This way, the team can precisely understand what is happening at a certain stage of the user journey and act accordingly.

Brand24, a social listening tool, tripled its sign-up conversion rate from 2.56% to 7.42%, thanks to locating friction points in the sign-up form through screen recordings.


Carma Laboratories worked with research company MMR to measure customers’ reactions to the lip-care company’s packaging and product. The goal was to find the cause of low sales for a recently launched line extension in Europe.

The team moderated a live, online focus group. Participants were shown product samples, while AI-powered natural language processing (NLP) identified key themes in customer feedback.

This helped uncover key reasons for poor performance and guided changes in packaging.


How to Leverage Ethnography to Do Proper Design Research

Whatever your method or combination of methods (e.g., semi-structured interviews and video ethnography), the “golden rules” are:

Build rapport – Your “test users” will only open up in trusting, relaxed, informal, natural settings. Simple courtesies such as thanking them and not pressuring them to answer will go a long way. Remember, human users want a human touch, and as customers they will have the final say on a design’s success.

Hide/Forget your own bias – This is a skill that will show in how you ask questions, which can subtly tell users what you might want to hear. Instead of asking (e.g.) “The last time you used a pay app on your phone, what was your worst security concern?”, try “Can you tell me about the last time you used an app on your phone to pay for something?”. Questions that betray how you might view things can make people distort their answers.

Embrace the not-knowing mindset and a blank-slate approach – to help you find users’ deep motivations and why they’ve created workarounds. Trying to forget—temporarily—everything you’ve learned about one or more things can be challenging. However, it can pay big dividends if you can ignore the assumptions that naturally creep into our understanding of our world.

Accept ambiguity – Try to avoid imposing a rigid binary (black-and-white/“yes”-or-“no”) scientific framework over your users’ human world.

Don’t jump to conclusions – Try to stay objective. The patterns we tend to establish to help us make sense of our world more easily can work against you as an observer if you let them. It’s perfectly human to rely on these patterns so we can think on our feet. But your users/customers already will be doing this with what they encounter. If you add your own subjectivity, you’ll distort things.

Keep an open mind to absorb the users’ world as they present it – hence why it’s vital to get some proper grounding in user research. It takes a skilled eye, ear and mouth to zero in on everything there is to observe, without losing sight of anything by catering to your own agendas, etc.

Gentle encouragement helps; Silence is golden – a big part of keeping a naturalistic setting means letting your users stay comfortable at their own pace (within reason). Your “Mm-mmhs” of encouragement and appropriate silent stretches can keep your research safe from users’ suddenly putting politeness ahead of honesty if they feel (or feel that you’re) uncomfortable.

Overall, remember that two people can see the same thing very differently, and it takes an open-minded, inquisitive, informal approach to find truly valuable insights to understand users’ real problems.

Learn More about Design Research

Take our Service Design course, featuring many helpful templates: Service Design: How to Design Integrated Service Experiences

This Smashing Magazine piece nicely explores the human dimensions of design research: How To Get To Know Your Users

Let Invision expand your understanding of design research’s value, here: 4 types of research methods all designers should know .




Smart Design is a strategic design company that helps people live better and work smarter.


Guiding principles

Research in Practice

Our practice

We are not our users. Design research guides teams to uncover insights and inform the experiences we create. It begins with the rigorous study of the people we serve and their context. This is the heart of Enterprise Design Thinking . While in the Loop , design research leads teams to continuously build understanding and empathy through observation, prototyping possible solutions, and reflecting on the feedback from our users themselves.


Ethics and Responsibility


Sponsor User Program


Latest Articles

  • More than just medicine: how design thinking uncovered modern patient needs
  • The Total Economic Impact™ Of IBM’s Design Thinking Practice
  • Project Monocle: a case for design research


Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes . Revised on 20 March 2023.

A research design is a strategy for answering your research question  using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Introduction
  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

Step 1: Consider your aims and approach

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive , allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive , with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics .

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval ?

At each stage of the research design process, make sure that your choices are practically feasible.


Step 2: Choose a type of research design

Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and   quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation ).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.
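To make the role of random assignment concrete, here is a minimal Python sketch of assigning participants to two conditions. The participant IDs and group sizes are invented for illustration, not taken from any study described here.

```python
import random

# Hypothetical participant IDs; in a real study these would come from recruitment.
participants = [f"P{i:03d}" for i in range(1, 41)]

# Random assignment is what separates a true experiment from a quasi-experiment:
# every participant has an equal chance of ending up in either condition,
# so pre-existing differences are spread roughly evenly across groups.
random.shuffle(participants)
midpoint = len(participants) // 2
treatment_group = participants[:midpoint]
control_group = participants[midpoint:]

print(len(treatment_group), "assigned to treatment;", len(control_group), "to control")
```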

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Common qualitative designs – such as case studies, ethnographies and other in-depth approaches – often involve similar data collection methods, but focus on different aspects when analysing the data.

Step 3: Identify your population and sampling method

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling . The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
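As a rough illustration of the difference, the Python sketch below draws a probability (simple random) sample and a convenience sample from the same hypothetical sampling frame; the population and sample sizes are invented.

```python
import random

# A hypothetical sampling frame listing every member of the defined population.
population = [f"student_{i}" for i in range(1, 501)]

# Probability sampling: a simple random sample, where every member has a
# known, equal chance of being selected.
random_sample = random.sample(population, k=50)

# Non-probability (convenience) sampling: whoever is easiest to reach,
# simulated here as the first 50 names on the list. Cheaper and faster,
# but the sample may differ systematically from the population.
convenience_sample = population[:50]

print(random_sample[:5], convenience_sample[:5])
```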

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Step 4: Choose your data collection methods

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

Step 5: Plan your data collection procedures

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations , which events or actions will you count?

If you’re using surveys , which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity has already been established.
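As a small illustration of operationalisation, the sketch below turns the abstract concept of “social anxiety” (used as an example later in this guide) into a single questionnaire score. The items and scoring rule are invented for illustration, not an established instrument.

```python
# Hypothetical 5-item self-rating scale (1 = never, 5 = always); the items
# and the scoring rule are invented for illustration only.
responses = {
    "avoids_crowded_places": 4,
    "worries_before_social_events": 5,
    "physical_symptoms_in_groups": 3,
    "avoids_speaking_up": 4,
    "cancels_plans_last_minute": 2,
}

# Operationalisation: the fuzzy concept "social anxiety" becomes a concrete,
# measurable indicator -- the summed scale score.
social_anxiety_score = sum(responses.values())
print(social_anxiety_score)  # 18 out of a possible 25
```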

Reliability and validity

Reliability means your results can be consistently reproduced , while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.
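One common way to check internal consistency (a form of reliability) in a pilot study is Cronbach’s alpha. The sketch below computes it from a tiny invented pilot dataset; it is only a minimal illustration, not a full psychometric workflow.

```python
import numpy as np

# Invented pilot data: rows are participants, columns are questionnaire items.
items = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(round(alpha, 2))  # values near 1 suggest the items measure one construct
```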

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.
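A data management plan often includes pseudonymising identifiers before analysis. The sketch below shows one simple approach using a salted hash; the records, field names, and salt value are invented for illustration.

```python
import hashlib

# Invented raw records containing direct identifiers.
raw_records = [
    {"name": "Alice Example", "email": "alice@example.com", "score": 42},
    {"name": "Bob Example", "email": "bob@example.com", "score": 37},
]

def pseudonymise(record, salt="project-specific-secret"):
    """Replace direct identifiers with a stable, non-reversible code."""
    digest = hashlib.sha256((salt + record["email"]).encode()).hexdigest()[:10]
    return {"participant_id": digest, "score": record["score"]}

clean_records = [pseudonymise(r) for r in raw_records]
print(clean_records)  # identifiers are gone; only the pseudonym and data remain
```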

Step 6: Decide on your data analysis strategies

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis . With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics , you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
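For instance, the three summaries above can be computed directly from a sample with Python’s standard library; the test scores below are invented purely to illustrate the calculations.

```python
from collections import Counter
from statistics import mean, stdev

# Invented test scores from a hypothetical sample.
scores = [72, 85, 90, 64, 85, 78, 90, 85, 70, 95]

distribution = Counter(scores)   # frequency of each score
central_tendency = mean(scores)  # the average score
variability = stdev(scores)      # sample standard deviation (spread of scores)

print(distribution, central_tendency, round(variability, 1))
```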

Using inferential statistics , you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs ) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
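As a sketch of these two families of tests, the example below runs an independent-samples t test and a Pearson correlation with SciPy; all numbers are invented and the variable names are placeholders.

```python
import numpy as np
from scipy import stats

# Invented outcome scores for two groups.
group_a = np.array([78, 82, 90, 71, 85, 88, 76, 80])
group_b = np.array([68, 75, 70, 72, 66, 74, 69, 71])

# Comparison test: independent-samples t test for a difference in group means.
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Association test: Pearson correlation between two variables, e.g. invented
# hours of study and the scores of group A.
hours = np.array([5, 6, 9, 3, 7, 8, 4, 5])
r, r_p = stats.pearsonr(hours, group_a)

print(t_stat, t_p, r, r_p)
```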

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis .

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.
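A very rough first pass at qualitative coding can be sketched in code, although real thematic analysis is interpretive and iterative rather than mechanical. The transcripts and code book below are invented to show the idea of tagging passages with themes.

```python
# Invented interview snippets and a tiny invented code book of themes.
transcripts = [
    "I felt anxious about the cost and never knew who to ask for help.",
    "The staff were helpful, but the cost surprised me.",
    "Booking was easy once a staff member walked me through it.",
]

code_book = {
    "cost concerns": ["cost", "price", "expensive"],
    "staff support": ["staff", "help", "helpful"],
}

# Count how many snippets touch on each theme (keyword matching only).
counts = {theme: 0 for theme in code_book}
for text in transcripts:
    lowered = text.lower()
    for theme, keywords in code_book.items():
        if any(word in lowered for word in keywords):
            counts[theme] += 1

print(counts)  # {'cost concerns': 2, 'staff support': 3}
```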

Frequently asked questions

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts, and meanings, use qualitative methods .
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Cite this Scribbr article

McCombes, S. (2023, March 20). Research Design | Step-by-Step Guide with Examples. Scribbr. Retrieved 3 June 2024, from https://www.scribbr.co.uk/research-methods/research-design/


Research Design: What it is, Elements & Types


Can you imagine doing research without a plan? Probably not. When we discuss a strategy to collect, study, and evaluate data, we talk about research design. This design addresses problems and creates a consistent and logical model for data analysis. Let’s learn more about it.

What is Research Design?

Research design is the framework of research methods and techniques chosen by a researcher to conduct a study. The design allows researchers to sharpen the research methods suitable for the subject matter and set up their studies for success.

The design also identifies the type of research (experimental, survey research, correlational, semi-experimental, review) and its sub-type (for example, experimental design, research problem, or descriptive case study).

A research design covers three main aspects:

  • Data collection
  • Measurement
  • Data analysis

The research problem an organization faces will determine the design, not vice-versa. The design phase of a study determines which tools to use and how they are used.

The Process of Research Design

The research design process is a systematic and structured approach to conducting research. The process is essential to ensure that the study is valid, reliable, and produces meaningful results.

  • Consider your aims and approaches: Determine the research questions and objectives, and identify the theoretical framework and methodology for the study.
  • Choose a type of Research Design: Select the appropriate research design, such as experimental, correlational, survey, case study, or ethnographic, based on the research questions and objectives.
  • Identify your population and sampling method: Determine the target population and sample size, and choose the sampling method, such as simple random sampling, stratified random sampling, or convenience sampling.
  • Choose your data collection methods: Decide on the data collection methods , such as surveys, interviews, observations, or experiments, and select the appropriate instruments or tools for collecting data.
  • Plan your data collection procedures: Develop a plan for data collection, including the timeframe, location, and personnel involved, and ensure ethical considerations.
  • Decide on your data analysis strategies: Select the appropriate data analysis techniques, such as statistical analysis , content analysis, or discourse analysis, and plan how to interpret the results.

The process of research design is a critical step in conducting research. By following the steps of research design, researchers can ensure that their study is well-planned, ethical, and rigorous.

Research Design Elements

Impactful research minimizes bias in the data and increases trust in the accuracy of what is collected. A design that produces the smallest margin of error in experimental research is generally considered the desired outcome. The essential elements are:

  • Accurate purpose statement
  • Techniques to be implemented for collecting and analyzing research
  • The method applied for analyzing collected details
  • Type of research methodology
  • Probable objections to research
  • Settings for the research study
  • Measurement of analysis

Characteristics of Research Design

A proper design sets your study up for success. Successful research studies provide insights that are accurate and unbiased. You’ll need to create a survey that meets all of the main characteristics of a design. There are four key characteristics:


  • Neutrality: When you set up your study, you may have to make assumptions about the data you expect to collect. The projected results should be neutral and free from research bias. Collect opinions about the final evaluated scores and conclusions from multiple individuals and consider where they agree.
  • Reliability: With regularly conducted research, the researcher expects similar results every time. You’ll only be able to reach the desired results if your design is reliable. Your plan should indicate how to form research questions to ensure the standard of results.
  • Validity: There are multiple measuring tools available. However, the only correct measuring tools are those which help a researcher in gauging results according to the objective of the research. The  questionnaire  developed from this design will then be valid.
  • Generalization:  The outcome of your design should apply to a population and not just a restricted sample . A generalized method implies that your survey can be conducted on any part of a population with similar accuracy.

These factors affect how respondents answer the research questions, so a good design should balance all of the above characteristics.

Research Design Types

A researcher must clearly understand the various types to select which model to implement for a study. Like the research itself, the design of your analysis can be broadly classified into quantitative and qualitative.

Qualitative research

Qualitative research determines relationships between collected data and observations without relying on mathematical calculation. It is suited to questions that statistical methods alone cannot settle about a naturally occurring phenomenon. Researchers rely on qualitative research methods to conclude “why” a particular theory exists and “what” respondents have to say about it.

Quantitative research

Quantitative research is for cases where drawing statistical conclusions to collect actionable insights is essential. Numbers provide a better perspective for making critical business decisions. Quantitative research methods are necessary for the growth of any organization, because insights drawn from complex numerical data and analysis prove to be highly effective when making decisions about the business’s future.

Qualitative Research vs Quantitative Research

In summary, qualitative research is more exploratory and focuses on understanding the subjective experiences of individuals, while quantitative research focuses on objective data and statistical analysis.

You can further break down the types of research design into five categories:


1. Descriptive: In a descriptive design, a researcher is solely interested in describing the situation or case under study. It is a theory-based method in which data is gathered, analyzed, and presented. This allows a researcher to provide insights into the why and how of research, and helps others better understand the need for the research. If the problem statement is not clear, you can conduct exploratory research. 

2. Experimental: Experimental research establishes a relationship between the cause and effect of a situation. It is a causal research design where one observes the impact caused by the independent variable on the dependent variable. For example, one monitors the influence of an independent variable such as a price on a dependent variable such as customer satisfaction or brand loyalty. It is an efficient research method as it contributes to solving a problem.

The independent variable is manipulated to monitor the change it has on the dependent variable. Social sciences often use this design to observe human behavior by analyzing two groups: researchers can have participants change their actions and study how the people around them react, in order to understand social psychology better.

3. Correlational research: Correlational research is a non-experimental research technique. It helps researchers establish a relationship between two closely connected variables. No variables are manipulated while evaluating the relationship; statistical analysis techniques calculate the strength of the association between them. This type of research requires two different variables.

A correlation coefficient determines the correlation between two variables whose values range between -1 and +1. If the correlation coefficient is towards +1, it indicates a positive relationship between the variables, and -1 means a negative relationship between the two variables. 
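To make the range of the coefficient concrete, the sketch below computes Pearson’s r by hand for a small set of invented paired observations.

```python
from math import sqrt

# Invented paired observations for two variables.
x = [2, 4, 5, 7, 9]
y = [10, 14, 15, 21, 24]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))

r = cov / (sd_x * sd_y)  # always lies between -1 and +1
print(round(r, 2))       # close to +1 here: a strong positive relationship
```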

4. Diagnostic research: In diagnostic design, the researcher is looking to evaluate the underlying cause of a specific topic or phenomenon. This method helps one learn more about the factors that create troublesome situations. 

This design has three parts of the research:

  • Inception of the issue
  • Diagnosis of the issue
  • Solution for the issue

5. Explanatory research : Explanatory design uses a researcher’s ideas and thoughts on a subject to further explore their theories. The study explains unexplored aspects of a subject and details the research questions’ what, how, and why.

Benefits of Research Design

There are several benefits of having a well-designed research plan. Including:

  • Clarity of research objectives: Research design provides a clear understanding of the research objectives and the desired outcomes.
  • Increased validity and reliability: Research design helps to minimize the risk of bias and to control extraneous variables, ensuring the validity and reliability of results.
  • Improved data collection: Research design helps to ensure that the proper data is collected, and that it is collected systematically and consistently.
  • Better data analysis: Research design helps ensure that the collected data can be analyzed effectively, providing meaningful insights and conclusions.
  • Improved communication: A well-designed study helps ensure the results are communicated clearly and convincingly to the research team and external stakeholders.
  • Efficient use of resources: Research design helps to ensure that resources are used efficiently, reducing the risk of waste and maximizing the impact of the research.

A well-designed research plan is essential for successful research, providing clear and meaningful insights and ensuring that resources are used effectively.

QuestionPro offers a comprehensive solution for researchers looking to conduct research. With its user-friendly interface, robust data collection and analysis tools, and the ability to integrate results from multiple sources, QuestionPro provides a versatile platform for designing and executing research projects.

Our robust suite of research tools provides you with all you need to derive research results. Our online survey platform includes custom point-and-click logic and advanced question types. Uncover the insights that matter the most.



5 Best Design and Innovation Consulting Firms to Work With


Victoria Kurichenko

UX studio is one of the global innovation design consulting firms focusing on UX design, research, and strategic consulting. We checked the top design consulting firms’ competency level, experience, and portfolio to define the most reliable design thinking companies you can trust and partner up with in 2024.

Finding a design and innovation consulting firm is a challenging task. Many local and international design thinking companies offer similar services. So how do you choose the right one?

We at UX studio , a product design and consulting firm, often get questions from our clients who look for reliable design, development, or marketing partners. Therefore, our recommendations are based not only on research but also on years of experience.

UX studio is here to help with UX research and data-driven design

Our UX experts have analyzed the top innovation design firms to help you save time and make the right decision faster. We hope this article will help you better understand what the leading design thinking companies offer. 

If you are looking for a Design and Innovation Consulting Agency, get in touch with UX studio , and let’s discuss what we can do for you.

Top design and innovation consulting firms

An award-winning UX design consulting firm working with clients worldwide


We are honored to include UX studio in the list of the top design consulting companies that use design thinking. 

David Pasztor, a true UX design ambassador and a former TEDx talk speaker, founded UX studio in 2013. We became a global product design company and created impactful digital experiences. 

As a design thinking consulting firm , we advise on product strategy, lead workshops, and offer customized training to equip your business with the best UX practices. We help startups and established brands optimize their time and resources, prioritize their product design and development efforts, and discover untapped business opportunities.

Since 2013, UX studio has worked with over 250 companies worldwide, and our portfolio lists notable clients such as Netflix, Google, HBO, and the United Nations World Food Programme, just to name a few.

UX studio helps businesses to grow

We know that many companies choose their path and manage the challenges themselves. That’s one approach to problem-solving. However, we can step in if you are stuck with your business and need expert advice from someone who has already successfully handled challenges like yours. 

With over ten years of experience working with international clients and designing and developing our own SaaS products — UXfolio and Copyfolio — we possess a unique set of skills, which enable us to share the best design practices with our partners.

We are recognized by the reputable B2B services rating platform, Clutch. Besides, UX studio has recently become an exclusive Zeplin design agency partner.

Is there anything we can help you with now? If you want to level up the effectiveness of your product development processes, measure the ROI on design, or get a clear understanding of your target audience, fill out our contact form to book a free consultation . We are ready to help you with any design, research, or strategic challenges you might have. 

Global design and innovation consulting firm with a long history


IDEO’s history dates back to 1978, when David Kelley, a founder of Stanford University’s Hasso Plattner Institute of Design, established his design firm and called it DKD (David Kelley Design). In 1991, David Kelley, together with two more partners, Bill Moggridge and Mike Nuttall, merged their existing companies to form a new one. This is how IDEO was born in Palo Alto, California. 

Since its beginnings, IDEO has been applying design thinking to create digital and physical products with a positive impact. From manufacturing Apple’s first mouse and designing the Palm V to practicing human-centered design, IDEO is one of the design industry’s pioneers.  

Since 1991, IDEO has grown to one of the top global design thinking companies with over 700 employees, five offices across the United States, and four international branches in London, Munich, Shanghai, and Tokyo.

IDEO strives to ask big global questions that affect everyone and solve the challenges together with their clients. IDEO partnered with the Rockefeller Foundation to reduce food waste, Toronto Pearson Airport to develop an agility mindset and deliver exceptional services at Canada’s busiest airport, and the Gates Foundation to develop better classroom technologies. These are just a few examples of IDEO’s notable clients and the positive impact created by a company with design thinking. 

International design thinking company headquartered in New York developing innovative solutions to serve people


Like IDEO, R/GA is a global innovation consulting company with a long history. Since its journey started in 1977, R/GA has transformed from a computer-assisted film-making company into a global innovation and consulting company with 17 branches all over the world.

R/GA unites innovation design thinkers and consultants, data scientists, and marketing experts to create world-class digital experiences and help companies with a strategic vision and business transformation. According to R/GA, they worked with companies worldwide to develop strategies for new digital products and established brands. 

Enterprise innovation, growth strategy, new business model development, brand strategy, and design are among R/GA’s core services. 

R/GA partnered with notable brands such as Walmart, PepsiCo, AdventHealth, Shiseido, and Nike, just to name a few.

A human-centered product design and innovation company headquartered in New York


David Hugh Martin founded Fantasy in 1999 with a mighty mission to impact user experience through bold design. Fantasy is a team of more than seventy professionals of different backgrounds and experiences working across three offices in NY, San Francisco, and London.

As a top design thinking company, Fantasy works with companies worldwide to create seamless user experiences, reimagine digital products from the ground up, build complex B2B platforms, innovate existing products, and develop the next generation of intelligent platforms. 

Fantasy has expertise in the healthcare industry, hospitality and leisure, entertainment and media, finance, and technology, to name a few. Its client portfolio is diverse and impressive. Being among the pioneers in the design industry, Fantasy has partnered with notable clients such as Tesla, Google, Marriott, Dolby, Netflix, Walmart, and Balenciaga, just to name a few.

Global design and innovation consultancy agency headquartered in Los Angeles


Established in 2018,  Whoa is a young research and strategy consulting company that aims to transform challenges into innovation opportunities. 

According to Whoa, their experts “ approach everything they do with razor-sharp precision, unwavering honesty, and an unapologetic POV (point of view) from seasoned veterans and industry experts.”

As a design thinking and innovation consulting firm , Whoa has worked with companies across seventeen different countries to help them tackle their innovation challenges. Their services include design research, market research, business and design strategy, ethnographic research, user experience design, and industrial design. Even though this design thinking company is quite young, Whoa has partnered with notable companies such as Google, Samsung, appliedVR, and SanDisk, to name a few.


What do design and innovation consulting firms do?

By now, we’ve talked about the top design thinking and innovation consulting firms. However, we know that the design thinking approach to innovation can be confusing for someone not involved in the design processes daily. Hence, here comes our easy-to-grasp definition of design thinking.

Why work with an innovation design consultancy?

Design thinking is a methodology used to solve complex problems and find innovative solutions. Design thinking companies aim to understand users’ pain points before developing a technically feasible solution that addresses their needs. However, some companies and individual entrepreneurs are reluctant to work with consultants and strategists. They claim there is no value in working with innovation and design consulting firms. They choose to rely on in-house resources to drive their product design processes. 

It takes time, energy, and resources to discover the issue and develop the right solution to a problem. For instance, it might take years for particular companies to optimize their internal processes, restructure in-house departments, implement new tools, and adopt a customer-centered approach. What if something fails? It would mean wasted resources, delayed progress, and lost opportunities. It’s an expensive alternative for any organization looking to create impactful digital products and maximize its revenue. This is where product design and innovation companies step in. 

How can design thinking and innovation firms help you?

Design thinking and innovation consulting companies help develop and execute ideas in the right way . They’re not necessarily the ones who will be designing and developing your products. Instead, they are your strategic partners and advisers with solid expertise in your industry. 

Best design thinking companies run workshops and training to teach businesses how to implement design solutions and software to modernize and maximize their performance. From global organizations to startups, expert innovation consulting firms help companies establish a strong online presence and develop corporate identity through internal transformation and adoption of the right methods, tools, and processes.

If someone has already figured out what you currently struggle with, does it make sense to keep struggling alone? Would you instead prefer learning from someone way ahead of you?

Our example

At UX studio, for instance, we offer UX consulting and training for our partners, apart from product design and research services. We help companies understand and measure the value of design investments and shape their future product design strategy. Our dedicated UX teams are not solely designers and researchers. They are strategic partners working closely with clients towards a common goal.

We know that the right design solution is essential for successful product performance. Hence, we stay long-term and share our expertise and learnings through workshops, training, and consultations to help our clients reach their business goals and level up their UX maturity. 

If you are looking for a design thinking company and product design and innovation consultants , contact us at UX studio . We are ready to discuss your challenges and come up with a customized solution tailored to your needs. 

How can companies improve their design thinking and foster innovation ?

Continuous innovation helps significantly improve our quality of life and find solutions to challenges that seemed impossible to solve in the past. Rising startups and established tech giants work daily to develop faster and better solutions that address our current and future needs. However, they also need to think about in-house innovation to optimize the available resources and keep up with the rising competition. 

The design thinking approach is one of the methods businesses can use to reduce the risk of building products that do not fit the market. According to research conducted by McKinsey and Forrester, companies that implemented design thinking:

  • Have higher revenues and stakeholder returns;
  • Get products to market faster and with lower costs.

Now, the following question comes: How can a company improve its design thinking and foster innovation? This question has already been asked and researched by various institutions, like the Parsons School of Design in New York. They conducted an exploratory study in 2015 and looked at the different forms of design thinking adoption in organizations. 

According to the report “ Parts Without a Whole? – The Current State of Design Thinking Practice in Organizations,” only 20% of companies reported learning about digital transformation and innovation through self-help literature and various tutorials. At the same time, half of the organizations taking part in the study said they learned about design thinking methodology from coaches and design and innovation consulting companies . 

Establishing improvements

If you want to run digital transformation and implement innovations in your organization, you can educate your team members and deal with the challenges in-house, or you can hire a design and innovation consulting firm .  

It might be advantageous for your organization to hire individual consultants or companies that have already implemented design thinking due to the following reasons:

  • A fresh perspective on conventional problems;
  • A team of consultants with different backgrounds;
  • Solid expertise in your industry.

If you are looking for a design thinking company , get in touch with UX studio . We are ready to help you with any challenges you might have. Our product design experts have solid experience in healthcare, fintech, SaaS, blockchain, education, and telecommunication industries, just to name a few. 

As a product design and consulting firm, we can help you adopt the latest UX practices, improve your product’s usability, help you shape your product design strategy, and measure the ROI on design investment. These are just a few services we continuously provide to our partners to help them level up their business performance. If you are unsure where to start with digital transformation and innovation, contact us and book a free consultation with our team . We will define the next steps to reach your goals together. 

How to introduce design thinking into large companies

According to the Harvard Business Review, successful design thinking transformation results in three outputs: superior solutions, lower risks and costs, and employee buy-in. To reach these outcomes, any company that has not yet adopted design thinking has to know the right methodology and processes. 

At UX studio, we are convinced that people and their needs should stand behind every business decision. Hence, we utilize the design-thinking approach to work on the clients’ challenges or manage in-house issues. 

If you want to introduce design thinking approach in your organization, we recommend using the traditional five-step design thinking model proposed by the Hasso-Plattner Institute of Design at Stanford. 


Step 1 – Empathize

Henry Ford is famously quoted as saying:

“If I had asked the public what they wanted, they would have said a faster horse.”

Thanks to Henry Ford and his creative approach to problem-solving, the car became an integral part of daily life for millions of people. To give an innovative solution to the conventional problem, Henry Ford started the process by gaining an empathic understanding of the issue. It is nowadays used as the first step of the design-thinking model.

The design thinking approach starts with in-depth research and an understanding of your target audience: their pain points, needs, wants, and motivations. The empathize phase also covers an analysis of your audience’s physical environment. This phase helps you feel and personally relate to your audience’s issues through empathy. 

Engaging directly with your audience helps you reveal why exactly they think and act in a certain way. 

Step 2 – Define

At this point, the team collects all the research insights, analyzes and synthesizes observations. This step is needed to gather all ideas in one place, prioritize them and define the most critical challenges to work on. 

If you do this phase in-house, it’s your responsibility to define the problems you’ll put your time and resources to work on. However, if you choose to partner with a product design consulting company , professional design and research experts will talk to your target audience and collect valuable insights. You should also expect them to analyze and prioritize the tasks and present the findings. This way, you can save your time and work on something else while a team of dedicated experts is crafting an actionable problem statement for you.

Step 3 – Ideate 

Identifying the right scope of problems means half the problem is already solved. At this phase, you focus on generating ideas to address the challenges. Don’t spend too much time looking for the winning idea. The ideation part is all about brainstorming, hearing all your team members’ voices, and collecting the broadest range of ideas to choose from. Later, you will be able to verify your ideas during the testing phase. 

Hasso-Plattner Institute of Design at Stanford advises using the following ideation techniques:

  • Combining your conscious and unconscious mind, and rational thoughts with imagination.
  • Sketching your ideas.
  • Mind mapping.
  • Bodystorming: using your body to simulate real-case scenarios to generate ideas.

Step 4 – Prototype

At this stage, you are supposed to create an inexpensive, so-called “low-fidelity prototype” to test your ideas with potential users. 

You will have doubts and questions regarding the prototype functionality and usage. For example, you might wonder if people understand its interface, whether they can navigate the tool easily, fulfill their needs and perform business actions. The answers to these questions will help you get closer to the final and optimal customer-oriented solution. 

A prototype can be anything a user can interact with. It can be a design draft, a tablet or a mobile device with a few screens, or a simple storyboard. A prototype will help you:

  • Empirically validate your ideas and assumptions;
  • Start a conversation with your potential users and find out their opinions;
  • Save your time and resources;
  • Test multiple ideas at once and choose the best idea to focus on.

Step 5 – Test

The last, but not least important phase in the design thinking process is testing your solution with potential users. It will help you learn how they use it and whether it solves their problems as anticipated. 

When you test a physical object, ideally, you would want people to hold it, interact with it and share their feedback. In the case of a digital user interface, you can use a tablet, a mobile, or a laptop device to test web and mobile applications. Testing is your chance to detect flaws in the prototype and iterate your solution before you start developing the actual product. 

Experienced design and innovation consulting firms can effectively walk you through all the processes. At UX studio, for instance, we create prototypes for our clients, recruit test participants from the target audience, and test and ideate the solution right after. 

If you choose to do testing yourself, remember the following essential tips:

  • Show but don’t tell. Don’t explain to participants what your prototype is about. They have to figure it out themselves. If they don’t get it, it’s a red flag and a sign for you to keep working on the user interface. 
  • Think-aloud protocol. Politely ask people to say out loud every thought and idea that pops up during the test. It will help you catch even the tiniest issues and refine the solution.
  • Compare. You can test several ideas and ask test participants to compare them. A comparison might reveal untapped opportunities which you did not know before. 

Searching for the design thinking and innovation consulting agency?

The future of any business lies in the hands of its users. If a company can’t meet its customer needs, it is doomed to fail. Flexible and forward-thinking companies implement the design thinking approach to minimize the risk of launching useless products, which, in turn, helps them save time and resources in the long run. 

If you are looking for a design and innovation consulting company , contact us . We’ve been helping companies worldwide create impactful design solutions for ten years. We’re ready to take your challenge and bring your company to the next level.

Let's talk



Research Design – Types, Methods and Examples


Definition:

Research design refers to the overall strategy or plan for conducting a research study. It outlines the methods and procedures that will be used to collect and analyze data, as well as the goals and objectives of the study. Research design is important because it guides the entire research process and ensures that the study is conducted in a systematic and rigorous manner.

Types of Research Design

Types of Research Design are as follows:

Descriptive Research Design

This type of research design is used to describe a phenomenon or situation. It involves collecting data through surveys, questionnaires, interviews, and observations. The aim of descriptive research is to provide an accurate and detailed portrayal of a particular group, event, or situation. It can be useful in identifying patterns, trends, and relationships in the data.

Correlational Research Design

Correlational research design is used to determine if there is a relationship between two or more variables. This type of research design involves collecting data from participants and analyzing the relationship between the variables using statistical methods. The aim of correlational research is to identify the strength and direction of the relationship between the variables.

Experimental Research Design

Experimental research design is used to investigate cause-and-effect relationships between variables. This type of research design involves manipulating one variable and measuring the effect on another variable. It usually involves randomly assigning participants to groups and manipulating an independent variable to determine its effect on a dependent variable. The aim of experimental research is to establish causality.

Quasi-experimental Research Design

Quasi-experimental research design is similar to experimental research design, but it lacks one or more of the features of a true experiment. For example, there may not be random assignment to groups or a control group. This type of research design is used when it is not feasible or ethical to conduct a true experiment.

Case Study Research Design

Case study research design is used to investigate a single case or a small number of cases in depth. It involves collecting data through various methods, such as interviews, observations, and document analysis. The aim of case study research is to provide an in-depth understanding of a particular case or situation.

Longitudinal Research Design

Longitudinal research design is used to study changes in a particular phenomenon over time. It involves collecting data at multiple time points and analyzing the changes that occur. The aim of longitudinal research is to provide insights into the development, growth, or decline of a particular phenomenon over time.

Structure of Research Design

The format of a research design typically includes the following sections:

  • Introduction: This section provides an overview of the research problem, the research questions, and the importance of the study. It also includes a brief literature review that summarizes previous research on the topic and identifies gaps in the existing knowledge.
  • Research Questions or Hypotheses: This section identifies the specific research questions or hypotheses that the study will address. These questions should be clear, specific, and testable.
  • Research Methods: This section describes the methods that will be used to collect and analyze data. It includes details about the study design, the sampling strategy, the data collection instruments, and the data analysis techniques.
  • Data Collection: This section describes how the data will be collected, including the sample size, data collection procedures, and any ethical considerations.
  • Data Analysis: This section describes how the data will be analyzed, including the statistical techniques that will be used to test the research questions or hypotheses.
  • Results: This section presents the findings of the study, including descriptive statistics and statistical tests.
  • Discussion and Conclusion: This section summarizes the key findings of the study, interprets the results, and discusses the implications of the findings. It also includes recommendations for future research.
  • References: This section lists the sources cited in the research design.

Example of Research Design

An example of a research design could be:

Research question: Does the use of social media affect the academic performance of high school students?

Research design:

  • Research approach: The research approach will be quantitative, as it involves collecting numerical data to test the hypothesis.
  • Research design: The design will be quasi-experimental, with a pretest-posttest control group design.
  • Sample: The sample will be 200 high school students from two schools, with 100 students in the experimental group and 100 students in the control group.
  • Data collection: The data will be collected through surveys administered to the students at the beginning and end of the academic year. The surveys will include questions about their social media usage and academic performance.
  • Data analysis: The data collected will be analyzed using statistical software. The mean scores of the experimental and control groups will be compared to determine whether there is a significant difference in academic performance between the two groups (a minimal analysis sketch follows this list).
  • Limitations: The limitations of the study will be acknowledged, including the fact that social media usage can vary greatly among individuals and that the study focuses on only two schools, which may not be representative of the entire population.
  • Ethical considerations: Ethical considerations will be taken into account, such as obtaining informed consent from the participants and ensuring their anonymity and confidentiality.
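A minimal sketch of the planned comparison is shown below, assuming short, invented gain scores (posttest minus pretest) for each group; in practice the full samples of 100 students per group would be used, and an ANCOVA controlling for pretest scores is a common alternative.

```python
# A minimal sketch of comparing mean gains between the two groups;
# the gain scores below are invented, shortened samples.
import numpy as np
from scipy import stats

experimental_gain = np.array([5, 3, 8, 2, 6, 4, 7, 1])
control_gain      = np.array([6, 8, 9, 7, 10, 6, 8, 9])

# Welch's independent-samples t-test does not assume equal variances.
t_stat, p_value = stats.ttest_ind(experimental_gain, control_gain, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```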

How to Write a Research Design

Writing a research design involves planning and outlining the methodology and approach that will be used to answer a research question or hypothesis. Here are some steps to help you write a research design:

  • Define the research question or hypothesis: Before beginning your research design, you should clearly define your research question or hypothesis. This will guide your research design and help you select appropriate methods.
  • Select a research design: There are many different research designs to choose from, including experimental, survey, case study, and qualitative designs. Choose a design that best fits your research question and objectives.
  • Develop a sampling plan: If your research involves collecting data from a sample, you will need to develop a sampling plan. This should outline how you will select participants and how many participants you will include.
  • Define variables: Clearly define the variables you will be measuring or manipulating in your study. This will help ensure that your results are meaningful and relevant to your research question.
  • Choose data collection methods: Decide on the data collection methods you will use to gather information. This may include surveys, interviews, observations, experiments, or secondary data sources.
  • Create a data analysis plan: Develop a plan for analyzing your data, including the statistical or qualitative techniques you will use.
  • Consider ethical concerns: Finally, be sure to consider any ethical concerns related to your research, such as participant confidentiality or potential harm.

When to Write a Research Design

Research design should be written before conducting any research study. It is an important planning phase that outlines the research methodology, data collection methods, and data analysis techniques that will be used to investigate a research question or problem. The research design helps to ensure that the research is conducted in a systematic and logical manner, and that the data collected is relevant and reliable.

Ideally, the research design should be developed as early as possible in the research process, before any data is collected. This allows the researcher to carefully consider the research question, identify the most appropriate research methodology, and plan the data collection and analysis procedures in advance. By doing so, the research can be conducted in a more efficient and effective manner, and the results are more likely to be valid and reliable.

Purpose of Research Design

The purpose of research design is to plan and structure a research study in a way that enables the researcher to achieve the desired research goals with accuracy, validity, and reliability. Research design is the blueprint or the framework for conducting a study that outlines the methods, procedures, techniques, and tools for data collection and analysis.

Some of the key purposes of research design include:

  • Providing a clear and concise plan of action for the research study.
  • Ensuring that the research is conducted ethically and with rigor.
  • Maximizing the accuracy and reliability of the research findings.
  • Minimizing the possibility of errors, biases, or confounding variables.
  • Ensuring that the research is feasible, practical, and cost-effective.
  • Determining the appropriate research methodology to answer the research question(s).
  • Identifying the sample size, sampling method, and data collection techniques.
  • Determining the data analysis method and statistical tests to be used.
  • Facilitating the replication of the study by other researchers.
  • Enhancing the validity and generalizability of the research findings.

Applications of Research Design

There are numerous applications of research design in various fields, some of which are:

  • Social sciences: In fields such as psychology, sociology, and anthropology, research design is used to investigate human behavior and social phenomena. Researchers use various research designs, such as experimental, quasi-experimental, and correlational designs, to study different aspects of social behavior.
  • Education: Research design is essential in the field of education to investigate the effectiveness of different teaching methods and learning strategies. Researchers use various designs, such as experimental, quasi-experimental, and case study designs, to understand how students learn and how to improve teaching practices.
  • Health sciences: In the health sciences, research design is used to investigate the causes, prevention, and treatment of diseases. Researchers use various designs, such as randomized controlled trials, cohort studies, and case-control studies, to study different aspects of health and healthcare.
  • Business: Research design is used in the field of business to investigate consumer behavior, marketing strategies, and the impact of different business practices. Researchers use various designs, such as survey research, experimental research, and case studies, to study different aspects of the business world.
  • Engineering: In the field of engineering, research design is used to investigate the development and implementation of new technologies. Researchers use various designs, such as experimental research and case studies, to study the effectiveness of new technologies and to identify areas for improvement.

Advantages of Research Design

Here are some advantages of research design:

  • Systematic and organized approach: A well-designed research plan ensures that the research is conducted in a systematic and organized manner, which makes it easier to manage and analyze the data.
  • Clear objectives: The research design helps to clarify the objectives of the study, which makes it easier to identify the variables that need to be measured and the methods that need to be used to collect and analyze data.
  • Minimizes bias: A well-designed research plan minimizes the chances of bias by ensuring that the data is collected and analyzed objectively, and that the results are not influenced by the researcher’s personal biases or preferences.
  • Efficient use of resources: A well-designed research plan helps to ensure that resources (time, money, and personnel) are used efficiently and effectively, by focusing on the most important variables and methods.
  • Replicability: A well-designed research plan makes it easier for other researchers to replicate the study, which enhances the credibility and reliability of the findings.
  • Validity: A well-designed research plan helps to ensure that the findings are valid, by ensuring that the methods used to collect and analyze data are appropriate for the research question.
  • Generalizability: A well-designed research plan helps to ensure that the findings can be generalized to other populations, settings, or situations, which increases the external validity of the study.

Research Design vs. Research Methodology

Research design is the specific plan for a single study: what will be measured, from whom data will be collected, and how it will be analyzed. Research methodology is broader, covering the principles and rationale behind the methods chosen; a design is one concrete application of a methodology to a particular research question.



User & Design Research

Towards creating user-focused product & service innovations.

Organizations of the future are the ones that innovate rapidly and continuously. As markets become inundated with new products and services, differentiation will come from innovation that addresses the fundamental whys for every user and customer segment. User and design research can help you prepare for that future.

Insights into what shapes users’ and customers’ experience empower us to delight them with relevant solutions along their journeys. By leading to those insights, research drives us to create products that are relevant, accessible, and applicable for the people we work with.

Insights generated through design research are realistic and tangible, and they go a long way toward leapfrogging you into innovation. Is your organization ready?

Why Take the Route of Primary Research?

1. Identify new opportunities

We can market to users what they articulate as their perceived needs, but market research cannot guide us in solving problems customers do not yet realize are solvable. In recent years, design practice has moved significantly upstream compared with traditional practice, which means you should engage a design research consultancy like ours before you delve into market research.

Design research can help identify opportunities ahead of the market through methods and toolsets oriented towards your user, putting us just one step away from pioneering your next big move.

2. Validate a concept/product idea

At a time when a plethora of applications inundates the app stores, the question is what clear differentiator could capture your users’ attention, and what might prevent your idea from becoming a commodity. To create and own an innovation, we need to gather user insights before we can understand its value or how to capitalize on it.

As design research practitioners, we put Why before What and that’s precisely the point.


Our Approach to User & Design Research

1. The user is always at the center

User research inherently puts the user at the core, orchestrating a user-centered vision in which unknowns are explored to find opportunities around what is not obvious at the moment.

2. An intent of creation is in place

Our practice is conducted in the context of creating something tangible, not just identifying a strategic direction. When there is an intent to create, design research leads to a creation or design.

3. The future is taken care of

By probing the future and making predictions, our practice helps you understand the present and past in order to create a hypothesis for the future.

4. Perceptions lead to solutions

User research is as much about perceptions as it is about facts; perceptions are so essential that user research is incomplete without gathering them.

5. Innovations are crafted with an insight

There is great potential in understanding the “why”. More often than not, the answer to “why” is articulated around the user. With the end goal of delighting users, we explore the question of “why” to understand their needs, wants, behaviours, and aspirations.

With User & Design Research, organizations are ushered into a long-term strategy oriented towards their future. The process, once embarked upon, leads to creation inspired by empathy.

Think Design’s Golden Grid: A Framework for Planning User and Design Research

Many a time, we come across a challenge that needs quick resolution and choose the route of prototyping and testing instead of researching the problem further. The field of design research is relatively new and can seem esoteric, so we often fail to use the appropriate methods to get the right answers at the right time. This leads to trivialization of research practice, disbelief in the subject, or resistance to practicing it within organizations.

At Think Design, we took the initiative to create a simple yet very powerful framework that design practitioners in our organization and outside can immediately use.

Think + See: Information / evidence / data

Methods that involve observation and analysis of information/data.

Outcome: Understanding or insights based on recorded or real-time information.

Methods: Audio/video analysis, document research, heatmap analysis, social network mapping, time-lapse video, trend analysis, usage analytics.

See + Probe: Perceptions / opinions / validations

Methods that involve observation, questioning, and/or testing with users or representative users.

Outcome: Understanding or insights based on what people said or demonstrated.

Methods: Card sorting, concurrent probing, contextual inquiry, dyads & triads, extreme user interviews, fly on the wall, focus groups, in-depth interviews, personal inventory, retrospective probing, unfocus group, user testing/validation, word-concept association.

Probe + Act: Experiences / design bases

Methods that involve immersing in context and experiencing the situation by probing.

Outcome: Understanding or insights based on the researcher’s own interpretation of context.

Methods: Brainstorming, business model canvas, ethnography, guided tour, participatory design, task analysis, visit survey.

Act + Think: Hypothesis / synthesis

Methods that involve exercising empathy by thinking and doing.

Outcome: Hypothesis or synthesis demonstrated through visual or verbal communication.

Methods: A day in the life, be your customer, bodystorming, customer segmentation, heuristic analysis, predict next year’s headline, prototyping, role reversal, simulation/modeling, try it yourself.

Using Appropriate Research Methods

This framework allows us to choose appropriate research methods based on our orientation, on-the-ground challenges, and operational concerns such as sample size and the time at hand.

Breaking down research methods into their simpler parts

We broke the research methods down into four quadrants formed by two intersecting axes: See and Act sit at the ends of one axis, and Probe and Think at the ends of the other.

Abstraction to multiple disciplines

Fundamentally, all the research methods we have so far, and any in the foreseeable future, are based on one of four approaches: observation; interviews and discussions; action-, experience- or performance-based methods; or synthesis, analysis and diagnosis. This is so fundamental that it applies to any discipline needing user or design research, not just UX or product development.

At Think Design, we recommend at least one research method taken from each quadrant in order to get a holistic understanding of the situation or problem at hand.
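As a rough illustration, the grid can be encoded as a simple lookup from quadrant to methods so that a plan always draws on all four quadrants. The dictionary below lists only a subset of the methods named above, and the random pick merely stands in for a practitioner's judgement; this is not Think Design's own tooling.

```python
# A minimal sketch of planning one method per Golden Grid quadrant;
# the method subsets and the selection logic are illustrative only.
import random

GOLDEN_GRID = {
    "Think + See": ["Audio/video analysis", "Document research",
                    "Trend analysis", "Usage analytics"],
    "See + Probe": ["Card sorting", "Contextual inquiry",
                    "In-depth interviews", "User testing"],
    "Probe + Act": ["Ethnography", "Guided tour",
                    "Participatory design", "Task analysis"],
    "Act + Think": ["Bodystorming", "Prototyping",
                    "Role reversal", "Simulation/modeling"],
}

def plan_research(seed: int = 0) -> dict:
    """Pick at least one method from each quadrant for a holistic plan."""
    rng = random.Random(seed)
    return {quadrant: rng.choice(methods) for quadrant, methods in GOLDEN_GRID.items()}

print(plan_research())
```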

While there are a host of research methods, understanding which ones to choose in which context relies on having a theoretical understanding of each method and experience in actually applying it.


U.S. Department of the Treasury

Treasury Sanctions Impede Russian Access to Battlefield Supplies and Target Revenue Generators

WASHINGTON — Today, the U.S. Department of the Treasury’s Office of Foreign Assets Control (OFAC) is taking action to further implement the commitments that G7 Leaders made on February 24, 2023 and May 19, 2023. The designations announced today by OFAC and the Department of State take measures to inhibit Russia’s access to products that support its military and war efforts; reduce Russia’s revenue from the metals and mining sector; undermine its future energy capabilities; degrade Russia’s access to the international financial system; and starve Russia of G7-produced technology needed for its technology, aerospace, and defense sectors. 

“Today’s actions represent another step in our efforts to constrain Russia’s military capabilities, its access to battlefield supplies, and its economic bottom line,” said Deputy Secretary of the Treasury Wally Adeyemo. “As long as Russia continues to wage its unprovoked and brutal war against Ukraine, we will impose sanctions to deprive Russia of the technology it needs and disrupt the Russian arms industry’s ability to resupply.”

DISRUPTING RUSSIA’S TECHNOLOGY SUPPLIERS, IMPORTERS, AND DEVELOPERS

Sanctions are just one part of the U.S. government’s efforts to stop Russia’s procurement of critical goods and technology. The United States is working with partner governments and the private sector to identify and disrupt evasion networks and the actions taken today complement these shared efforts. While cooperating with partners, Treasury will use all tools available, including sanctions, to prevent Russia from acquiring the sensitive technology it needs to continue its full-scale war against Ukraine.

In a May 19, 2023 Supplemental Alert , Treasury’s Financial Crimes Enforcement Network (FinCEN) and the U.S. Department of Commerce’s Bureau of Industry and Security (BIS) identified certain high priority items, primarily based on the Harmonized System (HS) code classification of components from Russian weapons systems recovered on the battlefield in Ukraine, to assist financial institutions in identifying suspicious transactions relating to possible export control evasion. Items described by these HS codes have been found in multiple Russian weapons systems used against Ukraine, including the Kalibr cruise missile, the Kh-101 cruise missile, and the Orlan-10 UAV. Many of the entities designated today have transferred certain of these high priority items to Russia-based end-users.

Russia’s Use of Kyrgyz Republic-based Entities to Acquire Dual-Use Technology

One of the most common tactics Russian entities have used to continue their importation of foreign-made electronics and technology is the use of third-party intermediaries and transshipment points outside of Russia. Entities based in the Kyrgyz Republic have been frequent exporters of controlled electronics components and other technology to Russia since Russia began its full-scale invasion of Ukraine. Some of these shipments have subsequently supplied sensitive dual-use goods to entities in Russia’s defense sector. 

LLC RM Design and Development ( RMDD ), established in March 2022, is a Kyrgyz Republic-based seller of electronic and telecommunication equipment and parts. Since its founding last year, RMDD has been a prolific shipper of dual-use goods to Russia, including to firms that have supplied electronics to Russia-based defense companies.  

RMDD has sent hundreds of shipments of goods, including semiconductor devices, electronic integrated circuits, and capacitors to the following Russia-based companies, among others:

  • Basis Trade Prosoft LLC  ( BTP ),   a supplier of industrial computers, components for automated process control systems, and radio-electronic components.
  • OOO Radiotekhsnab ( RTS ), an importer of electronic components and wholesaler of electronic and telecommunications equipment and parts.
  • Region-Prof LLC ( Region-Prof ), a supplier of automation equipment, electronic components, and hardware and software for building electronic equipment. 

RMDD, RTS, and Region-Prof were designated pursuant to Executive Order (E.O.) 14024 for operating or having operated in the electronics sector of the Russian Federation economy. BTP was designated pursuant to E.O. 14024 for operating or having operated in the technology sector of the Russian Federation economy.

Limited Liability Company Siaisi  ( CIC ) is a Russia-based company that primarily deals with electronic and optical equipment as well as computers and related equipment. CIC is owned by Russian Federation national Tatyana Grigoryevna Ivanova ( Ivanova ), who also serves as the general director of CIC. Ivanova is also the general director and owner of Kyrgyz Republic-based wholesaler OSOO Progress Lider  ( Progress Lider ), which was established in March 2022 and has made numerous shipments to CIC.

CIC and Ivanova were designated pursuant to E.O. 14024 for operating or having operated in the electronics sector of the Russian Federation economy. Progress Lider was designated pursuant to E.O. 14024 for having materially assisted, sponsored, or provided financial, material, or technological support for, or goods or services to or in support of, CIC.

ZAO GTME Tekhnologii ( GTME Tekhnologii ) is a Kyrgyz Republic-based entity established in June 2022. GTME Tekhnologii has made dozens of shipments of goods to Russia, including high priority items included in the FinCEN-BIS Supplemental Alert, such as tantalum capacitors and electronic integrated circuits. GTME Tekhnologii’s primary customer has been Russia-based Technologies Systems and Complexes Limited ( TSC ), a vendor of electronic and digital equipment. 

GTME Tekhnologii and TSC were designated pursuant to E.O. 14024 for operating or having operated in the technology sector of the Russian Federation economy. 

OSOO Kargolayn ( Cargoline ), founded in March 2022, is a Kyrgyz Republic-based entity that has shipped millions of dollars of foreign-made aviation equipment to Russia, including directly to airlines that are subject to U.S. export controls.  

Cargoline was designated pursuant to E.O. 14024 for operating or having operated in the aerospace sector of the Russian Federation economy.

Targeting Additional Sanctions Evasion Facilitators

Amegino FZE ( Amegino ) is a UAE-based engineering and services company that provides electronic components and related industry services. Amegino has sent dozens of shipments of electronics, including integrated circuits, to Russia since Russia launched its full-scale invasion of Ukraine. 

Amegino was designated pursuant to E.O. 14024 for operating or having operated in the technology and electronics sectors of the Russian Federation economy. 

Limited Liability Company AK Microtech ( AKM ) is a Russia-based firm that specializes in transferring foreign semiconductor technology to Russian microelectronics production companies, including entities that provide microelectronics to the Russian defense industry. A number of those end-users are on Treasury’s Specially Designated Nationals and Blocked Persons List as well as the Department of Commerce’s Entity List. 

AKM uses non-Russian intermediaries to obfuscate Russian recipients. One such intermediary is Serbia-based firm MCI Trading DOO Beograd Palilula  ( MCI ), which has helped AKM acquire high-tech items from producers in Asia, Europe, and the Middle East. MCI has also made dozens of shipments to AKM since Russia’s invasion of Ukraine began in February 2022. Serbia national Ivan Cvetic  ( Cvetic ) is the director of MCI.

AKM was designated pursuant to E.O. 14024 for operating or having operated in the electronics sector of the Russian Federation economy. MCI and Cvetic were designated pursuant to E.O. 14024 for having materially assisted, sponsored, or provided financial, material, or technological support for, or goods or services to or in support of, AKM.

Designations Targeting Russia-Based Importers of Dual-Use Items

OFAC continues to target Russia-based entities that import dual-use technology from abroad. The following Russia-based entities were designated today pursuant to E.O. 14024 for operating or having operated in the electronics sector of the Russian Federation economy:

  • AK Systems , a developer, manufacturer, and distributor of high-tech electronic devices;
  • LLC Altrabeta , a developer and producer of electronic equipment;
  • Joint Stock Company Compel , a supplier of components for electronics manufacturing;
  • Limited Liability Company Forepost Trading , an electronic components supplier and producer of electronic equipment;
  • LLC IQ Components , an electronic components supplier; 
  • Komponenta AO , an electronic components supplier and electronics manufacturing service provider;
  • LLC Onelek , an electronic components supplier; 
  • NPF-Radiotekhkomplekt AO , an electronic components supplier to research institutes and design bureaus;
  • Saturn EK OOO , an electronic components supplier and producer of electronic equipment;
  • LLC Spetselservis , an electronic components supplier; and
  • Staut Company Limited , a developer of electronic engineering devices, including for robotic systems.

TARGETING RUSSIA’S MUNITIONS FACTORIES AND HIGH-TECHNOLOGY INDUSTRIES THAT SUPPORT RUSSIA’S DEFENSE SECTOR

OFAC continues to target entities that directly support Russia’s war against Ukraine. The following Russia-based entities were designated pursuant to E.O. 14024 for operating or having operated in the defense and related materiel sector of the Russian Federation economy:

  • Aleksinskii Khimicheskii Kombinat ( Aleksinsky Chemical ) produces ammunition and weapons. Aleksinsky Chemical also produces polymers, paints, and composite materials for Russia’s military-industrial complex. Aleksinsky Chemical was also designated pursuant to E.O. 14024 for operating or having operated in the manufacturing sector of the Russian Federation economy.
  • Kazanskii Gosudarstvennyi Kazennyi Porokhovoi Zavod produces explosives, weapons, ammunition, small arms, and other defense items for the Government of the Russian Federation. 
  • Tambovskii Porokhovoi Zavod ( Tambov Gunpowder ) is one of Russia’s main producers of ammunition for artillery and small arms. Tambov Gunpowder also produces and sells explosives, including armor-piercing projectiles, to the Russian military. 
  • Joint Stock Company Tula Cartridge Works  manufactures small arms ammunition for Russia’s military.

In March 2022, OFAC designated Joint Stock Company Kronshtadt (Kronshtadt) pursuant to E.O. 14024 for operating or having operated in the defense and related materiel sector of the Russian Federation economy. Kronshtadt is a Russian defense contractor that develops and manufactures equipment, software, and integrated solutions for Russia’s unmanned aviation and defense industries and supports Russia’s Ministry of Defense. Today, OFAC is targeting two Russia-based entities that are working with Kronshtadt.

AO NPO Kurganpribor  ( Kurganpribor ) produces components for rocket systems, missiles, and bombs. Kurganpribor is likely working with Russian unmanned aerial vehicle (UAV) manufacturers to develop engines for UAV weapons programs.

Joint Stock Company Astrophysika National Centre of Laser Systems and Complexes  ( Astrofizika ) is a research and development center focused on laser and optical technologies, including for defense purposes. Astrofizika is working with Kronshtadt to develop a line of engines for UAVs.

Kurganpribor was designated pursuant to E.O. 14024 for operating or having operated in the defense and related materiel sector of the Russian Federation economy. Astrofizika was designated pursuant to E.O. 14024 for operating or having operated in the technology sector of the Russian Federation economy.

OFAC is also targeting entities in key industries such as aerospace, quantum technologies, and advanced computing that Russia exploits to support its defense industries. To deprive Russia of technology for its aerospace sector, the following Russia-based companies were designated pursuant to E.O. 14024 for operating or having operated in the aerospace sector of the Russian Federation economy:

  • Arsenal Machine Building Plant Open Joint Stock Company   ( MZ Arsenal ) manufactures military equipment and technology, as well as space technology. Specifically, MZ Arsenal develops and produces materiel for Russia’s navy, solid-fuel rocket engines, and ballistic missiles.
  • Joint Stock Company Experimental Design Bureau Fakel  produces products for aerospace purposes. 
  • M.V. Frunze Arsenal Design Bureau Joint Stock Company  is a military contractor that specializes in the development of space remote sensing systems.
  • Joint Stock Company Research and Production Corporation Precision Systems and Instruments  ( NPK SPP ) manufactures electronics for space complexes. NPK SPP won a contract from Russia’s Ministry of Defense to support a space surveillance system.
  • Open Joint Stock Company Russian Institute of Radionavigation and Time  develops aerospace systems, including for defense purposes.  
  • Joint Stock Company Science Research Institute for Precise Instruments  ( RIPI ) designs and manufactures radio engineering equipment and software-hardware complexes for the Russian aerospace industry. Additionally, RIPI has showcased space-related products and radar at a Russian military forum.  
  • Space Research Institute Russian Academy of Sciences  designs and tests equipment and systems for space research under the control of Russia’s Ministry of Defense. 
  • Joint Stock Company Special Research Bureau of Moscow Power Engineering Institute  produces aerospace industry products for missiles and aircraft.

Scientific Production Company Optolink  ( Optolink ) is a Russia-based producer of technological and electronic products, including optical fibers, high precision fiber optic gyroscopes, diodes, and transistors, used in aerospace systems. On December 8, 2022, Optolink was added to the Department of Commerce’s Entity List based on information that Optolink contributes to Russia’s military and/or defense industrial base. 

Optolink was designated pursuant to E.O. 14024 for operating or having operated in the technology and electronics sectors of the Russian Federation economy. 

OFAC is also targeting research institutes and other entities that support Russia’s research and development of high-technology goods. The following entities were designated pursuant to E.O. 14024 for operating or having operated in the technology sector of the Russian Federation economy:

  • The Budker Institute of Nuclear Physics of Siberian Branch Russian Academy of Sciences is one of Russia’s leading physics research centers and focuses on the development of new technologies. 
  • P.L. Kapitza Institute for Physical Problems, Russian Academy of Sciences is a Russia-based research institution primarily researching quantum fluids and superconductivity.
  • The Federal State Budgetary Institution of Science Federal Research Center Kazan Scientific Center of the Russian Academy of Sciences  ( FRC KAZSC RAS ) is a Russian Federal Research Center responsible for achieving results in the implementation of technological priorities in Russia, particularly in areas of strategic importance. FRC KAZSC RAS conducts research related to nanotechnologies and quantum informatics and is a leading center in the field of radio spectroscopy. 
  • The Osipyan Institute of Solid State Physics of the Russian Academy of Sciences ( ISSP ) is a Russia-based quantum research institute and it is involved in solving problems with high-tech applications. On September 30, 2022, the Department of Commerce added ISSP to the Entity List for acquiring and attempting to acquire U.S.-origin items in support of the Russian military.
  • A.M. Prokhorov General Physics Institute Russian Academy of Sciences is a Russia-based institute that focuses research on laser physics and optics, quantum electronics, microelectronics, and nanoelectronics.
  • Closed Joint Stock Company Superconducting Nanotechnology is a Russia-based company that specializes in the development, fabrication, and implementation of superconducting devices and which produces products which have applications for quantum computing.

The Institute of Laser Physics of the Siberian Branch of the Russian Academy of Sciences  ( Institute of Laser Physics ) is a federally financed institution owned by the Government of the Russian Federation. The Institute of Laser Physics is involved in the application of high-power lasers for scientific research and technology.

The Institute of Laser Physics   was designated   pursuant to E.O. 14024 for being owned or controlled by, or for having acted or purported to act for or on behalf of, directly or indirectly, the Government of the Russian Federation.

DEGRADING RUSSIA’S ACCESS TO THE INTERNATIONAL FINANCIAL SYSTEM 

Imposing sanctions against additional Russia-based financial institutions further degrades the Russian Federation’s ability to maintain access to the global financial system. The following five Russian banks were designated pursuant to E.O. 14024 for operating or having operated in the financial services sector of the Russian Federation economy:

  • Joint Stock Company Locko Bank , a commercial bank located in Moscow, Russia.
  • Joint Stock Company Petersburg Social Commercial Bank , a commercial bank located in Moscow and Saint Petersburg, Russia. 
  • Joint Stock Company Commercial Bank Solidarnost , a commercial bank located in Moscow, Russia and among the leading credit institutions in Russia’s Volga region. 
  • JSC Tinkoff Bank ( Tinkoff Bank ), a commercial bank in Moscow, Russia. Tinkoff Bank is partially owned by U.S.-designated Vladimir Olegovich Potanin. Tinkoff Bank was sanctioned by the European Union (EU) and the United Kingdom (UK) in February and May of 2023, respectively.  
  • Unistream Commercial Bank JSC , a money-transfer institution located in Moscow, Russia. 

FURTHER LIMITING RUSSIA’S REVENUE FROM EXTRACTIVE INDUSTRIES AND FUTURE CAPABILITIES

Today, OFAC is taking further action to limit Russia’s revenue from its metals industries and to limit Russia’s future energy capabilities in support of the G7 commitments. 

Reducing Russia’s Revenue from the Metals and Mining Sector

Joint Stock Company Ural Mining and Metallurgical Company ( UMMC ) is one of Russia’s top producers of metals such as copper, zinc, gold, and silver. 

UMMC Nonferrous Metals Processing Limited Liability Company ( UMMC NFMP ) is a Russia-based UMMC subsidiary involved in the non-ferrous metals processing industry that operates plants that manufacture copper, brass, bronze, copper-nickel, and nickel rolled products. 

Joint Stock Company Uralelektromed ( Uralelektromed ) is a Russia-based UMMC subsidiary involved in the refining of precious metals, cathodes, and bullion products. 

UMMC, UMMC NFMP, and Uralelektromed were designated pursuant to E.O. 14024 for operating or having operated in the metals and mining sector of the Russian Federation economy.

Targeting Russia’s Manufacturers of Equipment and Chemicals for the Energy Industry

The following Russia-based manufacturers of energy industry equipment were designated pursuant to E.O. 14024 for operating or having operated in the manufacturing sector of the Russian Federation economy:

  • Joint Stock Company Scientific Production Enterprise Research and Design Institute of Well Logging  designs special methods and technologies for geophysical surveys involving oil, gas, ore, and coal wells and is involved in manufacturing equipment for well logging.  
  • Limited Liability Company Proizvostvennaya Kommercheskaya Firma Gazneftemash manufactures equipment for the drilling of new oil and gas wells.
  • Joint Stock Company Gazprom Avtomatizatsiya manufactures gas distribution stations for Public Joint Stock Company Gazprom, an entity that is subject to Directive 4 Under E.O. 13662 and Directive 3 Under E.O. 14024. 
  • Joint Stock Company Neftegazavtomatika manufactures automation equipment for the oil and gas industries.
  • Limited Liability Company Oktanta manufactures drill piping inspection equipment.
  • Limited Liability Company Perm Oil Machine Company manufactures oilfield and drilling equipment. 
  • Limited Liability Company Rustmash manufactures oil drilling equipment.

The following Russia-based manufacturers of energy-related refining agents were designated pursuant to E.O. 14024 for operating or having operated in the manufacturing sector of the Russian Federation economy:

  • Limited Liability Company Ishimbay Specialized Chemical Plant of Catalyst manufactures chemicals.
  • Limited Liability Company KNT KAT manufactures catalysts for the oil and gas industries.  
  • Limited Liability Company RN KAT  ( RN KAT ) is a subsidiary of Open Joint-Stock Company Rosneft Oil Company (Rosneft), an entity that is subject to Directive and Directive 4 of E.O. 13662. RN KAT manufactures refining agents for Rosneft’s refineries.
  • Limited Liability Company Sterlitamak Catalyst Plant manufactures chemicals.

Limited Liability Company Tyumen Petroleum Research Center ( TPRC ) is Rosneft’s corporate research and design institute. TPRC, which is involved in technology development, performs field engineering and support for geological survey processes and is involved in the development of oil and gas fields in Russia and elsewhere for Rosneft subsidiaries.

TPRC was designated pursuant to E.O. 14024 for operating or having operated in the technology sector of the Russian Federation economy.

Targeting a Facilitator of Investment in Russia’s Extractive Industries

The Fund for Development of Energy Complex Energy ( Fund Energy ) is a Russia-based investment house that invests in energy, oil and gas, and mining enterprises and infrastructure facilities.

Fund Energy was designated pursuant to E.O. 14024 for operating or having operated in the financial services sector of the Russian Federation economy.

SANCTIONS IMPLICATIONS

As a result of today’s action, all property and interests in property of the persons above that are in the United States or in the possession or control of U.S. persons are blocked and must be reported to OFAC. In addition, any entities that are owned, directly or indirectly, 50 percent or more by one or more blocked persons are also blocked. All transactions by U.S. persons or within (or transiting) the United States that involve any property or interests in property of designated or blocked persons are prohibited unless exempt or authorized by a general or specific license issued by OFAC. These prohibitions include the making of any contribution or provision of funds, goods, or services by, to, or for the benefit of any blocked person and the receipt of any contribution or provision of funds, goods, or services from any such person. 

For identifying information on the individuals and entities sanctioned or property identified today, click here.  


Open access | Published: 05 June 2024

Companies inadvertently fund online misinformation despite consumer backlash

Wajeeha Ahmad, Ananya Sen, Charles Eesley & Erik Brynjolfsson

Nature, volume 630, pages 123–131 (2024)

Subjects: Interdisciplinary studies

The financial motivation to earn advertising revenue has been widely conjectured to be pivotal for the production of online misinformation 1 , 2 , 3 , 4 . Research aimed at mitigating misinformation has so far focused on interventions at the user level 5 , 6 , 7 , 8 , with little emphasis on how the supply of misinformation can itself be countered. Here we show how online misinformation is largely financed by advertising, examine how financing misinformation affects the companies involved, and outline interventions for reducing the financing of misinformation. First, we find that advertising on websites that publish misinformation is pervasive for companies across several industries and is amplified by digital advertising platforms that algorithmically distribute advertising across the web. Using an information-provision experiment 9 , we find that companies that advertise on websites that publish misinformation can face substantial backlash from their consumers. To examine why misinformation continues to be monetized despite the potential backlash for the advertisers involved, we survey decision-makers at companies. We find that most decision-makers are unaware that their companies’ advertising appears on misinformation websites but have a strong preference to avoid doing so. Moreover, those who are unaware and uncertain about their company’s role in financing misinformation increase their demand for a platform-based solution to reduce monetizing misinformation when informed about how platforms amplify advertising placement on misinformation websites. We identify low-cost, scalable information-based interventions to reduce the financial incentive to misinform and counter the supply of misinformation online.

The prevalence of online misinformation can have important social consequences, such as contributing to greater fatalities during the COVID-19 pandemic 10 , exacerbating the climate crisis 11 , and sowing political discord 12 . Yet the supply of misinformation is often financially motivated. The economic incentive to produce misinformation has been widely conjectured by academics and practitioners to be one of the main reasons websites that publish misinformation (hereafter referred to as ‘misinformation websites’ or ‘misinformation outlets’), masquerading as legitimate news outlets, continue to be prevalent online 1 , 2 , 3 , 4 . During the 2016 US Presidential election, one operator of a misinformation outlet openly stated “For me, this is all about income” 13 .

Media reports have anecdotally observed that companies and digital platforms contribute towards financially sustaining misinformation outlets via advertising 14 , 15 . Advertising companies can either place their advertisements directly on specific websites or use digital advertising platforms to distribute their advertisements across the internet ( Methods , ‘Background on digital advertising’). The vast majority of online display advertising today is done via digital advertising platforms that automatically distribute advertisements across millions of websites 16 , which may include misinformation outlets. According to a recent industry estimate, for every US$2.16 in digital advertising revenue sent to legitimate newspapers, US advertisers send US$1 to misinformation sites 17 .

Existing work to counter the proliferation of misinformation online has primarily focused on empowering news consumers 3 , 5 in order to reduce the demand for misinformation through interventions such as fact-checking news articles 6 , providing crowd-sourced labels 8 and nudging users to share more accurate content 7 . However, a vital question remains regarding how the incentive to produce or supply misinformation may be countered. Indeed, recently, academics have proposed ‘supply-side’ policies for steering platforms away from the revenue models that might contribute towards sustaining harmful content 18 . Digital platforms have also attempted to decrease advertising revenue going to some misinformation websites 19 . However, despite these attempts, advertising from well-known companies and organizations continues to appear on misinformation websites, thereby financing such outlets 20 , 21 . Moreover, the supply of misinformation is expected to increase with generative AI technologies making it easier to create large volumes of content to earn advertising revenue 22 , 23 .

In this Article, we attempt to provide a first step in understanding how to limit the financing of online misinformation via advertising using descriptive and experimental evidence. To tackle the problem of financing online misinformation, it is important to first understand the role of different entities within this ecosystem. In particular, we need to establish whether companies directly place advertisements on misinformation outlets or do so by automating such placement through digital advertising platforms. Although several mainstream digital platforms generate the vast majority of their revenue via advertising 3 , little is understood about the role of advertising-driven platforms in financing misinformation. To evaluate the relative roles of advertising companies and digital advertising platforms in monetizing misinformation, we construct unique large-scale datasets by combining data on websites publishing misinformation with advertising activity per website over a period of three years.

Next, the extent to which companies can be dissuaded from advertising on misinformation websites depends on how their customers respond to information about the prevalence of companies’ advertising on such websites. As people find out about companies advertising on misinformation websites through news and social media reports 20 , 24 , they may reduce their demand for such companies or voice concerns against such practices online 25 , 26 . Therefore, it is important to measure the preferences of the people who consume a company’s products or services regardless of whether these consumers visit misinformation websites themselves. To measure these effects, we conducted a survey experiment with a sample of the US population by randomly varying the pieces of factual information we provided to participants. By simultaneously measuring how people shift their consumption and the types of actors (that is, advertisers or digital advertising platforms) that they voice concerns about, we capture how peoples’ reactions change as the degree to which advertisers and advertising platforms are held responsible varies. We also study how consumer responses may vary depending on the intensity of a company’s advertising on misinformation websites by providing company rankings on this dimension.

Finally, whether decision-makers within companies are aware of their company’s advertisements appearing on misinformation outlets and prefer to avoid doing so can have an important role in curbing the financing of misinformation. In recent years, advertisers have often participated in boycotts of advertising-driven platforms such as YouTube, Facebook and Twitter for placing their advertisements next to problematic content 27 , 28 . However, there is little systematic measurement of the knowledge and preferences of key decision-makers within companies in this context. To address this gap, we surveyed executives and managers by contacting the alumni of executive education programmes. Moreover, we conducted an information-provision experiment to examine whether decision-makers would increase their demand for a platform-based solution to avoid advertising on misinformation outlets when informed about the role of digital advertising platforms in monetizing misinformation.

We report three sets of findings from our descriptive and experimental analyses. First, our descriptive analysis suggests that misinformation websites are primarily monetized via advertising revenue, with a substantial proportion of companies across several industries appearing on such websites. We further show that the use of digital advertising platforms amplifies the financing of misinformation. Second, we find that people switch consumption away from companies whose advertising appears on misinformation outlets, reducing the demand for such companies. This switching effect persists even when consumers are informed about the role of digital advertising platforms in placing companies’ advertisements on misinformation websites and the role of other advertising companies in financing misinformation. Third, our survey of decision-makers suggests that most of them are ill-informed about the roles of their own company and the digital advertising platforms that they use in financing misinformation outlets. However, decision-makers report a high demand for information on whether their advertisements appeared on misinformation outlets and solutions to avoid doing so. Those who were uncertain and unaware about where their advertising appeared also increased their demand for a platform-based solution to reduce advertising on misinformation websites upon learning how platforms amplify advertising on such websites.

In sum, our results indicate that there is room to decrease the financing of misinformation using two low-cost, scalable interventions. First, improving transparency for advertisers about where their advertisements appear could by itself reduce advertising on misinformation websites, especially among companies who were previously unaware of their advertisements appearing on such outlets and were thus inadvertently financing misinformation. Second, although it is currently possible for consumers to find out about advertising companies financing misinformation through news and social media, platforms could make advertising on misinformation outlets more easily and continuously traceable to the advertising companies involved for consumers. Our results suggest that both simple information disclosures and comparative company rankings can reduce consumer demand away from companies advertising on misinformation websites.

We build on prior work analysing the ecosystem supporting misinformation websites 29 , 30 , 31 , 32 , 33 and programmatic advertising 34 by matching millions of instances of advertising companies appearing across thousands of news outlets with data on misinformation websites, thereby providing large-scale evidence of the ecosystem that sustains online misinformation over a consistent period of three years. Additionally, we present descriptive evidence about the relative roles of advertising companies and digital advertising platforms in financing misinformation. Next, our information-provision experiments examine the effects of advertising on misinformation websites for companies and platforms. Previous work has examined the conditions under which people react against companies for failing to operate up to their expectations—for example, due to service quality deterioration 26 , not fulfilling social responsibilities 35 , advertising next to violent content 36 , or taking a political stance 37 , 38 . Our research design contributes to this literature in two key ways by: (1) measuring both types of potential consumer responses—that is, ‘exit’ and ‘voice’—that are theorized in the literature 25 ; and (2) doing so using incentive-compatible behavioural outcomes at the individual level, which enables us to capture costly decisions people make and move beyond stated preferences recorded in related experimental research 36 , 39 . More broadly, our research suggests an alternative approach to countering misinformation online by suggesting how the monetization of misinformation could be curbed using information interventions. Our study complements and extends prior work on using disclosures 40 , 41 and interventions to counter misinformation 5 , 7 by showing that disclosures about companies advertising on misinformation outlets can shift consumption away from such companies, ultimately incentivizing companies to reduce the financing of misinformation via advertising.

Collection of website and advertising data

To categorize whether a website contains misinformation, we compiled a list of misinformation domains using three different sources: NewsGuard, the Global Disinformation Index (GDI) and websites used in prior work (see  Methods , ‘Collecting website data’). NewsGuard and the GDI use automated and manual methods to source and evaluate websites, but each website is rated manually by expert professionals who apply journalistic standards to evaluate online news outlets in a non-partisan and transparent manner.

We collected data on advertiser behaviour from 2019 to 2021 via Oracle’s Moat Pro platform, which includes data collected by ‘crawling’ approximately 10,000 websites daily to create a snapshot of the advertising landscape. Moat’s web crawlers mirror a normal user experience and attempt to visit a representative sample of pages for each website at least once a day. To the best of our knowledge, these data are the gold standard used by many industry stakeholders for competitive analysis. For all the websites in our sample that get non-zero traffic throughout this period and have advertising data available on the Moat Pro platform, we collected monthly data on the advertising companies appearing on each website and digital advertising platforms used by each website.

Our final dataset, which contains data on advertising and misinformation, consists of 5,485 websites (including 1,276 misinformation websites and 4,209 non-misinformation websites) and 42,595 unique advertisers with 9,539,847 instances of advertising companies appearing on news websites between 2019 and 2021. Additionally, for the most active 100 advertisers each year, as identified by Moat Pro, we collected weekly data on the websites that they appeared on and the digital advertising platforms that they used.
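As a rough illustration of how such a panel might be assembled, the sketch below joins invented advertising observations with invented misinformation labels; the column names, domains, and values are hypothetical and do not reflect the Moat, NewsGuard, or GDI data formats.

```python
# A minimal sketch of matching advertising observations to website labels;
# all names and values below are invented for illustration.
import pandas as pd

# One row per (advertiser, website, month) advertising appearance.
ads = pd.DataFrame({
    "advertiser": ["BrandA", "BrandA", "BrandB"],
    "website":    ["site1.example", "site2.example", "site1.example"],
    "month":      ["2019-01", "2019-01", "2019-02"],
})

# One row per rated website, with a misinformation label.
labels = pd.DataFrame({
    "website":        ["site1.example", "site2.example"],
    "misinformation": [True, False],
})

panel = ads.merge(labels, on="website", how="inner")
share_on_misinfo = panel.groupby("advertiser")["misinformation"].mean()
print(share_on_misinfo)  # per-advertiser share of appearances on labelled misinformation sites
```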

Descriptive analysis

Of the websites in our sample, 89.3% were supported by advertising revenue between 2019 and 2021, and the majority of misinformation websites (74.5%) were monetized by advertising during this period. Moreover, among websites rated by NewsGuard, a much smaller percentage of misinformation websites had a paywall (2.7% in the USA and 3.2% globally) relative to non-misinformation websites (25.0% in the USA and 24.0% globally), which indicates a greater reliance on advertising for financing relative to other subscription-based business models among misinformation websites. Although different entities may have specific ideological or financial motivations for propagating online misinformation, data from NewsGuard-rated websites (see Supplementary Table 3 ) shows that relative to non-misinformation websites, misinformation websites were also more likely to be operated by individuals as opposed to corporate, non-profit or government entities. Given that advertising appears to be the dominant business model that sustains misinformation outlets, it merits a closer look. We find that companies that advertise on misinformation websites span a wide range of industries (Supplementary Table 4 ) and account for 46% to 82% of overall companies in each industry (Fig. 1a ). These include several well-known brands among commonly used household products, technology products and business services, as well as finance, health, government and educational institutions among other industries. Further, the intensity of advertising on misinformation sites is similar (mean = 1.01, 95% confidence interval [0.945, 1.074], t (22) = 0.311, P  = 0.759 from one-sample t -test, n  = 23) to that on non-misinformation sites for companies across several industries (Fig. 1b ).

Figure 1

From 2019 to 2021, we recorded the number of times companies in a given industry appeared on the 5,485 websites in our sample per month. Our final sample of advertisers consists of 42,595 companies and 9,539,847 instances of companies advertising on the websites in our sample. We removed industries where the number of advertising appearances by all companies combined was below the 5th percentile of the total number of advertising appearances, resulting in a total of 23 industries. a, The proportion of companies in each industry that appear on misinformation websites at least once in our sample. b, The advertising intensity on misinformation sites relative to non-misinformation websites for each industry. This is calculated by taking the proportion of all advertising appearances on misinformation websites that come from companies in that industry and dividing it by the corresponding proportion for non-misinformation websites. Therefore, values lower than 1 indicate less, values close to 1 indicate similar and values higher than 1 indicate greater advertising intensity on misinformation sites relative to non-misinformation websites.
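
As an illustration of the calculation in panel b, the following is a minimal pandas sketch, assuming a long-format table with one row per advertiser-website-month appearance count; the file and column names (ad_appearances.csv, industry, is_misinfo, appearances) are placeholders rather than the study's actual data schema.

```python
import pandas as pd

# Placeholder schema: one row per (advertiser, website, month), with the
# advertiser's industry, whether the website is a misinformation site and
# the number of advertising appearances recorded that month.
df = pd.read_csv("ad_appearances.csv")

# Total advertising appearances on misinformation and non-misinformation sites.
totals = df.groupby("is_misinfo")["appearances"].sum()

# Share of each industry among all appearances, separately by website type.
shares = (
    df.groupby(["is_misinfo", "industry"])["appearances"].sum()
      .div(totals, level="is_misinfo")
      .unstack("is_misinfo")
)

# Intensity ratio (Fig. 1b): industry share on misinformation sites divided by
# the corresponding share on non-misinformation sites; values near 1 = similar.
intensity_ratio = shares[True] / shares[False]
print(intensity_ratio.sort_values(ascending=False))
```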


Next, we examined the role of digital advertising platforms in financing misinformation. For the 100 most active advertisers in each year, we collected weekly data on the websites their advertisements appeared on and their use of digital advertising platforms. On average, about 79.8% of advertisers that used digital advertising platforms in a given week appeared on misinformation websites that week. In contrast, among companies that did not use digital advertising platforms in a given week, only 7.74% appeared on misinformation websites on average (two-sided t-test, t(192.12) = 93.903, P < 0.001, n = 144). In other words, companies that used digital advertising platforms were approximately ten times more likely to appear on misinformation websites than companies that did not. Moreover, after accounting for industry and time trends, we find that the use of digital advertising platforms substantially amplifies the likelihood of a company’s advertising appearing on misinformation websites (see Extended Data Table 1).
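
The weekly comparison above can be reproduced in outline with a two-sample Welch t-test; a minimal sketch, using simulated placeholder series rather than the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder weekly series over 2019-2021 (n = 144 weeks): the share of the
# 100 most active advertisers appearing on misinformation websites, split by
# whether they used digital advertising platforms that week. Simulated values
# centred on the shares reported in the text, for illustration only.
share_platform_users = rng.normal(loc=0.798, scale=0.05, size=144)
share_non_users = rng.normal(loc=0.077, scale=0.03, size=144)

# Two-sided Welch's t-test (unequal variances), the same family of test as the
# reported t(192.12) = 93.9, P < 0.001 comparison.
t_stat, p_value = stats.ttest_ind(share_platform_users, share_non_users,
                                  equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```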

Effects of advertising on misinformation

Next, our survey experiment aimed to determine potential changes in consumer behaviour based on experimentally varied information about the roles of companies and platforms in financing misinformation via advertising. Using the framework of Hirschman 25, we measured, in an incentive-compatible manner, how people (1) exit (that is, decrease their consumption) and (2) voice concerns about company or platform practices via online petitions in response to the information provided.

Average treatment effects

As detailed in Methods, ‘Consumer experiment design’, participants in our experiment were offered a gift card from a company of their choice. Our primary pre-registered outcome is whether respondents exit by switching their top gift card choice after receiving an information treatment, which takes the value one for people who switch and the value zero for all other participants (n = 4,039). To observe exit outcomes, we focus on company-related information treatments (T1, T3 and T4), in which respondents are informed that advertisements from their top choice of gift card company recently appeared on misinformation websites. Table 1, column 1 shows that respondents are more likely to exit their first choice company (that is, switch away from it or decrease their demand for it) relative to control (b = 0.13, 95% confidence interval [0.10, 0.16], P < 0.001) in response to learning about their top choice gift card company’s advertisements appearing on misinformation websites (T1). This effect persists (b = 0.13, 95% confidence interval [0.10, 0.16], P < 0.001; Table 1, column 2) when we control for participants’ demographic and behavioural characteristics in our preferred specification, which enables more precise estimates (see Supplementary Information, ‘Analysis: consumer study outcomes’). We also use text analysis of the responses to a free-form question, which helps to identify the effect of the information intervention more directly. Respondents’ text responses explaining their choice of the gift card reveal that misinformation concerns drive this switching behaviour (Extended Data Fig. 1a).
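
A sketch of how the exit regression with controls could be specified, assuming a linear probability model estimated by OLS with robust standard errors; the variable names and control set are illustrative, and the authors' exact specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder survey extract: 'switched' is 1 if the final gift card differs
# from the pre-treatment top choice; 'treatment' has levels control, T1-T4.
survey = pd.read_csv("consumer_survey.csv")

# Linear probability model of switching on treatment indicators plus
# demographic and behavioural controls, with HC1 robust standard errors.
model = smf.ols(
    "switched ~ C(treatment, Treatment(reference='control'))"
    " + age + C(gender) + C(education) + media_trust + petitions_signed",
    data=survey,
).fit(cov_type="HC1")
print(model.summary())
```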

Switching behaviour also increases relative to the control group (b = 0.10, 95% confidence interval [0.07, 0.13], P < 0.001) when respondents are told about the substantial role of digital advertising platforms in placing companies’ advertisements on misinformation websites (T3). This switching behaviour persists even though respondents are more likely to state that digital advertising platforms are responsible for placing companies’ advertisements on misinformation websites by four percentage points relative to the control group (b = 0.04, 95% confidence interval [0.02, 0.06], P < 0.001, Extended Data Fig. 1b). This suggests that advertising companies can continue to experience a decline in demand for their products or services despite consumers knowing that digital advertising platforms have a substantial role in placing companies’ advertisements on misinformation websites.

When provided with a ranking of companies in order of their intensity of appearance on misinformation websites (T4), respondents switch away from their top choice gift card company (b = 0.08, 95% confidence interval [0.05, 0.11], P < 0.001). This result shows that advertising companies can expect a decrease in consumption for financing misinformation even when other companies also advertise on misinformation outlets. Respondents are also less likely to mention product features relevant to the companies they are interested in, for example, healthy food, good prices and availability in the local area (b = −0.07, 95% confidence interval [−0.09, −0.05], P < 0.001, Extended Data Fig. 1a). Examining the direction of consumer switching shows that, among those who switch their gift card preference (n = 430), those provided with company-ranking information in T4 made the most switches towards companies that less frequently advertised on misinformation websites (b = 0.95, 95% confidence interval [0.19, 1.71], P = 0.015). This result suggests that transparently providing a ranking of advertising companies could steer consumer demand towards companies that advertise less frequently on misinformation websites.

Our results are robust to alternative exit outcomes that include whether participants switch to a product they prefer less than their first choice (Table 1, columns 3 and 4) and whether they switch their choice across product categories (Table 1, columns 5 and 6), further indicating that participants incur a real cost of switching to a company that is not equivalent to their top-ranked one. Although our platform-related information treatment (T2) does not explicitly mention the respondents’ first choice gift card company (as in T1, T3 and T4) or its specific use of digital advertising platforms (as in T3), we observe a small amount of switching in T2 relative to the control group (b = 0.03, 95% confidence interval [0.01, 0.05], P = 0.012). This could be because respondents might partially blame their first choice gift card company as it could be top of mind for them 42 or assume that the information provided in T2 alluded to the company they had just chosen 43. It is important to note that the other outcomes reported in Table 1 (switching to lower-preference gift cards and switching across categories) are not statistically significant for T2, which suggests that T2 does not produce treatment effects similar to our other treatments. Overall, we find that companies whose advertisements appear on misinformation websites can face substantial consumer backlash in terms of both exit and voice. Consumers who switched their gift card choice as a result of our information treatments lost, on average, about 39.4% of the mean value and 42.9% of the median value of their gift card. Given that the value of the gift card is US$25, a 39.4% decline in the mean value translates to treated consumers losing an equivalent of US$9.85. The distribution of weights assigned to the initial top gift card choice and the final selection is shown in Extended Data Fig. 2, which illustrates a substantial leftward shift in the weight distribution when individuals switch away from their top choice. We also find suggestive evidence for vast differences between consumers’ stated and revealed preferences, as shown in Supplementary Fig. 3. When compared to prior research, our 13 percentage point decline in demand is similar in magnitude to the demand reduction observed from receiving negative product feedback 44 and exceeds the magnitude of previously measured changes in demand associated with companies taking a social or political stance 37, 38.
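
The implied dollar losses among switchers follow directly from the reported shares and the US$25 gift card value; a small arithmetic check (the median figure is implied by the text rather than stated in it):

```python
GIFT_CARD_VALUE = 25.0        # US$ value of the gift card offered
MEAN_VALUE_LOSS = 0.394       # mean share of value lost among switchers
MEDIAN_VALUE_LOSS = 0.429     # median share of value lost among switchers

print(f"Implied mean loss:   ${GIFT_CARD_VALUE * MEAN_VALUE_LOSS:.2f}")    # $9.85
print(f"Implied median loss: ${GIFT_CARD_VALUE * MEDIAN_VALUE_LOSS:.2f}")  # about $10.7
```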

Next, we examine the effects of the information interventions on our pre-registered voice outcomes captured by individuals signing an online petition to voice concerns about advertising on misinformation websites. Participants were given the option to sign one of four different petitions on Change.org (https://www.change.org/): two company-level petitions advocating that companies in general should block or allow their advertisements from appearing on misinformation outlets, and two similar platform-level petitions. Although we observe petition signatures at the group level, we use clicks on petition links as our primary voice outcome since this information is available at the individual level and most closely matches the proportions of actual signatures (Extended Data Fig. 3). Our results are robust to using alternative petition outcomes, such as intention to sign a petition, self-reported petition signatures and actual signatures (Extended Data Table 2). Of note, we do not analyse actual signatures for the T4 group since Change.org accidentally deleted these petitions after they were recorded.

Relative to the control group, participants were significantly more likely, by 5 percentage points (36%), to click on the platform petition link when given information about the role of digital advertising platforms in automatically placing advertisements on misinformation websites in the platform (T2) treatment group (Table 2, columns 3 and 4). Text analysis of respondents’ explanations of their petition choice confirms that respondents hold digital advertising platforms more responsible for financing misinformation in T2 relative to the control group (b = 0.02, 95% confidence interval [0.01, 0.04], P = 0.012, Extended Data Fig. 1b). For example, one respondent who opted for the platform-blocking petition explained their choice by stating that the platform option “involves more than one company.” Another stated that their chosen gift card company is “not the only ad being put on misinformation sites. It is a larger issue that has to do with the platforms used to place ads.” Indeed, signing these petitions is the only way that participants can take any action to hold advertising platforms responsible in response to T2, which explicitly highlights the role of platforms.

Upon receiving information about all six gift card companies’ advertisements appearing on misinformation websites (T4), participants were significantly more likely to click on petition links suggesting that advertising companies need to block their advertisements from appearing on misinformation websites (Table 2, columns 3 and 4). Based on their open-ended text responses (Extended Data Fig. 1a), respondents increasingly highlighted misinformation-related concerns (b = 0.09, 95% confidence interval [0.07, 0.11], P < 0.001) and placed less emphasis on product usage (b = −0.05, 95% confidence interval [−0.07, −0.03], P < 0.001) and product features (b = −0.07, 95% confidence interval [−0.09, −0.05], P < 0.001). In T4, the treatment intensity with respect to companies in general is stronger than in T1 and T3, since we highlighted that all six gift card companies advertise on misinformation websites (at varying levels). This greater treatment intensity could explain the larger treatment effect for T4 relative to the null effects for company petitions in the other treatment arms, which only mentioned the respondents’ top choice gift card company.

Heterogeneous treatment effects

Next, we explore heterogeneity in treatment effects along four pre-registered dimensions (gender, political orientation, frequency of use of the company’s products or services, and consumption of misinformation) based on our hypotheses (see Methods, ‘Consumer experiment design’). Focusing on exit (Extended Data Table 3, columns 1–4), we observe positive treatment effects for all groups: male and female, Biden voters and Trump voters, frequent and infrequent users of a company’s products or services, and those who report consuming news from misinformation outlets in our survey and those who do not. As reported in Extended Data Table 3, in line with our predictions, we find stronger treatment effects for exit among women (b = 0.05, P = 0.011) and Biden voters (b = 0.03, P = 0.058) and weaker treatment effects for frequent users (b = −0.05, P = 0.007) and those who consume news from select popular misinformation outlets (b = −0.04, P = 0.097). Respondents who voted for President Biden in the 2020 US Presidential election were also 5 percentage points more likely to voice concerns against company practices (P = 0.04; Extended Data Table 3, column 6). Overall, we believe these heterogeneity results bolster the external validity of our experimental estimates. In particular, we highlight that product-specific factors such as frequency of use can have an important role in the decision to switch or not, separately from ideological reasons such as political leaning.
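
This kind of heterogeneity analysis can be sketched as treatment-by-moderator interactions in the same linear probability framework; a minimal sketch, assuming a pooled 'treated' dummy and 0/1 indicators (female, biden_voter, frequent_user, misinfo_consumer) that are not the authors' actual variable names.

```python
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("consumer_survey.csv")  # placeholder file name

# 'treated' pools the company-related treatments (T1, T3, T4) against control;
# the moderators are 0/1 indicators for the four pre-registered dimensions.
het_model = smf.ols(
    "switched ~ treated * (female + biden_voter + frequent_user"
    " + misinfo_consumer)",
    data=survey,
).fit(cov_type="HC1")

# The coefficients on the interaction terms correspond to the differential
# treatment effects for each subgroup.
print(het_model.params.filter(like=":"))
```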

Measuring decision-maker preferences

Given that advertising on misinformation websites is pervasive and could provoke consumer backlash, we next examine what explains the prevalence of this phenomenon among companies. To shed light on this question, we surveyed key strategic decision-makers, such as executives and managers at companies, by partnering with the executive education programmes at two universities to reach their alumni. In collaboration with our partner organizations, we also verified the job titles of the majority (71%) of our respondents using external sources, which are shown in Extended Data Fig. 4. About 94% of the participants whose job titles we were able to verify served in a top executive or managerial role at the time of our survey (for example, chief executive, general or operations manager of multiple departments or locations, advertising or sales manager, or operations manager), and the remainder were individuals who could influence decision-making within their companies, especially given their interest in learning leadership and managerial skills via executive education programmes.

Baseline beliefs and preferences

We found a wide dispersion in decision-makers’ pre-registered beliefs about the role of companies and platforms in financing misinformation, as shown in Supplementary Figs. 6 and 7, which complements prior work showing wide dispersion in decision-makers’ beliefs in other settings 45, 46. Decision-makers largely overestimate the overall proportion of companies that advertise on misinformation websites and underestimate the role of digital advertising platforms in placing companies’ advertisements on misinformation websites. In particular, respondents estimated that about 64% of companies’ advertisements appeared on misinformation websites on average (Supplementary Table 12). However, our data show that 55% of the 100 most active advertisers appeared on misinformation websites. Regarding the role of digital advertising platforms, respondents estimated that around 44.5% of companies using digital advertising platforms appear on misinformation websites (Supplementary Table 12), whereas 79.8% of companies among the 100 most active advertisers in fact do so. Moreover, only 41% of decision-makers believed that consumers react against companies whose advertisements appear on misinformation websites. These results suggest that decision-makers believe that advertising on misinformation websites is probably commonplace but has little to do with using digital advertising platforms and has limited consequences for the companies involved.

However, in contrast to the average belief that most companies advertised on misinformation websites, respondents substantially underestimated their own company’s likelihood of appearing on misinformation websites. Only 20% of respondents believed that their own company’s advertisements recently appeared on misinformation websites, which indicates the presence of a false uniqueness effect among decision-makers 47. We further segmented our results by type of role within the company (Extended Data Table 4). Although our sub-samples were small, these baseline beliefs and characteristics were largely similar across various roles. Among participants who expressed an interest in learning about whether their company’s advertisements appeared on misinformation websites (that is, requested an advertisement check by providing their company name and contact details) and whose companies appeared in our advertising data, approximately 81% of companies appeared on misinformation websites. Moreover, most respondents who were given follow-up information that their companies’ advertisements appeared on misinformation websites reported being surprised by this information (62%), whereas none of those who learned that their companies’ advertisements did not appear on misinformation websites reported being surprised. These figures illustrate that decision-makers are largely uninformed about the high likelihood of their company’s advertisements appearing on misinformation websites. Given these findings about the beliefs of decision-makers, our results suggest that companies may be financing misinformation inadvertently.

Most participants (74%) requested an advertisement check by providing their company name and email address. The demand for an advertisement check was high regardless of respondents’ initial beliefs, suggesting a substantial interest in learning whether their company’s advertisements appeared on misinformation websites. Despite only 41% of respondents agreeing that consumers react against companies whose advertisements appear on misinformation websites, most participants (73%) opted to receive information on how consumers respond to such companies, with 58% inquiring about exit and 15% inquiring about voice. This suggests that although decision-makers may be unaware of how advertising on misinformation websites can provoke consumer backlash, most of them are interested in learning about the degree of potential backlash. Finally, for our most costly revealed-preference measure, signing up to attend a 15-minute expert-led information session on how companies can avoid advertising on misinformation websites, 18% of decision-makers opted to sign up, an arguably high rate given the value of decision-makers’ time and the opportunity cost of attending the session.

Information intervention results

We report the results of our information treatment on our pre-registered outcomes. For the full sample of participants, we estimate positive and statistically significant effects on participants’ posterior beliefs about the role of advertising platforms in placing advertisements on misinformation websites (Table 3, column 1), driven mainly by respondents who believe that their company’s advertisements had not appeared on misinformation websites in the recent past (Table 3, column 3).

We find an overall null effect of our information treatment on participants’ demand for a platform-based solution, as measured by their demand for information on which platforms least frequently place companies’ advertisements on misinformation websites (Table 3, columns 4–6). However, this result masks substantial heterogeneity based on participants’ prior beliefs. Since our information treatment changes beliefs for the subset of participants who believe that their company’s advertisements had not recently appeared on misinformation websites (Table 3, column 3), we further investigated and reported results based on participants’ prior beliefs for this sub-sample in Table 4. Only participants who were uncertain and unaware about their own company’s advertisements appearing on misinformation websites responded positively and significantly to our information treatment, increasing their demand for a platform-based solution by 36 percentage points (b = 0.36, 95% confidence interval [0.11, 0.61], P = 0.008, n = 68), as shown in Table 4, column 4. Our results imply that the way in which participants respond to information about the role of digital advertising platforms in financing misinformation is highly dependent on their prior beliefs about their own company. Such information could make companies switch advertising platforms or pressure the platforms they currently use to make it easier to steer their advertising away from misinformation outlets. This finding is in line with prior work documenting inattention in decision-makers’ behaviour across various settings 48, 49, 50. However, these results should be viewed as suggestive and exploratory since the subsample sizes in these regressions are small and these sample splits were not pre-registered.

We did not find meaningful treatment effects for our donation preference outcome, which measures the proportion of respondents who prefer that we donate to the GDI instead of DataKind (Supplementary Table 13). Since both GDI and DataKind have similar goals of advancing technology’s ethical and responsible use, respondents may have considered their missions interchangeable. Moreover, unlike our first behavioural outcome, respondents could have considered donating to the GDI less relevant to their own organizations’ needs and more a matter of personal preference.

Together, our descriptive and experimental findings offer clear, practical implications. Given the potential for a substantial decline in demand, as demonstrated by our consumer study, advertising companies may wish to account for consumer preferences in placing their advertising across various online outlets and exercise caution while incorporating automation in their business processes via digital advertising platforms. For instance, given that consumers switched to other products upon learning about a company’s advertisements appearing on misinformation websites, companies could use lists of misinformation outlets provided by independent third-party organizations such as NewsGuard and the GDI to limit advertising budgets being spent on misinformation outlets through digital platforms. Moreover, since consumer backlash was particularly strong for women and politically left-leaning consumers, companies targeting such audiences may need to exercise greater caution.

On the basis of our results, we identify two interventions that could reduce the financing of online misinformation. First, digital advertising platforms that run automated auctions could enable advertisers to more easily access data on whether their advertisements appear on misinformation outlets. This would enable advertisers to make advertising placement decisions consistent with their preferences rather than inadvertently financing misinformation 51. Second, although it is currently possible for consumers to find out about companies financing misinformation through media reports, digital platforms could improve transparency for consumers about which companies advertise on misinformation outlets. Platforms could provide such information to consumers when they are viewing an advertisement using simple information labels (as in our ‘company only’ information treatment), similar to the ‘sponsored by’ and ‘paid for by’ labels that are presently common on various digital media platforms. Similarly, the rank-based information in our company-ranking treatment (T4) could be surfaced where customers select products from a menu of choices while shopping, as a ranking of companies ordered by how intensely they appear on misinformation websites. Platforms have provided similar contextual information about companies in other settings; for example, Google Flights displays carbon emissions data alongside flight prices when people select a flight to purchase among several options 52. Enabling consumers to view such information at the point of purchase could provide a stronger incentive for companies to steer their advertisements away from such outlets, especially since the effect of negative information can persist for several months 53. Overall, these interventions could decrease the inadvertent advertising revenue going towards misinformation outlets, which could eventually lead to such sites ceasing to operate, as observed anecdotally in prior work 29.

These interventions could ensure that both consumers and advertisers are provided with information about the consequences of their respective purchasing and advertising placement decisions so that they can act on their preferences. Having access to such information is necessary for an efficiently functioning economic system in accordance with the first fundamental theorem of welfare economics. However, although digital platforms are uniquely well positioned in the ecosystem of consumers, advertisers and publishers to implement information interventions in the form of disclosures and rankings 54, 55, they may not have incentives to implement such interventions. Against the backdrop of mounting pressure from advertisers 27, 28 and calls for transparency in the programmatic advertising business 56, information-based interventions could be incorporated into existing legislation to improve transparency. These include efforts such as the EU Digital Services Act, which includes a Code of Practice on Disinformation with enforceable provisions for different stakeholders in the advertising ecosystem to collectively fight misinformation, and US bills such as the Honest Ads Act and the Competition and Transparency in Digital Advertising (CTDA) Act, which include provisions to improve transparency in political advertising and the digital advertising ecosystem in general. Notably, in recent years, policy proposals that aim to reduce the prevalence of misinformation, such as the Combating Misinformation and Disinformation bill in Australia and the bill against fake news in Germany, have faced backlash over the risks they pose to free speech 57, 58. Although such proposals face the challenge of striking the right balance between combating misinformation and protecting freedom of expression, the information interventions that we identify could help counter the financial incentive to produce misinformation in the first place by reducing the unintended advertising revenue going towards misinformation outlets. There are many parallels for regulation by information provision to address externalities in other industries, including chemicals (toxic release inventory reporting requirements), automobiles (fuel consumption information), food (nutrition and content labels) and airlines (greenhouse gas emissions), several of which have been demonstrated to be effective in prior work 41, 59, 60.

Previous studies of ‘demand-side’ interventions to counter online misinformation have focused on reducing the consumption and spread of misinformation among news consumers on online platforms. Although interventions such as accuracy prompts and digital literacy tips can increase the quality of news that people share 5, this line of work has found limited support for news credibility signals in increasing the demand for credible news 61 or in reducing misperceptions among users 6. Such constraints in changing user behaviour may also apply to credibility signals like watermarks for detecting AI misinformation. Moreover, whereas such interventions are only effective for the small subset of users who are exposed to misinformation 62, our complementary ‘supply-side’ approach targets entities and individuals who might not necessarily consume or spread misinformation themselves.

Relative to existing proposals of supply-side interventions to curb the production of misinformation, which involve social media platforms banning the advertising of false news 63 or changing their advertising-driven business model altogether 18 , we outline a middle path to suggest that accounting for the preferences of advertisers and consumers could help counter the financing of online misinformation. Although platforms could coordinate to identify and deplatform misinformation websites 64 , prior work suggests that misinformation websites nearly always resurface through alternative providers unless the incentive to produce misinformation is addressed 29 . Moreover, the information interventions that we identify are also an improvement on the status quo, whereby advertisers and consumers can only implement their preferences by participating in boycotts of digital platforms over their inability to contain misinformation. Allowing advertisers to more easily observe and control whether their advertisements appear on misinformation websites could also limit backlash by enabling advertisers to better implement their preferences rather than participating in one-off short-term advertising boycotts 27 , 28 . Additionally, since consistently providing negative information can create lasting associations for consumers 65 , providing information disclosures on every advertisement for whether the advertising company involved appears on misinformation websites could have a substantial effect on consumer demand over time, providing incentives for advertising companies to reduce advertising on misinformation websites.

Given our findings, we suggest three promising avenues for future research. First, future work could evaluate the effectiveness of our information interventions in the field over a longer time period to quantify the decline in revenue generated by misinformation outlets resulting from increasing transparency for consumers or advertisers. Related to this, future work could also target a wider set of advertisers to validate the robustness of our interventions, which would allow for broader generalizability. Second, our results on whether companies are willing to adopt solutions to avoid monetizing misinformation are based on their existing (often incorrect) beliefs about the prevalence of advertising on misinformation websites in general and for their own company. More research is needed to understand how advertising companies would respond given correct beliefs. Third, although our research identifies potential interventions that digital platforms can adopt to curb the monetization of online misinformation, it is unclear whether it is in the interest of digital advertising platforms to do so. Moreover, whether the potential monetary and societal benefits of the information interventions we identify outweigh the revenue platforms generate by serving advertisements on misinformation websites remains to be studied. Overall, the effectiveness of platforms in mitigating misinformation will depend on a multi-pronged approach. Given that misinformation is largely financially motivated and that financially sustaining online misinformation can be substantially harmful for the advertising companies involved, simple low-cost informational interventions such as the ones we identify could go a long way in curbing the supply of online misinformation.

Background on digital advertising

The predominant business model of several mainstream digital media platforms relies on monetizing attention via advertising 3 . While these platforms typically offer free content and services to individual consumers, they generate revenue by serving as an intermediary or advertising exchange connecting advertisers with independent websites that want to host advertisements. To do so, platforms run online auctions to algorithmically distribute advertising across websites, known as ‘programmatic advertising’. For example, Google distributes advertising in this manner to more than two million non-Google sites that are part of the Google Display Network. This allows websites to generate revenue for hosting advertising, and they share a percentage of this payment with the platform. In the USA, more than 80% of digital display advertisements are placed programmatically 16 . We refer to these advertising exchanges as digital advertising platforms and use the term digital platforms to collectively refer to all the services offered by such media platforms.
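
To make these mechanics concrete, below is a toy sketch of a single programmatic ad-slot auction; the second-price rule and the fixed revenue split are simplifying assumptions made for illustration, not details taken from the text or from any particular exchange.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    cpm: float  # bid per thousand impressions, in US$

def run_auction(bids, platform_share=0.3):
    """Toy sealed-bid auction for one ad slot on a publisher's page.

    Assumes a second-price rule and a fixed platform revenue share; real
    exchanges use richer pricing, targeting and quality signals.
    """
    ranked = sorted(bids, key=lambda b: b.cpm, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    price = runner_up.cpm                      # winner pays the second-highest bid
    publisher_revenue = price * (1 - platform_share)
    platform_revenue = price * platform_share
    return winner.advertiser, price, publisher_revenue, platform_revenue

print(run_auction([Bid("AdvertiserA", 4.2), Bid("AdvertiserB", 3.1), Bid("AdvertiserC", 2.5)]))
```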

We examine the role of advertising companies and digital advertising platforms in monetizing online misinformation. While in other forms of (offline) media, advertisers typically have substantial control over where their advertisements appear, advertising placement through digital advertising platforms is mainly automated. Since most companies do not have the capacity to participate in high-frequency advertising auctions that require them to place individual bids for each advertising slot they are interested in, they typically outsource the bidding process to an advertising platform. Such programmatic advertising gives companies relatively less control over where their advertisements end up online. However, companies can take steps to reduce advertising on misinformation websites, such as by only being part of advertising auctions for a select list of credible websites or blocking advertisements from appearing on specific misinformation outlets.

Collecting website data

We collect data on misinformation websites in three steps. First, we use a dataset maintained by NewsGuard. This company rates all the news and information websites that account for 95% of online engagement in each of the five countries where it operates. Journalists and experienced editors manually generate these ratings by reviewing news and information websites according to nine apolitical journalistic criteria. Recent research has used this dataset to identify misinformation websites 6, 66, 67. In this paper, we consider each website that NewsGuard rates as repeatedly publishing false content between 2019 and 2021 to be a misinformation website and all others to be non-misinformation websites, leading to a set of 1,546 misinformation websites and 6,499 non-misinformation websites. To get coverage throughout our study period, we sample websites provided by NewsGuard from the start, middle and end of each year from 2019 to 2021. Additionally, we also sample websites from January 2022 and June 2022 to account for websites that may have existed during our study period but were discovered later. Supplementary Table 3 summarizes the characteristics of this dataset. Our NewsGuard dataset contains websites across the political spectrum, including left-leaning websites (for example, https://www.palmerreport.com/ and https://occupydemocrats.com/), politically neutral websites (for example, https://rt.com/ and https://www.nationalenquirer.com), and right-leaning websites (for example, https://www.thegatewaypundit.com/ and http://theconservativetreehouse.com/).

Note that prior research using the NewsGuard dataset has often used the term ‘untrustworthy’ to describe websites 6, 67. Such research has used NewsGuard’s aggregate classification, whereby a site that scores below a certain threshold (60 points) on NewsGuard’s weighted score system is labelled as untrustworthy. Instead of using NewsGuard’s overall score for a website, we use NewsGuard’s first criterion for each website, namely whether the website repeatedly publishes false news, to identify a set of 1,546 misinformation websites. While 94% of the NewsGuard misinformation websites we identify in this manner are also untrustworthy based on NewsGuard’s classification, only about 52% of the untrustworthy websites are misinformation websites, that is, websites that repeatedly publish false news. Our measure of misinformation is, therefore, more conservative than prior work using NewsGuard’s ‘untrustworthy’ label.

In addition to the NewsGuard dataset, we use a list of websites provided by the GDI. This non-profit organization identifies disinformation by analysing both the content and context of a message and how they spread through networks and across platforms 68. GDI maintains a monthly updated list of such websites, which it also shares with interested advertising tech platforms to help reduce advertising on misinformation websites. The GDI list allows us to identify 1,869 additional misinformation websites. Finally, we augment our list of misinformation websites with 396 additional ones used in prior work 69, 70. Among the websites that NewsGuard rated as non-misinformation (at any point in our sample), 310 websites were considered to be misinformation websites by our other sources or by NewsGuard itself (during a different period in our sample). We categorize these websites as misinformation websites given their risk of producing misinformation.

Altogether, our website dataset consists of 10,310 websites, including 3,811 misinformation and 6,499 non-misinformation websites. Similar to prior work 6 , 67 , our final measure of misinformation is at the level of the website or online news outlet. Aggregating article-level information and using website-level metadata is meaningful since it reduces noise when arriving at a website-level measure. Finally, we use data from SEMRush, a leading online analytics platform, to determine the level of monthly traffic received by each website from 2019 to 2021.
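
A minimal sketch of how the website-level misinformation label could be assembled from the three sources described above; the file names and column names are placeholders, and the actual construction involved additional sampling and de-duplication steps not shown here.

```python
import pandas as pd

# Placeholder inputs: NewsGuard criterion-level ratings, the GDI list and
# domains flagged in prior work.
newsguard = pd.read_csv("newsguard_ratings.csv")   # columns: domain, repeatedly_false
gdi = set(pd.read_csv("gdi_domains.csv")["domain"])
prior_work = set(pd.read_csv("prior_work_domains.csv")["domain"])

# Criterion-based NewsGuard label: repeatedly publishing false content at any
# sampled point (more conservative than the <60-point 'untrustworthy' score).
ng_misinfo = set(newsguard.loc[newsguard["repeatedly_false"], "domain"])

# A domain is labelled misinformation if any of the three sources flags it;
# all remaining NewsGuard-rated domains are non-misinformation.
misinfo = ng_misinfo | gdi | prior_work
all_domains = set(newsguard["domain"]) | misinfo

labels = pd.DataFrame({"domain": sorted(all_domains)})
labels["is_misinfo"] = labels["domain"].isin(misinfo)
print(labels["is_misinfo"].value_counts())
```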

Consumer experiment design

This study was reviewed by the Stanford University Institutional Review Board (protocol no. IRB-63897) and the Carnegie Mellon University Institutional Review Board (protocol no. IRB00000603). Our study was pre-registered at the American Economic Association’s Registry under AEARCTR-0009973. Informed consent was obtained from all participants at the beginning of the survey.

Setting and sample recruitment

We recruited a sample of US internet users via CloudResearch. CloudResearch screened respondents for our study so that the sample is representative of the US population in terms of age, gender and race based on the US Census (2020). It is important to note that while we recruited our sample to be representative on these dimensions to improve the generalizability and external validity of our results, it is a diverse sample of US internet users that is not necessarily representative of the US population on other dimensions 71. To ensure data quality, we include a screener in our survey to check whether participants pay attention to the information provided. Only participants who pass this screener can proceed with the survey. Our total sample includes 4,039 participants, who are randomized into five groups approximately evenly.

The flow of the survey study is shown in Supplementary Fig. 1 . We begin by asking participants to report demographics such as age, gender and residence. From a list of trustworthy and misinformation outlets, we then ask participants questions about their behaviours in terms of the news outlets they have used in the past 12 months, their trust in the media (on a 5-point scale), the online services or platforms they have used and the number of petitions they have signed in the past 12 months.

Initial gift card preferences

We then inform participants that one in five (that is, 20% of all respondents) who complete the survey will be offered a US$25 gift card from a company of their choice out of six company options. Respondents are asked to rank the six gift card companies from their first choice (most preferred) to their sixth choice (least preferred). These six companies belong to one of three categories: fast food, food delivery and ride-sharing. All six companies appeared on the misinformation websites in our sample during the past three years (2019–2021), offer items below US$25, and are commonly used throughout the USA. The order in which the six companies are presented is randomized at the respondent level. As a robustness check, we also ask respondents to assign weights to each of the six gift card options. This question gives respondents greater flexibility by allowing them to indicate indifference (that is, equal weights) between any set of options. We then ask participants to confirm which gift card they would like to receive if they are selected, to ensure they have consistent preferences regardless of how the question is asked. At this initial elicitation stage, the respondents did not know that they would get another chance to revise their choice. Hence, these choices can be thought of as capturing their revealed preference.

Information treatments

All participants in the experiment are given baseline information on misinformation and advertising. This is meant to ensure that all participants are aware of how we define misinformation (along with examples of a few misinformation websites, including right-wing, neutral and left-wing sites), how misinformation websites are identified, and how companies advertise on misinformation websites (via an illustrative example) and use digital platforms to automate advertisement placement.

Participants are then randomized into one control and four treatment groups, in which the information treatments are all based on factual information from our data and prior research. We use an active control design to isolate the effect of providing information relevant to the practices of specific companies on people’s behaviour 9. Participants in the control group are given generic information based on prior research that is unrelated to advertising companies or platforms but relevant to the topic of news and misinformation.

In our first ‘company only’ treatment group (T1), participants are given factual information stating that advertisements from their top choice gift card company appeared on misinformation websites in the recent past. After receiving this information, people may move their final gift card choice away from their initial top-ranked company. It is unclear, however, whether advertising on misinformation websites would cause a sufficient change in consumption patterns and which sets of participants may be more affected.

Our second ‘platform only’ treatment group (T2) informs participants that companies using digital advertising platforms were about ten times more likely to appear on misinformation websites than companies that did not use such platforms in the recent past. This information treatment measures the effect of digital advertising platforms in financing misinformation outlets. Since it does not contain information about advertising companies, it practically serves as a second control group for our company-level outcome and measures how people respond in terms of our platform-related outcome.

Because our descriptive data suggest that the use of digital advertising platforms amplifies advertising revenue for misinformation outlets, we are interested in measuring how consumers respond to a specific advertising company appearing on misinformation websites when also informed of the potential role of digital advertising platforms in placing companies’ advertising on misinformation websites. It is unclear whether consumers will attribute more blame to companies or to advertising platforms for financing misinformation websites when informed about the role of the different stakeholders in this ecosystem. For this reason, our third ‘company and platform’ treatment (T3) combines information from our first two treatments (T1 and T2). As in T1, participants are given factual information that advertisements from their top choice gift card company appeared on misinformation websites in the recent past. Additionally, we informed participants that their top choice company used digital advertising platforms and that companies that used such platforms were about ten times more likely to appear on misinformation websites than companies that did not, as mentioned in T2.

Finally, since several advertising companies appear on misinformation websites, we would like to determine whether informing consumers about other advertising companies also appearing on misinformation websites changes their response towards their top choice company. In our fourth company-ranking treatment (T4), participants are given factual information, which states that “In the recent past, ads from all six companies below repeatedly appeared on misinformation websites in the following order of intensity”, and are provided with a ranking from one of the three years in our study period (2019, 2020 or 2021). We personalize these rankings by providing truthful information based on data from different years in the recent past such that the respondents’ top gift card choice company does not appear last in the ranking (that is, is not the company that advertises least on misinformation websites) and, in most cases, advertises more intensely on misinformation websites than its potential substitute in the same company category (for example, fast food, food delivery or ride-sharing). Such a treatment allows us to measure potential differences in the direction of consumers switching their gift card choices, such as switching towards companies that advertise more or less intensely on misinformation websites. It could also give consumers reasonable deniability (“everyone advertises on misinformation websites”), leading to ambiguous predictions about the magnitude of the treatment effect.

We measure two pre-registered behavioural outcomes that collectively allow us to measure how people respond to our information treatments in terms of both voice and exit 25. After the information treatment, all participants are asked to make their final gift card choice from the same six options they were shown earlier. Our main outcome of interest is whether participants ‘exit’ or switch their gift card preference, that is, whether they select a different gift card after the information treatment than the top choice they indicated before the information treatment. To ensure incentive compatibility, participants are (truthfully) told that those randomly selected to receive a gift card will be offered the gift card of their choice at the end of our study. As mentioned above, the probability of being randomly chosen to receive a gift card is 20%. We chose a high probability of receiving a gift card relative to other online experiments since prior work has shown that consumers process choice-relevant information more carefully as the realization probability increases 72. To make the gift card outcome as realistic as possible, we also used a relatively large gift card value (US$25). The focus of our experiments is on single-shot outcomes. While it would have been interesting to capture longer-term effects, the cost of implementing our gift card outcome for a large sample and expenditure on the other studies made a follow-up study cost-prohibitive.
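
For concreteness, the primary exit indicator and the two robustness outcomes could be constructed along the following lines; the column names are placeholders rather than the study's actual variables, and using the elicited weights as the preference measure for the lower-preference outcome is an assumption.

```python
import pandas as pd

survey = pd.read_csv("consumer_survey.csv")  # placeholder file name

# Primary exit outcome: the final gift card differs from the pre-treatment top choice.
survey["switched"] = (
    survey["final_choice"] != survey["initial_top_choice"]
).astype(int)

# Robustness outcome 1: switching to a card the participant valued less than
# their top choice, judged by the weights assigned before treatment.
survey["switched_to_lower_pref"] = (
    (survey["switched"] == 1)
    & (survey["final_choice_weight"] < survey["top_choice_weight"])
).astype(int)

# Robustness outcome 2: switching across product categories
# (fast food, food delivery, ride-sharing).
survey["switched_category"] = (
    survey["final_choice_category"] != survey["initial_choice_category"]
).astype(int)
```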

Secondly, participants are given the option to sign one of several real online petitions that we made and hosted on Change.org. Participants can opt to sign a petition that advocates for either blocking or allowing advertising on misinformation or choose not to sign any petition. Further, participants could choose between two petitions for blocking advertisements on misinformation websites, suggesting that either: (1) advertising companies, or (2) digital advertising platforms, need to block advertisements from appearing on misinformation websites. Overall, participants selected among the following five choices: (1) “Companies like X need to block their ads from appearing on misinformation websites.”, where X is their top choice gift card company; (2) “Companies like X need to allow their ads to appear on misinformation websites.”, where X is their top choice gift card company; (3) “Digital ad platforms used by companies need to block ads from appearing on misinformation websites.”; (4) “Digital ad platforms used by companies need to allow ads to appear on misinformation websites.”; and (5) I do not want to sign any petition. To track the number of petition signatures for each of these four petition options across our randomized groups, we provide separate petition links to participants in each randomized group. We record several petition-related outcomes. First, we measure participants’ intention to sign a petition based on the option they select in this question. Participants who pass our attention check and opt to sign a petition are later provided with a link to their petition of choice. This allows tracking whether participants click on the petition link provided. Participants can also self-report whether they signed the petition. Finally, for each randomized group, we can track the total number of actual petition signatures.

Our petition outcomes serve two purposes. First, while our gift card outcome measures how people change their consumption behaviour in response to the information provided, people may also respond to our information treatments in alternative ways, for example, by voicing their concerns or supplying information to the parties involved 25, 26. Given that the process of signing a petition is costly, participants’ responses to this outcome constitute a meaningful measure similar to petition measures used in prior experimental work 73, 74. Second, since participants must choose between signing either company or platform petitions, this outcome allows us to measure whether or not, across our treatments, people hold advertising companies more responsible for financing misinformation than the digital advertising platforms that automatically place advertisements for companies.

In addition to our behavioural outcomes, we also record participants’ stated preferences. To do so, we ask participants about their degree of agreement with several statements about misinformation on a seven-point scale ranging from ‘strongly agree’ to ‘strongly disagree’. These include whether they think: (1) companies have an important role in reducing the spread of misinformation through their advertising practices; and whether (2) digital platforms should give companies the option to avoid advertising on misinformation websites.

We explore heterogeneity in consumer responses along four pre-registered dimensions. First, prior research recognizes differences in the salience of prosocial motivations across gender 75 , with women being more affected by social-impact messages than men 76 and more critical consumers of new media content 77 . Given these findings, we could expect female participants to be more strongly affected by our information treatments.

Responses to our information treatments may also differ by respondents’ political orientation. According to prior research, conservatives are especially likely to associate the mainstream media with the term ‘fake news’. These perceptions are generally linked to lower trust in media, voting for Trump, and higher belief in conspiracy theories 78 . Moreover, conservatives are more likely to consume misinformation 2 and the supply of misinformation has been found to be higher on the ideological right than on the left 79 . Consequently, we might expect stronger treatment effects for left-wing respondents.

Consumers who more frequently use a company’s products or services could be presumed to be more loyal towards the company or derive greater utility from its use, which could limit changes in their behaviour 37 . Alternatively, more frequent consumers may be more strongly affected by our information treatments as they may perceive their usage as supporting such company practices to a greater extent than less frequent consumers.

Finally, we measure whether people’s responses differ depending on whether they themselves consume misinformation, based on whether they reported using misinformation outlets in the initial question asking them to select the news outlets they used in the past 12 months.

Tackling experimental validity concerns

In our incentivized, online setting where we measure behavioural outcomes, we expect experimenter demand effects to be minimal, as has been evidenced in the experimental literature 80, 81. We take several steps to mitigate potential experimenter demand effects, including implementing best practices recommended in prior work 9. First, our experiment has a neutral framing throughout, starting with the recruitment of participants. While recruiting participants, we invite them to “take a survey about the news, technology and businesses” without making any specific references to misinformation or its effects. While introducing misinformation websites and how they are identified by independent non-partisan organizations, we include examples of misinformation websites across the political spectrum (including both right-wing and left-wing sites) and provide an illustrative example of misinformation by foreign actors. In drafting the survey instruments, we phrased the questions and available choices as neutrally as possible. For example, while introducing our online petitions, we presented participants with the option to sign real petitions that suggest both blocking and allowing advertising on misinformation sites. Indeed, we find that the vast majority of participants believe that the information provided in the survey was unbiased, as shown in Supplementary Fig. 4. Only about 10% of participants chose one of the ‘biased’ or ‘very biased’ options when asked to rate the political bias of the survey information on a seven-point scale ranging from ‘very right-wing biased’ to ‘very left-wing biased’.

In our active control design, participants in all randomized groups are presented with the same baseline information about misinformation, given misinformation-related information in the information intervention and asked the same questions after the information intervention to emphasize the same topics and minimize potential differences in the understanding of the study across treatment groups. Moreover, to maximize privacy and increase truthful reporting 82 , respondents complete the surveys on their own devices without the physical presence of a researcher. We also do not collect respondents’ names or contact details (with the exception of eliciting emails to provide gift cards to participants at the end of the study).

In presenting our information interventions and measuring our behavioural outcomes, we take special care to not highlight the names of the specific entities being randomized across groups to avoid emphasizing what is being measured. We do, however, highlight our gift card incentives by putting the gift card information in bold text to ensure incentive compatibility since prior work has found that failing to make incentives conspicuous can vastly undermine their ability to shift behaviour 83 .

Apart from making the above design choices to minimize experimenter demand effects, we measure their relevance using a survey question. Since demand effects are less likely to be a concern if participants cannot identify the intent of the study 9, we ask participants an open-ended question: “What do you think is the purpose of our study?”. Following prior work 84, 85, we then analyse the responses to this question to examine whether they differ across treatment groups. To measure potential differences in the respondents’ perceptions of the study, we examine their open-ended text responses about the purpose of the study using a Support Vector Machine classifier, which incorporates several features in text analysis, including word, character and sentence counts, sentiments, topics (using Gensim) and word embeddings. We predict treatment status using the classifier, keeping 75% of the sample for the training set and the remaining 25% as the test set. The classifier predicts treatment status at rates similar to chance for our main treatment groups relative to the control group, as shown in Supplementary Table 11. These results, which are similar in magnitude to those found in previous research 84, 85, suggest that our treatments do not substantially affect participants’ perceptions of the purpose of the study. Overall, this analysis gives us confidence that our main experimental findings are unlikely to be driven by experimenter demand effects.
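
This demand-effects check can be approximated with a standard text-classification pipeline; the sketch below uses a TF-IDF representation in place of the richer feature set (counts, sentiment, topics, embeddings) described above, and assumes columns purpose_text and treatment that are not the authors' actual variable names.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

survey = pd.read_csv("consumer_survey.csv")  # placeholder file name

# For each treatment arm, test whether the open-ended 'purpose of the study'
# responses can distinguish that arm from the control group.
for arm in ["T1", "T2", "T3", "T4"]:
    subset = survey[survey["treatment"].isin(["control", arm])]
    X_train, X_test, y_train, y_test = train_test_split(
        subset["purpose_text"], subset["treatment"] == arm,
        test_size=0.25, random_state=0, stratify=subset["treatment"])
    clf = make_pipeline(TfidfVectorizer(min_df=2), LinearSVC())
    clf.fit(X_train, y_train)
    accuracy = accuracy_score(y_test, clf.predict(X_test))
    print(f"{arm} vs control: test accuracy = {accuracy:.2f} (chance is about 0.5)")
```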

To address external validity concerns, we incorporate additional exit outcomes in the paper, showing that, after our information interventions, treated individuals switched to lower-preference products (Table 1, columns 3 and 4) and to products in other categories (Table 1, columns 5 and 6) by 8 and 5 percentage points, respectively. We also show in Supplementary Table 8 that as the difference between participants’ highest weighted and second highest weighted gift card choices increases, their switching behaviour decreases. This shows that the weights assigned by participants to their gift card options capture meaningful and costly differences in value, highlighting the external validity of our findings. More generally, our pre-registered heterogeneity analysis lends credence to the study’s external validity. In line with expectations, we find that less frequent users and more politically liberal individuals are more likely to switch (see Extended Data Table 3 for the full set of pre-registered heterogeneity results). Moreover, we find that the cost of switching gift cards varies with participants’ observable characteristics. For example, treated participants who reported not using any of the misinformation news outlets in our survey lost 50% of the median value (US$12.50) of their initial top choice gift card, whereas treated participants who reported reading such outlets lost 33.3% of the median value (US$8.33) of their initial top choice gift card. Participants’ text responses also indicate that they believed their choices to be consequential (see Supplementary Tables 1 and 2). As an example, while explaining their choice of gift card, one participant stated, “Because I would most likely use this gift card on my next visit to… and it is less likely that i would use the others.” Regarding the petition outcome, one participant stated, “The source of this problem seems to be from the digital advertising platforms, so I’d rather sign the petition that stops them from putting ads on misinformation websites.”
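
As a quick consistency check (our inference, not a figure stated in this passage), the two reported dollar losses imply the same median value of roughly US$25 for the initial top-choice gift card:

```latex
% Back-of-the-envelope check of the reported loss figures.
0.500 \times \$25.00 = \$12.50, \qquad 0.333 \times \$25.00 \approx \$8.33
```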

Decision-maker experiment design

We followed the same IRB review, pre-registration and consent procedures as those used for our consumer study. This study addresses two research questions. First, we aim to measure decision-makers’ existing beliefs and preferences about advertising on misinformation websites. This will help inform whether companies may be inadvertently or willingly sustaining online misinformation. Second, we ask: how do decision-makers update their beliefs and their demand for a platform-based solution to avoid advertising on misinformation websites in response to information about the role of platforms in amplifying the financing of misinformation? This will suggest whether companies may be more interested in adopting advertising platforms that reduce the financing of misinformation. To this end, we conduct an information-provision experiment 9. While past work has examined how firm behaviour regarding market decisions changes in response to new information 48,49, it is unclear how information on the role of digital advertising platforms in amplifying advertising on misinformation would affect decision-makers’ non-market strategies.

To recruit participants, we partnered with the executive education programmes at the Stanford Graduate School of Business and Heinz College at Carnegie Mellon University. We did so in order to survey senior managers and leaders who could influence strategic decision-making within their firms, in contrast to studies relying heavily on MBA students for understanding decision-making in various contexts such as competition, pricing, strategic alliances and marketing 86 , 87 , 88 , 89 . Additionally, partnering with two university programmes instead of a specific firm allowed us to access a more diverse sample of companies than prior work that sampled specific types of firms—for example, innovative firms, startups or small businesses 90 , 91 , 92 . Throughout this study, we use the preferences of decision-makers (for example, chief executive officers) as a proxy for company-level preferences since people in such roles shape the outcomes of their companies through their strategic decisions 93 , 94 .

Our partner organizations sent emails to their alumni on our behalf. We used neutral language in our study recruitment emails to attract a broad audience of participants regardless of their initial beliefs and concerns about misinformation, stating our goal as “conducting vital research on the role of digital technologies in impacting your organization” without mentioning misinformation. We received 567 complete responses and kept the 90% that came from currently employed respondents. To ensure data quality, we dropped an additional 13% of responses from participants who were inattentive in answering the survey, resulting in a final sample of 442 responses. Participants were classified as inattentive if they provided an answer greater than 100 in either of the two questions, asked before the information treatment, that elicited their prior beliefs about companies and platforms as estimates out of 100. Our final sample of 442 respondents comes from companies spanning all 23 industries in our descriptive analysis. Moreover, as shown in Supplementary Fig. 5, our sample of participants represents a broad array of company sizes and experience levels at their current roles. Additionally, about 22% of the executives in our sample (and 25% of all our participants) are women, which is aligned with the 21% to 26% industry estimates of women in senior roles globally 95,96.
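
The filtering step can be illustrated with a short sketch; the file and column names are hypothetical, and this is a simplified illustration rather than the authors' code.

```python
# Minimal sketch: keep currently employed respondents and drop those who
# answered above 100 on either 0-100 prior-belief question (attention check).
import pandas as pd

df = pd.read_csv("decision_maker_survey.csv")  # hypothetical file

employed = df[df["currently_employed"] == 1]
attentive = employed[
    (employed["prior_companies_per_100"] <= 100)
    & (employed["prior_platforms_per_100"] <= 100)
]

print(len(attentive))  # final analysis sample (442 respondents in the paper)
```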

Supplementary Fig. 2 shows the design of the survey study. We first elicit participants’ current employment status; those working in some capacity are allowed to continue the survey, whereas the remaining participants are screened out. After reporting their main occupation, all participants in the experiment are provided with baseline information on misinformation and advertising similar to that provided in the consumer experiment.

In our pre-registration, we highlighted that we would measure the baseline beliefs and preferences of decision-makers. We measure participants’ baseline beliefs about the roles of companies in general, their own company and platforms in general in financing misinformation. Specifically, participants are asked to estimate the number of companies among the 100 most active advertisers whose advertisements appeared on misinformation websites during the past three years (2019–2021). Additionally, we ask participants to report whether they think their company or organization had its advertisements appear on misinformation websites in the past three years. Finally, we measure participants’ beliefs about the role of digital advertising platforms in placing advertisements on misinformation websites. To do so, we first inform participants that during the past three years (2019–2021), out of every 100 companies that did not use digital advertising platforms, eight companies appeared on misinformation websites on average. We then ask participants to provide their best estimate for the number of companies whose advertisements appeared on misinformation websites out of every 100 companies that did use digital advertising platforms.

In addition to recording participants’ stated preferences using self-reported survey measures, we measure participants’ revealed preferences. To ensure incentive compatibility, participants are asked three questions in a randomized order: (1) information demand about consumer responses—that is, whether they would like to learn how consumers respond to companies whose advertisements appear on misinformation websites (based on our consumer survey experiment); (2) advertisement check—that is, whether they would like to know about their own company’s advertisements appearing on misinformation websites in the recent past; and (3) demand for a solution—that is, whether they would like to sign up for a 15-minute information session on how companies can manage where their advertisements appear online. Participants are told they can receive the information about consumer responses at the end of the study if they opt to receive it, whereas the advertisement check and solution information are provided as a follow-up after the survey. Participants are required to provide their emails and company name for the advertisement check. To sign up for an information session from our industry partner on a potential solution to avoid advertising on misinformation websites, participants provide their emails on a separate form. Since all three types of information offered are novel and otherwise costly to obtain, we expect respondents’ demand for such information to capture their revealed preferences.

Information intervention

Participants are then randomized into a treatment group, which receives information about the role of digital advertising platforms in placing advertising on misinformation websites, and a control group, which does not receive this information. Based on the dataset we assembled, participants are given factual information that companies that used digital advertising platforms were about ten times more likely to appear on misinformation websites than companies that did not use such platforms in the recent past. This information is identical to the information provided to participants in the T2 (that is, platform only) group in the consumer experiment.
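
Taken together with the baseline figure given before the intervention (eight of every 100 companies that did not use digital advertising platforms), this factual statement implies, as a back-of-the-envelope reading on our part rather than a figure quoted in this passage, on the order of 80 of every 100 platform-using companies appearing on misinformation websites:

```latex
% Implied rate for platform-using companies (our inference from the stated ratio).
8 \text{ per } 100 \times 10 \approx 80 \text{ per } 100
```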

After the information intervention, we first measure participants’ posterior beliefs about the role of digital advertising platforms in placing advertisements on misinformation websites following our pre-registration. Participants are told about the average number of companies whose advertisements appear per month on misinformation websites that are not monetized by digital advertising platforms. They are then asked to estimate the average number of companies whose advertisements appear monthly on misinformation websites that use digital advertising platforms. This question measures whether participants believe that the use of digital advertising platforms amplifies advertising on misinformation websites.

We record two behavioural outcomes after the information intervention, which were pre-registered as our primary outcomes of interest. Our main outcome of interest is respondents’ demand for a platform-based solution to avoid advertising on misinformation websites. Participants can opt to learn more about one of two types of information—(1) which platforms least frequently place companies’ advertising on misinformation websites; or (2) which types of analytics technologies are used to improve advertising performance—or opt not to receive any information. Since participants can only opt to receive one of the two types of information, this question is meant to capture the trade-off between respondents’ concern for avoiding misinformation outlets and their desire to improve advertising performance. Participants are told that they will be provided with the information they choose at the end of the study. Following the literature on measuring information acquisition 97, we measure respondents’ demand for solution information, which serves as a revealed-preference proxy for their interest in implementing a solution for their organization.

Additionally, to measure whether the information treatment increases concern for financing misinformation in general, we record a second behavioural measure. Participants are told that the research team will donate US$100 to one of two organizations after randomly selecting one of the first hundred responses: (1) the GDI; and (2) DataKind, which helps mission-driven organizations increase their impact by unlocking their data science potential ethically and responsibly.

As in our consumer experiment, this survey was carried out in an online setting, where experimenter demand effects are limited 80,81. We followed best practices 9 by keeping the treatment language neutral and ensuring the anonymity of the participants wherever possible. We find that most participants believe that the information provided in the survey was unbiased. Only about 7% of participants chose one of the ‘biased’ or ‘very biased’ options when asked to rate the political bias of the survey information on a seven-point scale ranging from ‘very right-wing biased’ to ‘very left-wing biased’.

Importantly, to ensure truthful reporting, our main experimental outcomes were incentive-compatible. In particular, respondents who chose our platform solution demand outcome—learning which platforms contribute least to placing companies’ advertisements on misinformation websites—had to forgo receiving information on improving advertising performance. Additionally, our baseline information demand outcomes, elicited before the information intervention, were also incentive-compatible in that participants who opted for additional information were followed up with via email or an online information session.

These design choices are made to minimize demand effects on our main outcomes of interest. However, it is possible that these effects are still relevant, partly because participants may have an interest in ‘doing the right thing’ on a survey administered by an institution they have a connection with. We measure the relevance of potential demand effects using a survey question, mirroring the approach used for our consumer experiment. To measure potential differences in respondents’ perceptions of the study across our treatment and control groups, we predict treatment status based on respondents’ open-ended text responses about the purpose of the study via a support vector machine classifier, keeping 75% of the sample as the training set and the remaining 25% as the test set. We find that the classifier predicts treatment status slightly worse than random chance (Supplementary Table 16), with performance comparable to that observed in the consumer experiment. Therefore, although experimenter demand effects may still be present, these results suggest that they do not drive our findings.

We address the external validity of our findings by verifying the decision-making capacity of our respondents within their organizations and by examining the generalizability of our sample. We find that the vast majority of those whose job titles we verify (94%) serve in executive or managerial roles within their organizations. The regression estimates in Supplementary Tables 18 and 19 show that our results remain qualitatively and quantitatively similar after the exclusion of the small sample of individuals in non-executive and non-managerial roles. Moreover, the verified and self-reported decision-makers are similar across observable characteristics as reported in Supplementary Table 17 , suggesting limited selection in our verification process. To examine the generalizability of our sample, we investigate their observable characteristics. As shown in Supplementary Fig. 5 , our sample of participants represents a broad array of company sizes and experience levels at their current roles. Additionally, about 22% of the executives in our sample (and 25% of all our participants) are women, which is aligned with the 21% to 26% industry estimates of women in senior roles globally 95 , 96 .

Reporting summary

Further information on research design is available in the  Nature Portfolio Reporting Summary linked to this article.

Data availability

Our study was pre-registered at the American Economic Association’s Registry under AEARCTR-0009973. The data that we collected for our experimental studies are available in anonymized form and can be accessed from https://github.com/wajeeha-ahmad/misinformation-advertising . Data on job titles for the second survey experiment are not available, to protect participant confidentiality. Data underlying the descriptive analysis of advertising on misinformation websites can be made available after obtaining permission from the proprietary sources on misinformation domains (NewsGuard and the GDI) and advertising (Oracle). Source data are provided with this paper.

Code availability

Code supporting the findings of the paper is available at https://github.com/wajeeha-ahmad/misinformation-advertising . Code for the descriptive analysis of advertising on misinformation websites can be made available after obtaining permission from the proprietary sources.

Blumberg, D. L. 3 ways the ‘splinternet’ is damaging society. MIT Management Sloan School https://mitsloan.mit.edu/ideas-made-to-matter/3-ways-splinternet-damaging-society (2023).

Guess, A., Nagler, J. & Tucker, J. Less than you think: prevalence and predictors of fake news dissemination on Facebook. Sci. Adv. 5 , 1494–1504 (2019).


Lazer, D. M. et al. The science of fake news: addressing fake news requires a multidisciplinary effort. Science 359 , 1094–1096 (2018).


Mosseri, A. Working to Stop Misinformation and False News. Meta Newsroom https://about.fb.com/news/2017/04/working-to-stop-misinformation-and-false-news/ (2017).

Arechar, A. A. et al. Understanding and combatting misinformation across 16 countries on six continents. Nat. Hum. Behav. 7 , 1502–1513 (2023).


Aslett, K., Guess, A. M., Bonneau, R., Nagler, J. & Tucker, J. A. News credibility labels have limited average effects on news diet quality and fail to reduce misperceptions. Sci. Adv. 8 , 3844 (2022).


Pennycook, G. et al. Shifting attention to accuracy can reduce misinformation online. Nature 592 , 590–595 (2021).

Pennycook, G. & Rand, D. G. Fighting misinformation on social media using crowdsourced judgments of news source quality. Proc. Natl Acad. Sci. USA 116 , 2521–2526 (2019).


Haaland, I., Roth, C. & Wohlfart, J. Designing information provision experiments. J. Econ. Lit. 61 , 3–40 (2023).

Bursztyn, L., Rao, A., Roth, C. P. & Yanagizawa Drott, D. H. Misinformation during a pandemic. Working paper 27417 (National Bureau of Economic Research, 2020); https://www.nber.org/papers/w27417 .

Van der Linden, S., Leiserowitz, A., Rosenthal, S. & Maibach, E. Inoculating the public against misinformation about climate change. Global Challenges 1 , 756–784 (2017).


McCarthy, B. Misinformation and the Jan 6 insurrection: when ‘patriot warriors’ were fed lies. Politifact https://www.politifact.com/article/2021/jun/30/misinformation-and-jan-6-insurrection-when-patriot/ (2021).

Higgins, A., McIntire, M. & Dance, J. G. Inside a fake news sausage factory: ‘This Is All About Income’. The New York Times (25 November 2016); https://www.nytimes.com/2016/11/25/world/europe/fake-news-donald-trump-hillary-clinton-georgia.html

Hao, K. How Facebook and Google fund global misinformation. MIT Technology Review https://www.technologyreview.com/2021/11/20/1039076/facebook-google-disinformation-clickbait/ (2021).

Giansiracusa, N. Google needs to defund misinformation. Slate https://slate.com/technology/2021/11/google-ads-misinformation-defunding-artificial-intelligence.html . (2021).

Austin, A., Barnard, J. & Hutcheon, N. Programmatic Marketing Forecasts (Zenith, 2019); https://s3.amazonaws.com/media.mediapost.com/uploads/ProgrammaticMarketingForecasts2019.pdf

Special Report: Top Brands are Sending $2.6 Billion to Misinformation Websites Each Year (NewsGuard, 2021); https://www.newsguardtech.com/special-reports/brands-send-billions-to-misinformation-websites-newsguard-comscore-report/ .

Romer, P. A tax that could fix big tech. The New York Times (6 May 2019); https://www.nytimes.com/2019/05/06/opinion/tax-facebook-google.html .

Love, J. & Cooke, K. Google, Facebook move to restrict ads on fake news sites. Reuters https://www.reuters.com/article/us-alphabet-advertising/google-facebook-move-to-restrict-ads-on-fake-news-sites-idUSKBN1392MM (2016).

Hsu, T. & Tracy, M. Investors push Home Depot and Omnicom to steer ads from misinformation. The New York Times (18 January 2021); https://www.nytimes.com/2021/01/18/business/media/investors-push-home-depot-and-omnicom-to-steer-ads-from-misinformation.html .

Grant, N. & Myers, S. L. Google promised to defund climate lies, but the ads keep coming. The New York Times (2 May 2023); https://www.nytimes.com/2023/05/02/technology/google-youtube-disinformation-climate-change.html .

Ryan-Mosley, T. Junk websites filled with AI-generated text are pulling in money from programmatic ads. MIT Technology Review https://www.technologyreview.com/2023/06/26/1075504/junk-websites-filled-with-ai-generated-text-are-pulling-in-money-from-programmatic-ads/ (2023).

Milmo, D. & Hern, A. Elections in UK and US at risk from AI-driven disinformation, say experts. The Guardian (20 May 2023); https://www.theguardian.com/technology/2023/may/20/elections-in-uk-and-us-at-risk-from-ai-driven-disinformation-say-experts .

Gomes Ribeiro, B., Horta Ribeiro, M., Almeida, V. & Meira, W. Analyzing the “sleeping giants” activism model in Brazil. In Proc. 14th ACM Web Science Conference https://doi.org/10.1145/3501247.3531563 (ACM, 2022).

Hirschman, A. O. Exit, Voice, and Loyalty: Responses to Decline in Firms, Organizations, and States (Harvard Univ. Press, 1970).

Gans, J. S., Goldfarb, A. & Lederman, M. Exit, tweets, and loyalty. Am. Econ. J. Microeconomics 13 , 68–112 (2021).

Hsu, T. Twitter’s advertisers pull back as layoffs sweep through company. The New York Times (4 November 2022); https://www.nytimes.com/2022/11/04/technology/twitter-advertisers.html .

Hsu, T. & Lutz, E. More than 1,000 companies boycotted Facebook. Did it work? The New York Times (1 August 2020); https://www.nytimes.com/2020/08/01/business/media/facebook-boycott.html .

Han, C., Kumar, D. & Durumeric, Z. On the infrastructure providers that support misinformation websites. In Proc. 16th International AAAI Conference on Web and Social Media https://ojs.aaai.org/index.php/ICWSM/article/view/19292/19064 (Association for the Advancement of Artificial Intelligence, 2022).

Papadogiannakis, E., Papadopoulos, P., Markatos, E. P. & Kourtellis, N. Who funds misinformation? A systematic analysis of the ad-related profit routines of fake news sites. Preprint at https://doi.org/10.48550/arXiv.2202.05079 (2023).

Bozarth, L. & Budak, C. An analysis of the partnership between retailers and low-credibility news publishers. J. Quant. Descr. Digit. Media https://doi.org/10.51685/jqd.2021.010 (2021).

Bozarth, L. & Budak, C. Market forces: quantifying the role of top credible ad servers in the fake news ecosystem. In Proc. 15th International AAAI Conference on Web and Social Media https://ojs.aaai.org/index.php/ICWSM/article/view/18043/17846 (Association for the Advancement of Artificial Intelligence, 2021).

Kohno, T., Zeng, E. & Roesner, F. Bad news, clickbait and deceptive ads on news and misinformation websites. In Workshop on Technology and Consumer Protection https://badads.cs.washington.edu/files/Zeng-ConPro2020-BadNews.pdf (2020).

Braun, J. A. & Eklund, J. L. Fake news, real money: ad tech platforms, profit-driven hoaxes, and the business of journalism. Digit. Journal. https://doi.org/10.1080/21670811.2018.1556314 (2019).

Du, S., Bhattacharya, C. B. & Sen, S. Corporate social responsibility and competitive advantage: overcoming the trust barrier. Manage. Sci. 57 , 1528–1545 (2011).

Bellman, S., Abdelmoety, Z. H., Murphy, J., Arismendez, S. & Varan, D. Brand safety: the effects of controversial video content on pre-roll advertising. Heliyon 4 , e01041 (2018).


Liaukonyte, J., Tuchman, A. & Zhu, X. Frontiers: spilling the beans on political consumerism: do social media boycotts and buycotts translate to real sales impact? Market. Sci. https://doi.org/10.1287/mksc.2022.1386 (2022).

Chatterji, A. K. & Toffel, M. W. Assessing the impact of CEO activism. Organ. Environ. 32 , 159–185 (2019).

Lull, R. B. & Bushman, B. J. Do sex and violence sell? A meta-analytic review of the effects of sexual and violent media and ad content on memory, attitudes, and buying intentions. Psychol. Bull. 141 , 1022–1048 (2015).

Gaskell, G., Veltri, G. A., Lupianez-Villanueva, F., Folkvord, F. & Theben, A. The impact of online platform transparency of information on consumers’ choices. Behav. Public Policy 7 , 55–82 (2020).

Doshi, A. R., Dowell, G. W. & Toffel, M. W. How firms respond to mandatory information disclosure. Strat. Manage. J. 34 , 1209–1231 (2013).

Bettinger, E., Cunha, N., Lichand, G. & Madeira, R. Are the Effects of Informational Interventions Driven by Salience? Working paper (Univ. of Zurich, Department of Economics, 2021); https://www.econ.uzh.ch/apps/workingpapers/wp/econwp350.pdf .

Hauser, D. J. & Schwarz, N. It’s a trap! Instructional manipulation checks prompt systematic thinking on “tricky” tasks. SAGE Open https://doi.org/10.1177/2158244015584617 (2015).

Cabral, L. & Hortaçsu, A. The dynamics of seller reputation: evidence from Ebay. J. Ind. Econ. 58 , 54–78 (2010).

Link, S., Peichl, A., Roth, C. & Wohlfart, J. Information frictions among firms and households. J. Monet. Econ. 135 , 99–115 (2023).

Coibion, O., Gorodnichenko, Y. & Kumar, S. How do firms form their expectations? New survey evidence. Am. Econ. Rev. 108 , 2671–2713 (2018).

Perloff, L. S. & Brickman, P. False consensus and false uniqueness: biases in perceptions of similarity. Acad. Psychol. Bull. 4 , 475–494 (1982).

Kim, H. The value of competitor information: evidence from a field experiment. Acad. Manage. Proc. https://doi.org/10.5465/AMBPP.2021.12714abstract (2021).

Hanna, R., Mullainathan, S. & Schwartzstein, J. Learning through noticing: theory and evidence from a field experiment. Q. J. Econ. 129 , 1311–1353 (2014).

Ocasio, W. Towards an attention-based view of the firm. Strat. Manage. J. 18 , 187–206 (1997).

Ada, S., Nabout, N. A. & Feit, E. M. Context information can increase revenue in online display advertising auctions: evidence from a policy change. J. Market. Res. 59 , 1040–1058 (2021).

Holden, R. Google Travel: Find flights with lower carbon emissions. Google Blog https://blog.google/products/travel/find-flights-with-lower-carbon-emissions/ (2021).

Spampatti, T., Hahnel, U. J., Trutnevyte, E. & Brosch, T. Short and long-term dominance of negative information in shaping public energy perceptions: the case of shallow geothermal systems. Energy Pol. 167, 113070 (2022).

Boudreau, K. & Hagiu, A. Platforms, Markets, and Innovation (Edward Elgar, 2009).

Rietveld, J., Seamans, R. & Meggiorin, K. Market orchestrators: The effects of certification on platforms and their complementors. Strat. Sci. 6 , 244–264 (2021).

Horwitz, J. & Hagey, K. Google’s secret ‘Project Bernanke’ revealed in Texas antitrust case. Wall Street Journal (11 April 2021); https://www.wsj.com/articles/googles-secret-project-bernanke-revealed-in-texas-antitrust-case-11618097760 .

Remeikis, A. Why is Labor’s bill on combatting disinformation so controversial? The Guardian (1 October 2023); https://www.theguardian.com/australia-news/2023/oct/01/why-is-labors-bill-on-combatting-disinformation-so-controversial

Faiola, A. & Kirchner, S. How do you stop fake news? In Germany, with a law. The Washington Post (5 April 2017) https://www.washingtonpost.com/world/europe/how-do-you-stop-fake-news-in-germany-with-a-law/2017/04/05/e6834ad6-1a08-11e7-bcc2-7d1a0973e7b2_story.html .

Johnson, M. Regulation by shaming: deterrence effects of publicizing violations of workplace safety and health laws. Am. Econ. Rev. 110 , 1866–1904 (2020).

Jin, G. Z. & Leslie, P. Reputational Incentives for restaurant hygiene. Am. Econ. J. Microeconomics 1 , 237–267 (2009).

Chopra, F., Haaland, I. & Roth, C. Do people demand fact-checked news? Evidence from U.S. Democrats. J. Public Econ. 205 , 104549 (2022).

Allen, J., Howland, B., Mobius, M., Rothschild, D. & Watts, D. J. Evaluating the fake news problem at the scale of the information ecosystem. Sci. Adv. https://doi.org/10.1126/sciadv.aay3539 (2020).

Chiou, L. & Tucker, C. Fake News and Advertising on Social Media: A Study of the Anti-Vaccination Movement Working Paper 25223 (National Bureau of Economic Research, 2018); http://www.nber.org/papers/w25223.pdf .

Doshi, A. R. & Schmidt, W. Soft governance across digital platforms using transparency. Strat. Sci. https://doi.org/10.1287/stsc.2023.0006 (2024).

Lougee, B. & Wallace, J. The corporate social responsibility (CSR) trend. J. Appl. Corp. Finance 20 , 96–108 (2008).

Bhadani, S. et al. Political audience diversity and news reliability in algorithmic ranking. Nat. Hum. Behav. 6 , 495–505, (2022).

Moore, R. C., Dahlke, R. & Hancock, J. T. Exposure to untrustworthy websites in the 2020 US election. Nat. Hum. Behav. 7 , 1096–1105 (2023).

Decker, B. Adversarial narratives: a new model for disinformation. Global Disinformation Index https://www.disinformationindex.org/research/2019-4-1-adversarial-narratives-a-new-model-for-disinformation/ (2019).

Grinberg, N., Friedland, L., Swire-Thompson, B. & Lazer, D. Fake news on Twitter during the 2016 U.S. presidential election. Science 363 , 374–378 (2019).

Allcott, H., Gentzkow, M. & Yu, C. Trends in the diffusion of misinformation on social media. Res. Politics https://doi.org/10.1177/2053168019848554 (2019).

Coppock, A. & McClellan, O. A. Validating the demographic, political, psychological, and experimental results obtained from a new source of online survey respondents. Res. Politics https://doi.org/10.1177/2053168018822174 (2019).

Cao, X. & Zhang, J. Preference learning and demand forecast. Market. Sci, 40 , 62–79 (2021).

Grigorieff, A., Roth, C. & Ubfal, D. Does information change attitudes toward immigrants? Demography 57 , 1117–1143 (2020).

Haaland, I. & Roth, C. Labor market concerns and support for immigration. J. Public Econ. 191 , 104256 (2020).

Falk, A. et al. Global evidence on economic preferences. Q. J. Econ. 133 , 1645–1692 (2018).

Guzman, J., Oh, J. J. & Sen, A. What motivates innovative entrepreneurs? Evidence from a global field experiment. Manage. Sci. 66 , 4808–4819 (2020).

Xiao, X., Su, Y. & Lee, D. K. L. Who consumes new media content more wisely? Examining personality factors, SNS use, and new media literacy in the era of misinformation. Soc. Media Soc. https://doi.org/10.1177/2056305121990635 (2021).

Van der Linden, S., Panagopoulos, C. & Roozenbeek, J. You are fake news: political bias in perceptions of fake news. Media Cult. Soc. 42 , 460–470 (2020).

Garrett, R. K. & Bond, R. M. Conservatives’ susceptibility to political misperceptions. Sci. Adv. https://doi.org/10.1126/sciadv.abf1234 (2021).

Mummolo, J. & Peterson, E. Demand effects in survey experiments: an empirical assessment. Am. Polit. Sci. Rev. 113 , 517–529 (2019).

De Quidt, J., Haushofer, J. & Roth, C. Measuring and bounding experimenter demand. Am. Econ. Rev. 108 , 3266–3302 (2018).

Ong, A. D. & Weiss, D. J. The impact of anonymity on responses to sensitive questions. J. Appl. Soc. Psychol. 30 , 1691–1708 (2000).

John, L. K., Blunden, H., Milkman, K. L., Foschini, L. & Tuckfield, B. The limits of inconspicuous incentives. Organ. Behav. Hum. Decis. Process. 172 , 104180 (2022).

Bursztyn, L., Haaland, I. K., Rao, A. & Roth, C. P. Disguising Prejudice: Popular Rationales as Excuses for Intolerant Expression (Univ. of Warwick, Department of Economics, 2021); https://ideas.repec.org/p/wrk/warwec/1340.html

Song, L. The heterogeneous effects of social media content on racial attitudes. Working paper (2022); https://www.dropbox.com/s/f48vgfadd23226r/TwitterDiversity.pdf?dl=0 .

Mintz, O. & Currim, I. S. What drives managerial use of marketing and financial metrics and does metric use affect performance of marketing-mix activities? J. Market. 77 , 17–40 (2013).

Shah, R. H. & Swaminathan, V. Factors influencing partner selection in strategic alliances: the moderating role of alliance context. Strat. Manage. J. 29 , 471–494 (2008).

Dyer, J. H., Gregersen, H. B. & Christensen, C. Entrepreneur behaviors, opportunity recognition, and the origins of innovative ventures. Strat. Entrep. J. 2 , 317–338 (2008).

Montgomery, D. B., Moore, M. C. & Urbany, J. E. Reasoning about competitive reactions: evidence from executives. Market. Sci. 24 , 138–149 (2005).

Alekseev, G. et al. The effects of COVID-19 on U.S. small businesses: evidence from owners, managers, and employees. Manage. Sci. 69 , 7–24 (2022).

Bessen, J., Impink, S. M., Reichensperger, L. & Seamans, R. The role of data for AI startup growth. Res. Policy 51 , 104513 (2022).

Kerr, S. P., Kerr, W. R. & Dalton, M. Risk attitudes and personality traits of entrepreneurs and venture team members. Proc. Natl Acad. Sci. USA 116 , 17712–17716 (2019).

Bertrand, M. & Schoar, A. Managing with style: the effect of managers on firm policies. Q. J. Econ. 118 , 1169–1208 (2003).

Porter, M. E. Competitive Strategy Techniques for Analyzing Industries and Competitors (Free Press, 1980).

Women in the Workplace 2022 (LeanIn.org and McKinsey, 2022); https://womenintheworkplace.com/2022 .

Women in Business 2021: A Window of Opportunity (Grant Thornton, 2021); https://www.grantthornton.global/globalassets/1.-member-firms/global/insights/women-in-business/2021/grant-thornton-women-in-business-report-2021.pdf .

Capozza, F., Haaland, I., Roth, C. & Wohlfart, J. Studying information acquisition in the field: a practical guide and review. Working paper (Department of Economics, University of Copenhagen, 2021); https://www.econstor.eu/bitstream/10419/258958/1/cebi-wp2115.pdf .


Acknowledgements

The authors thank their data partners (NewsGuard, the GDI and Oracle) for sharing data; the executive education programmes at the Stanford Graduate School of Business and Heinz College at Carnegie Mellon University for partnership; T. Le for providing research assistance on part of this work; G. Jin, J. Wu, M. Liu, M. Collis, M. Gentzkow, P. Schwardmann, R. J. Duran, R. Moore, R. Appel, S. Agarwal, S. Borwankar and W. Lee for feedback and comments on earlier versions of this work; and participants at the Queen’s Workshop on the Economics of Media 2022, 2022 MIT Conference on Digital Experimentation, Workshop on Information Systems Economics 2022, Rotman School of Management PhD Strategy Seminar 2023, ISMS Marketing Science Conference 2023, The Sumantra Ghoshal Conference at London Business School 2023, Platform Strategy Research Symposium at Boston University 2023, NBER SI 2023 Digital Economics and Artificial Intelligence, Social Science Research Council Workshop on the Economics of Social Media 2023, West Coast Research Symposium 2023, Stanford University Trust and Safety Research Conference 2023, Conference on Information Systems and Technology 2023, HKU Business School Management and Strategy Seminar 2023, and Boston University Online Research Seminar on Digital Businesses 2024. This research was supported in part by the Stanford Digital Economy Lab, Stanford McCoy Family Center for Ethics in Society, Stanford Impact Labs, Stanford Technology Ventures Program, Project Liberty Institute and the Economics of Digital Services initiative at the University of Pennsylvania.

Author information

Authors and Affiliations

Department of Management Science and Engineering, Stanford University, Stanford, CA, USA

Wajeeha Ahmad & Charles Eesley

Heinz College of Information Systems and Public Policy, Carnegie Mellon University, Pittsburgh, PA, USA

Institute for Human-Centered Artificial Intelligence, Stanford University, Stanford, CA, USA

Erik Brynjolfsson


Contributions

W.A. conceived the research and collected the descriptive data. W.A. and A.S. designed the survey experiments. W.A. conducted the consumer and decision-maker experiments. W.A. analysed the descriptive and experimental data for all three studies, with A.S. aiding in research conceptualization and data analysis. W.A. wrote the paper with input from A.S. W.A., A.S. and C.E. edited the paper. A.S., E.B. and C.E. supervised the work. All authors secured grant financing and partnerships, made revisions and approved the final manuscript.

Corresponding author

Correspondence to Wajeeha Ahmad .

Ethics declarations

Competing interests.

W.A. was a research intern at Microsoft during summer 2023. The other authors declare no competing interests.

Peer review

Peer review information.

Nature thanks the anonymous reviewer(s) for their contribution to the peer review of this work. Peer review reports are available.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data figures and tables

Extended Data Fig. 1 Text explanation clustering by randomized treatment group.

Notes: This figure plots regression coefficients from OLS regressions of an indicator for cluster membership on each randomized group. Results are shown for our primary model specification, which controls for participants’ demographic and behavioural characteristics (see Supplementary Information, “Analysis: Consumer study outcomes”). Data are presented as coefficients, with horizontal bars representing 95% confidence intervals derived from robust standard errors. The topics along the y-axes are binary variables that take the value 1 if a participant’s response is classified into the given topic and zero otherwise. Details about the text analyses are provided in the Supplementary Information, and sample text responses are shown in Tables A1 and A2. Panel (a) shows OLS regression results for text analysis of the open-ended reasons participants gave when explaining their choice of gift card (n = 4,039). Panel (b) shows OLS regression results for text analysis of the open-ended reasons participants gave when explaining their choice of online petition to sign (n = 4,039).
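
A minimal sketch of how such a regression could be run (assumed column names and a simplified set of controls; not the authors' exact specification):

```python
# OLS of a binary topic-cluster indicator on treatment-group dummies with
# demographic/behavioural controls and heteroskedasticity-robust (HC1) SEs.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("giftcard_text_clusters.csv")  # hypothetical file, n = 4,039

model = smf.ols(
    "in_cluster ~ C(treatment_group) + age + gender + education + news_frequency",
    data=df,
).fit(cov_type="HC1")

# The 95% confidence intervals correspond to the horizontal bars in the figure.
print(model.params)
print(model.conf_int(alpha=0.05))
```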

Extended Data Fig. 2 Weights Assigned by Treated Participants to their Initial and Final Gift Card Choices.

Notes: This figure shows the distribution of weights assigned by treated participants (i.e. those in randomized treatments T1, T3 or T4) to their top choice gift card before and after receiving the information treatment. The mean weight drops from 39.11 to 23.71 after receiving the information treatment, representing a 39.4% decline in the mean gift card value. The median weight drops from 35.0 to 20.0 after receiving the information treatment, representing a 42.9% decline in the median gift card value.
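
The reported percentage declines follow directly from the means and medians given above; a two-line check:

```python
# Declines in treated participants' top-choice gift card weights.
mean_before, mean_after = 39.11, 23.71
median_before, median_after = 35.0, 20.0

print(round(100 * (mean_before - mean_after) / mean_before, 1))        # 39.4
print(round(100 * (median_before - median_after) / median_before, 1))  # 42.9
```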

Extended Data Fig. 3 Delta Values by Treatment Group and Type of Voice Outcome.

Notes: The delta value reported here represents the difference between the proportion of a particular outcome variable and the proportion of actual recorded signatures for each treatment group.

Extended Data Fig. 4 Verified job titles of participants in our second survey experiment by category.

Notes: This figure shows the job titles of the sub-sample of participants (N = 316) whose job titles we were able to verify from external sources, e.g., LinkedIn, Crunchbase, etc. The size of each word corresponds to its frequency of appearance in our sample.

Supplementary information


This file contains Supplementary Methods (including more details about the design of the survey experiments and the method for analysing participants' open-text responses), Supplementary Tables and Figures (including descriptive statistics, additional analyses, and robustness checks).

Reporting Summary

Peer Review File

Source Data Fig. 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Ahmad, W., Sen, A., Eesley, C. et al. Companies inadvertently fund online misinformation despite consumer backlash. Nature 630 , 123–131 (2024). https://doi.org/10.1038/s41586-024-07404-1


Received: 07 July 2023

Accepted: 09 April 2024

Published: 05 June 2024

Issue Date: 06 June 2024

DOI: https://doi.org/10.1038/s41586-024-07404-1


Once a Sheriff’s Deputy in Florida, Now a Source of Disinformation From Russia

In 2016, Russia used an army of trolls to interfere in the U.S. presidential election. This year, an American given asylum in Moscow may be accomplishing much the same thing all by himself.


By Steven Lee Myers

Steven Lee Myers spoke to more than a dozen researchers and government officials for this article.

A dozen years ago, John Mark Dougan, a former deputy sheriff in Palm Beach County, Fla., sent voters an email posing as a county commissioner, urging them to oppose the re-election of the county’s sheriff.


He later masqueraded online as a Russian tech worker with a pseudonym, BadVolf, to leak confidential information in violation of state law, fooling officials in Florida who thought they were dealing with a foreigner.

He also posed as a fictional New York City heiress he called Jessica, tricking an adviser to the Palm Beach County Sheriff’s Office into divulging improper conduct by the department.

“And boy, did he ever spill ALL of the beans,” Mr. Dougan said in a written response to questions for this article, in which he confirmed his role in these episodes.

Those subterfuges in the United States, it turned out, were only a prelude to a more prominent and potentially more ominous campaign of deception he has been conducting from Russia.

Mr. Dougan, 51, who received political asylum in Moscow, is now a key player in Russia’s disinformation operations against the West. Back in 2016, when the Kremlin interfered in the American presidential election, an army of computer trolls toiled for hours in an office building in St. Petersburg to try to fool Americans online.

Today Mr. Dougan may be accomplishing much the same task largely by himself, according to American and European government officials and researchers from companies and organizations that have tracked his activities since August. The groups include NewsGuard, a company that reviews the reliability of news and information online; Recorded Future, a threat intelligence company; and Clemson University’s Media Forensics Hub.

Working from an apartment crowded with servers and other computer equipment, Mr. Dougan has built an ever-growing network of more than 160 fake websites that mimic news outlets in the United States, Britain and France.

With the help of commercially available artificial intelligence tools, including OpenAI’s ChatGPT and DALL-E 3, he has filled the sites with tens of thousands of articles, many based on actual news events. Interspersed among them are also bespoke fabrications that officials in the United States and European Union have attributed to Russian intelligence agencies or the administration of President Vladimir V. Putin.

Between September and May, Mr. Dougan’s outlets have been cited or referred to in news articles or social media posts nearly 8,000 times, and seen by more than 37 million people in 16 languages, according to a report released Wednesday by NewsGuard .

The fakes have recently included a baseless article on a fake San Francisco Chronicle website that said Ukraine’s president, Volodymyr Zelensky, had smuggled 300 kilograms of cocaine from Argentina. Another false narrative appeared last month in the sham Chronicle and on another site, called The Boston Times, claiming that the C.I.A. was working with Ukrainians to undermine Donald J. Trump’s presidential campaign.

Mr. Dougan, in a series of text exchanges and one telephone interview with The New York Times, denied operating the sites. A digital trail of clues, including web domains and internet protocol addresses, suggests otherwise, the officials and researchers say.

A friend in Florida who has known Mr. Dougan for 20 years, Jose Lambiet, also said in a telephone interview that Mr. Dougan told him in January that he had created the sites.

Steven Brill, a founder of NewsGuard, which has spent months tracking Mr. Dougan’s work, said he represented “a massive incursion into the American news ecosystem.”

“It’s not just some guy sitting in his basement in New Jersey tapping out a phony website,” he added.

Mr. Dougan’s emergence as a weapon of the Kremlin’s propaganda war follows a troubled life in the United States that included home foreclosures and bankruptcy. As a law enforcement officer in Florida and Maine, he faced accusations of excessive use of force and sexual harassment that resulted in costly lawsuits against the departments he worked for.

He faces an arrest warrant in Florida — its records sealed by court order — on 21 felony charges of extortion and wiretapping that resulted from a long-running feud with the sheriff of Palm Beach County.

Mr. Dougan’s activities from Moscow, where he fled in 2016 one step ahead of those charges, continue to draw scrutiny from the authorities in the United States. Last year, he impersonated an F.B.I. agent in a telephone call to Mr. Brill, according to an account by Mr. Brill to be published next week in a new book, “The Death of Truth.”

Mr. Dougan, who acknowledged making the call in a text message this week, had been angered by a NewsGuard report in February 2023 that criticized YouTube for allowing videos parroting Russian propaganda about the war in Ukraine, including some by Mr. Dougan.

In a rambling, profanity-laced video in response on YouTube last year, Mr. Dougan posted excerpts from the call with Mr. Brill and showed a Google Earth satellite photograph of his home in Westchester County, a suburb of New York City — “just down the road from the Clinton crime family,” as Mr. Dougan put it, referring to the home of former President Bill Clinton and former Secretary of State Hillary Rodham Clinton.

The call prompted an F.B.I. investigation that, according to Mr. Brill, traced the call to Mr. Dougan’s telephone in Russia. (A spokeswoman for the bureau did not respond to a request for comment on the investigation or Mr. Dougan’s previous activities.)

A History in Law Enforcement

Mr. Dougan began to hone the skills that he is putting to use today during a turbulent childhood in the United States. In the written responses to questions for this article, he said he had struggled at home and in school, bullied because of Tourette’s syndrome, but found a passion in computers. When he was 8, he said, the man who would become his stepfather began teaching him to write computer code.

“By the time I was 16,” he wrote in one response, “I knew a dozen different programming languages.”

After a four-year stint in the Marine Corps, which he claims he offered to join in lieu of a jail sentence for fleeing a police stop for speeding on a motorcycle, he became a police officer first in a small force in Mangonia Park, Fla., and then the Palm Beach County Sheriff’s Office from 2005 to 2009.

According to news reports and his own accounts over the years, Mr. Dougan repeatedly clashed with superiors and colleagues, facing numerous internal investigations that he said were retaliatory because he objected to police misconduct, including instances of racial bias.

In 2009, he moved briefly to Windham, Maine, to work in another small-town police department. There he faced a complaint of sexual harassment that resulted in his dismissal before he completed his probationary period.

Mr. Dougan started a website called WindhamTalk to defend himself. The website foreshadowed others he would create, including one devoted to the Palm Beach County Sheriff’s Office, PBSOTalk.

After moving back to Florida, he used PBSOTalk to torment in particular the department’s elected sheriff, Ric L. Bradshaw, whom he accused of corruption. He posted the unlawful recordings of “Jessica” chatting with a former detective commander, Mark Lewis, who, Mr. Dougan claimed, was investigating the sheriff’s critics, including himself. As Mr. Dougan acknowledged in a video interview last year, it is illegal in Florida to record a telephone conversation without permission.

In a statement, a spokeswoman for the Sheriff’s Office, Therese C. Barbera, said Mr. Dougan was “a wanted felon for cyberstalking using unsubstantiated and fabricated claims that have NO factual basis.”

In February 2016, PBSOTalk posted confidential information about thousands of police officers, federal agents and judges. The next month, F.B.I. agents and local police officers searched Mr. Dougan’s home, seizing all of his electronic equipment.

Fearing arrest, he said, he made his way to Canada and caught a flight to Moscow. He was indicted on the 21 Florida felony charges the next year.

Peddling Russia Propaganda

In Russia, Mr. Dougan refashioned himself as a kind of journalist, documenting his travels around the country, including Lake Baikal in Siberia and Crimea, the peninsula in Ukraine that Russia annexed in 2014 in violation of international law.

He posted photographs and videos from those trips on YouTube, which suspended his channel after NewsGuard’s report last year. He also appeared regularly on state media, including with two former intelligence operatives, Maria Butina , who penetrated Republican political circles, and Anna Chapman , one of 10 spies who inspired the television series “The Americans.”

In 2021, as Mr. Putin began mobilizing the military forces that would invade Ukraine, Mr. Dougan posted a video that the Kremlin would cite as one justification for its attack. In it, he claimed that the United States operated biological weapons factories in Ukraine, an accusation that Russia and its allies have pushed without ever providing evidence .

Once the war started, Mr. Dougan recounted in his written responses to questions, he traveled to Ukraine 14 times to report from the Russian side of the front lines. He appeared in Russian government hearings purporting to expose Ukraine’s transgressions, indicating some level of cooperation with the government authorities.

He has faced criticism for the reports, including in a profile in The Daily Beast, that he posted on YouTube and other platforms. Mr. Dougan has portrayed the war much as Russia’s propaganda has: as a righteous battle against neo-Nazis backed by a decadent West, led by the United States and NATO.

“The West has consistently lied about every aspect of this conflict,” he wrote. “Why does only one side get to tell their story?”

Fake News Sites in the U.S.

In April 2021, Mr. Dougan revived a website called DC Weekly, which had been created four years earlier and published fake articles about the Palm Beach County Sheriff’s Office. According to a report last December by Clemson’s Media Forensics Hub, the domain and internet protocol address were shared by PBSOTalk and Mr. Dougan’s personal website, as well as two marketing books he wrote in exile and a security firm he operated, Falcon Eye Tech, which offered “offshore security monitoring services.”

After Russia’s assault on Ukraine began in 2022, the site carried articles about the war.

Then, last August, the site began to publish articles based on elaborate fabrications that the Western government officials and disinformation researchers said came from Russia’s propaganda units. They often appeared first in videos or audio recordings on obscure X accounts or YouTube channels, then spread to sites like DC Weekly and then to Russian state media as if they were authentic accusations, a process researchers call “narrative laundering.”

The baseless narratives included claims that relatives or cronies of Ukraine’s leader secretly bought luxury properties, yachts or jewelry, and that Prince Andrew, the brother of King Charles III of Britain, had abducted and abused children during a secret visit to Ukraine.

Dozens of new sites have appeared in recent months. They included ones made to look like local news outlets: The Chicago Chronicle, The Miami Chronicle, The Boston Times, The Flagstaff Post and The Houston Post. Some hijacked names of actual news organizations, like The San Francisco Chronicle, or approximated them, in the case of one called The New York News Daily.

When The New York Times reported on the new sites in March, DC Weekly published a lengthy response in a stilted style that indicated the use of artificial intelligence. It was written under the name Jessica Devlin, one of the fictitious journalists on the site. “I’m not a shadowy foreign actor,” the article said.

At the end, the article invited media inquiries at an email address with the domain Falcon Eye Tech.

Two days later, Mr. Dougan answered.

103 New Sites in Two Days

Mr. Dougan, who became a Russian citizen last year and voted in the country’s presidential election in March, said in his messages to The Times that he made a living by selling security devices he designed for a manufacturer in China. He denied being paid by any Russian authorities, claiming he funds his activities himself.

His friend Mr. Lambiet, a private investigator and former journalist, said he considered Mr. Dougan a good man but cautioned that Mr. Dougan had a propensity to make things up. “He’s like a Russian disinformation campaign: It’s hard to know what’s true and what’s not,” he said.

As evidence of Mr. Dougan’s role in the news sites has emerged, he has shifted tactics. Recorded Future, the threat intelligence company, released a report this month that detailed his ties to agencies linked to the Russian disinformation. The report documented the extensive use of A.I., which one of the company’s researchers, Clément Briens, estimated made Mr. Dougan’s work far cheaper than hiring a troll army.

At the time, Recorded Future identified 57 domains that Mr. Dougan had created. In a two-day span after the report was published, 103 new sites appeared, all on a server in California.

“He’s trying to obfuscate the Russian links,” Mr. Briens said.

Mr. Dougan at times treats his activities as a game of cat and mouse. He spent months engaging with a researcher at NewsGuard, McKenzie Sadeghi, revealing details of his life in Moscow while mocking her boss, Mr. Brill.

“He seemed to be toying with me, both to elicit my responses and, it seemed, to show off his talent for global online mischief, without actually admitting anything,” she wrote in the report published on Wednesday.

While Mr. Dougan’s sites have focused on Russian narratives about the war in Ukraine, the researchers and government officials say he has laid the foundation for interference in the unusually large confluence of elections taking place around the world this year.

This suggests a “risk of an expanded operation scope in the near future, potentially targeting diverse audiences and democratic systems in Europe and other Western nations for various strategic objectives,” the diplomatic service of the European Union wrote in a report last month when the network included only 23 websites.

In recent weeks, the sites have included themes that seem intended to stoke the partisan fires in the United States before November’s presidential election.

Last month, articles appeared on two of Mr. Dougan’s newer fake sites, The Houston Post and The Flagstaff Post, detailing a baseless claim that the F.B.I. had planted an eavesdropping device in Mr. Trump’s office at Mar-a-Lago in Florida.

Some of the new sites have names, like Right Review and Red State Report, that suggest a conservative political bent. In April, a site that researchers also linked to Mr. Dougan offered “major cryptocurrency rewards” for leaks of information about American officials, singling out two prosecutors and a judge involved in the criminal cases against Mr. Trump.

“If the site was mine,” he wrote in response to a question about it, “I would want people to give documents on any dirty politician, Republican, Democrat or other.”


Ukraine-Russia war latest: French instructors in Ukraine would be 'legitimate target', Lavrov says on visit to West Africa

Sergei Lavrov, the Russian foreign minister, is on a tour of West Africa as part of a diplomatic push by the isolated Kremlin to forge new ties around the world. Meanwhile, an upcoming summit on Ukraine will reportedly aim to create a pathway for Russian officials to join future talks.

Wednesday 5 June 2024 17:57, UK

Pictured: Russia's Foreign Minister Sergei Lavrov attends a meeting with Burkina Faso's Foreign Minister Karamoko Jean Marie Traore in Ouagadougou, Burkina Faso, on 4 June 2024. Pic: Russian Foreign Ministry/Reuters

  • Lavrov: French military instructors in Ukraine would be 'legitimate target'
  • Ivor Bennett: Why is Lavrov in Africa?
  • Ukraine peace summit 'opens door to limited talks with Russia'
  • Remote-control stretchers on trial in Ukraine
  • Big picture:  Everything you need to know about the war right now
  • Mapped: The territorial situation on the frontline today
  • Your questions answered: Are there any signs of an underground resistance in Russia?
  • Live reporting by Guy Birchall

Ukraine has used American weapons to strike inside Russia in recent days, according to a Western official.

The weapons were used under recently approved guidance from Joe Biden, allowing US arms to be used to strike inside Russia in defence of Kharkiv, Ukraine's second city.

The official spoke to Reuters on condition of anonymity.

Mr Biden's directive allows for US-supplied weapons to be used to strike Russian forces that are attacking or preparing to attack.

It does not change US policy that directs Ukraine not to use American-provided tactical or long-range missiles and other munitions to make offensive strikes inside Russia, US officials said.

Ukrainian officials had stepped up calls on the US to allow Kyiv's forces to defend themselves against attacks originating from Russian territory.

Kharkiv is 12 miles from the Russian border and has recently come under intensified Russian attack.

Volodymyr Zelenskyy has arrived in Qatar for talks with the state's emir, Sheikh Tamim bin Hamad Al Thani.

The Ukrainian president said on X that he planned to discuss Qatar's participation in a process of returning Ukrainian children abducted by Russia, as well as bilateral economic and security issues.

In March, Ukraine and Russia exchanged six children via Qatari mediation.

Earlier this week, Mr Zelenskyy made a surprise trip to the Philippines to thank the country for agreeing to participate in the upcoming peace summit being held in Switzerland.

A Russian-American man has been sentenced by a St Petersburg court to three-and-a-half years in prison on charges of "rehabilitating Nazism". 

Yuri Malev was arrested in December over social media posts in which he was alleged to have denigrated the Saint George's ribbon, a Russian military symbol of valour. 

One post reportedly contained "obscene language" and another showed a picture of a corpse wearing the ribbon, captioned: "How to wear the Saint George's ribbon correctly". 

The court in St Petersburg said this showed disrespect for society and insulted the memory of the Great Patriotic War (the Russian name for the Second World War). 

Malev admitted guilt, according to the court. 

He was a graduate of the law faculty of St Petersburg University and had lived in the United States since 1991, according to independent Russian-language media.

Baza, a Telegram channel with links to Russian authorities, said Malev was a resident of Brooklyn, New York. 

He reportedly entered Russia by bus from Estonia two weeks before he was arrested. 

Moscow routinely refers to the government in Kyiv as a "Nazi regime", despite Volodymyr Zelenskyy being Jewish.

Ukraine has been allowed to shoot down Moscow's planes over Russian territory with American weapons since the war broke out, the White House has clarified.

Ukraine "can shoot down Russian aeroplanes that pose an impending threat", national security spokesman John Kirby said. 

"And they have. They have since the beginning of the war."

The clarification was given due to confusion over Washington’s recent decision to relax rules on US-supplied weapons striking military targets on Russian soil.

Joe Biden gave authorisation for the strikes on a limited basis to help Kyiv defend itself against Kremlin forces advancing towards Kharkiv.

Mr Kirby said he could not confirm reports that Ukraine had used US-supplied weapons on Russian territory for the first time.

He told reporters: "We're just not in a position on a day-to-day basis of knowing exactly what the Ukrainians are firing at what.

"It's certainly at a tactical level. So, I can't confirm that. I can tell you that they understand the guidance that they've been given."

By Ivor Bennett, Moscow correspondent 

Sergei Lavrov's trip to Africa is part of an ongoing diplomatic offensive by the Kremlin, running in parallel to its conflict in Ukraine.

Isolated from the West, Russia is trying to forge new ties and has found fertile ground in Africa.

There have been several coups in recent years that have ushered in anti-Western military juntas.

US troops were kicked out of Niger, for example, while the French had to leave Burkina Faso.

In both cases, Moscow was quick to move in as the new security guarantor, and its efforts clearly don't stop there.

This is the veteran foreign minister Lavrov's ninth visit to the continent since Russia invaded Ukraine.

Kenya, Burundi and South Africa were among his stops last year; this week it's Guinea, Congo and Burkina Faso.

In return for military support, Russia gains an ally - they may not support the war, but they won't criticise it either.

The Kremlin portrays this as the formation of a new world order, free from Western imperialism and hegemony.

But others say Russia is the neo-colonialist, painting this as a blatant attempt to expand its sphere of influence.

Ukraine's first deputy foreign minister has held talks with his Chinese counterpart in an effort to increase cooperation between the two countries, the Ukrainian ministry said. 

Ukraine's Andriy Sybiha also told Chinese Vice Foreign Minister Sun Weidong that he hoped China would participate in a Ukraine-led peace summit later in June. 

Mr Sybiha added that it could be "a good opportunity to make a practical contribution to achieving a just and lasting peace", the Ukrainian foreign ministry said.

A report earlier today (see 7.40am post) said that June's peace summit opened doors to "limited talks with Russia" - despite Russian officials not being invited.

Earlier we brought you news that Ukraine said it had shot down 22 of the 27 Shahed-type drones launched by Russia overnight (see 8.04am post). 

Now photographs have emerged of the aftermath of one of the strikes. 

Firefighters work to put out the massive blaze in the Poltava region.

As Russia opens a new front on Ukraine's northeastern border, the war has entered an important phase.

Readers have been sending in their questions to our senior correspondents and military experts for their take on the changing battlefield environment.

Today, Trevor Prew asks:

Are there any signs of an underground Russian resistance operating inside Russia, or can Russians openly criticise Putin, as long as they don't protest on the streets or mention the war?

Russia correspondent Ivor Bennett says:

In a word, no.

There wasn't much opposition to speak of in Russia even before the war, but now there's nothing left whatsoever.

All of Putin's political opponents are either exiled, jailed or dead, as are those with any connections to Alexei Navalny.

Those who dare to speak out are silenced.

At one end are the long-time critics and opposition activists, like Vladimir Kara-Murza, the dual Russian-British national opposition figure who is serving a 25-year prison sentence for treason.

But at the other end are ordinary Russians, like the former schoolteacher Nikita Tushkanov, who was sentenced to five-and-a-half years for comments he made online about Putin.

The crackdown on dissent seemingly knows no bounds and it's created a climate in which those who oppose the war are terrified to speak out.

They do exist - as evidenced by the huge turnout for Navalny's funeral. But that was a unique moment and is unlikely to be repeated anytime soon.

A unit of a Spanish firm that is refurbishing Leopard tanks for delivery to Ukraine suffered a cyber attack that took its website down, a pro-Russian hacking group said.

A spokesperson for the aerospace and defence company General Dynamics, of which the Spain-based Santa Barbara Systems is a part, said it was still analysing the cause of the website outage.

The company added that all of its operations in Europe were running normally.

The NoName hacking group claimed responsibility for the distributed denial-of-service (DDoS) attack on Telegram. 

DDoS attacks direct high volumes of internet traffic towards targeted servers to knock them offline. 

"We sent our DDoS missiles against websites in Russophobic Spain," the group, which often directs such actions against countries which support Ukraine, wrote on Telegram.

NATO said last month that Russia was behind an intensifying campaign of hybrid attacks on companies and infrastructure in member states, an accusation Russia dismissed as "misinformation". 

Santa Barbara assembles heavy vehicles such as Leopard tanks and artillery equipment for the Spanish army and has been involved in refurbishing Spain's mothballed Leopard tanks for delivery to the Ukrainian army, according to the defence ministry.

Last week, Spain pledged €1bn (£850.5m) in military support for Ukraine this year.

An update on our previous posts on Russian foreign minister Sergei Lavrov's trip to Africa. 

Mr Lavrov announced that Moscow will send additional military supplies and instructors to Burkina Faso to help the west African country boost its defence capabilities, Russian state media reported.

Burkina Faso, under military leadership since a 2022 coup, has played host to contingents of the Wagner mercenary force, whose founder Yevgeny Prigozhin was killed in a plane crash last August.

"From the very first contacts between our countries after President [Ibrahim] Traore came to power, we have been very closely engaged in all areas of cooperation, including the development of military and military-technical ties," TASS news agency cited Lavrov as saying. 

"I have no doubt that thanks to this cooperation, the remaining pockets of terrorism on the territory of Burkina Faso will be destroyed," Mr Lavrov said. 

Russia's foreign minister has made a series of visits to Africa since the start of the war in Ukraine as Russia, hit by Western sanctions, seeks new trade partners and tries to rally developing countries behind its vision of a "multipolar world" no longer dominated by the US and former European powers. 

Growing Russian security ties with Africa, including countries such as Mali, Burkina Faso and Niger where military leaders have seized power in coups, are a source of concern to the US and other Western governments. 

Separately, the RIA news agency reported on Wednesday that Russian aluminium giant Rusal is in negotiations with the government of Sierra Leone on a bauxite mining concession. 


COMMENTS

  1. IDEO

    Leading Global Design & Innovation Firm Transforming Businesses. IDEO's Mission: to Create a Sustainable, Equitable & Prosperous Future.

  2. What Is a Research Design

    A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about: Your overall research objectives and approach. Whether you'll rely on primary research or secondary research. Your sampling methods or criteria for selecting subjects. Your data collection methods.

  3. What is a Research Design? Definition, Types, Methods and Examples

    A research design is defined as the overall plan or structure that guides the process of conducting research. Learn more about research design types, methods and examples. ... Experimental Design: A pharmaceutical company conducts a randomized controlled trial (RCT) to test the efficacy of a new drug. Participants are randomly assigned to two ...

  4. IDEO Design Research

    Complete independent design research activities. Contribute your ideas, expertise, or feedback via paid research surveys, usability tests, or written and multimedia diary studies. Experience a live prototype of a product, service, or experience. Walk through and provide feedback on life-size mockups of designs like voting booths, airplane cabins ...

  5. Research Design for Business

    Research design is the overall strategy (or research methodology) used to carry out a study. It defines the framework and plan to tackle established problems and/or questions through the collection, interpretation, analysis, and discussion of data.

  6. What Is Research Design? 8 Types + Examples

    Research design refers to the overall plan, structure or strategy that guides a research project, from its conception to the final analysis of data. Research designs for quantitative studies include descriptive, correlational, experimental and quasi-experimental designs. Research designs for qualitative studies include phenomenological ...

  7. The Four Types of Research Design

    Research Design Examples. Let's explore how leading brands employ different types of research design. In most cases, companies combine several methods to reach a comprehensive overview of a problem and find a solution. UnderArmour doubled its market share among running shoes by referring to diagnostic and ...

  8. What is Design Research?

    What is Design Research? Design research is the practice of gaining insights by observing users and understanding industry and market shifts. For example, in service design it involves designers' using ethnography—an area of anthropology—to access study participants, to gain the best insights and so be able to start to design popular ...

  9. Evidence-driven Design®

    The way a customer uses your product is the way they perceive your brand. Our evidence-driven design approach creates inspired experiences that reflect the brand behind them. Evidence is stronger than opinions. From identifying the business opportunity to specifying the very last design detail, research leads to superior and more confident ...

  10. Smart Design

    Smart is a strategic design company that helps people live better and work smarter. Launching powerful products and experiences is our specialty. Smart Design. Projects; ... Research and insights; Business and org design; Product and service design; Engineering and technology; Industries. Technology; Consumer goods; Financial services ...

  11. IBM Design Research

    Design research guides teams to uncover insights and inform the experiences we create. It begins with the rigorous study of the people we serve and their context. This is the heart of Enterprise Design Thinking. While in the Loop, design research leads teams to continuously build understanding and empathy through observation, prototyping ...

  12. Research Design

    Step 1: Consider your aims and approach. Step 2: Choose a type of research design. Step 3: Identify your population and sampling method. Step 4: Choose your data collection methods. Step 5: Plan your data collection procedures. Step 6: Decide on your data analysis strategies. Frequently asked questions.

  13. Research Design: What it is, Elements & Types

    Research design is the framework of research methods and techniques chosen by a researcher to conduct a study. The design allows researchers to sharpen the research methods suitable for the subject matter and set up their studies for success. Creating a research topic explains the type of research (experimental, survey research, correlational ...

  14. 5 Best Design and Innovation Consulting Firms in 2023

    UX studio is one of the global innovation design consulting firms focusing on UX design, research, and strategic consulting. We checked the top design consulting firms' competency level, experience, and portfolio to define the most reliable design thinking companies you can trust and partner up with in 2024.

  15. Research Design

    Research design: The research design will be a quasi-experimental design, with a pretest-posttest control group design. ... Business: Research design is used in the field of business to investigate consumer behavior, marketing strategies, and the impact of different business practices. Researchers use various designs, such as survey research ...

  16. (PDF) Basics of Research Design: A Guide to selecting appropriate

    for validity and reliability. Design is basically concerned with the aims, uses, purposes, intentions and plans within the practical constraint of location, time, money and the researcher's ...

  17. What is Research Design and Why is it Important for Businesses

    Advantages of Research Design. Provides a systematic approach: Research design in business offers a structured and systematic approach to conducting research. It helps the researchers organize their thoughts, select appropriate research methods, and develop a plan to collect and analyze data.

  18. User Research and Design Services: Think Design

    User/design research can help you prepare for that future. Insights into what exactly shapes users' and customers' experience can empower us to delight them with relevant solutions along their journeys. Research, by leading to those insights, drives us to create products which are relevant, accessible and applicable for the people we work with.

  19. Imposing Additional Costs on Russia for Its Continued War Against

    JOINT STOCK COMPANY DESIGN CENTER SOYUZ is a Russian electronics entity that specializes in the design and development of semiconductors and integrated circuits. OJSC SCIENTIFIC RESEARCH INSTITUTE OF PRECISION MECHANICAL ENGINEERING is a Russian electronics entity that develops process equipment for Russia's electronic industry, including for ...

  20. Treasury Sanctions Impede Russian Access to Battlefield Supplies and

    M.V. Frunze Arsenal Design Bureau Joint Stock Company is a military contractor that specializes in the development of space remote sensing systems. Joint Stock Company Research and Production Corporation Precision Systems and Instruments (NPK SPP) manufactures electronics for space complexes. NPK SPP won a contract from Russia's Ministry of ...

  21. Targeting Russia's Defense Establishment

    Joint Stock Company Special Design Bureau Turbina is a Russian defense entity that produces engines for Russia's armored vehicles, missiles, and artillery systems. Public Joint Stock Company Research and Production Corporation Istok Named After A.I. Shokin is a Russian defense entity that produces electronic warfare systems for Russia's ...

  22. Search Jobs

    Learn about careers at McKinsey by reading profiles, launching a job search, or exploring the firm.

  23. Khrunichev State Research and Production Space Center

    The company designed and produced all Soviet space stations, including Mir. OKB-23, renamed to Salyut Design Bureau, became an independent company in 1988. In 1993, the Khrunichev Plant and the Salyut Design Bureau were joined again to form Khrunichev State Research and Production Space Center.

  24. Google UX Design Professional Certificate

    As the digital world continues to expand, companies recognize that designing good user experiences is a necessity, which is why UX design is a high-growth and in-demand job field. UX designers create and organize products that people interact with daily, like mobile apps, websites, smart watches, and even physical products.

  25. How To Start A Business In 11 Steps (2024 Guide)

    Web Design & Hosting. ... Be sure to do your research, create a solid business plan and pivot along the way. Once you're operational, don't forget to stay focused and organized so you can ...

  26. OpenAI Says It Has Begun Training a New Flagship A.I. Model

    The advanced A.I. system would succeed GPT-4, which powers ChatGPT. The company has also created a new safety committee to address A.I.'s risks. By Cade Metz Reporting from San Francisco OpenAI ...

  27. Companies inadvertently fund online misinformation despite ...

    Our research design contributes to this literature in two key ways by: (1) measuring both types of potential consumer responses—that is, 'exit' and 'voice'—that are theorized in the ...

  28. Once a Sheriff's Deputy in Florida, Now a Source of Disinformation From

    In 2016, Russia used an army of trolls to interfere in the U.S. presidential election. This year, an American given asylum in Moscow may be accomplishing much the same thing all by himself.

  29. Ukraine-Russia war latest: Kremlin responds to claims Russia is

    American companies in particular need to pay more attention to their supply chains to ensure they are not complicit with Russia's evasion of sanctions over Ukraine, Wally Adeyemo said in an ...