Home Blog Design Understanding Data Presentations (Guide + Examples)

Understanding Data Presentations (Guide + Examples)

Cover for guide on data presentation by SlideModel

In this age of overwhelming information, the skill to effectively convey data has become extremely valuable. Initiating a discussion on data presentation types involves thoughtful consideration of the nature of your data and the message you aim to convey. Different types of visualizations serve distinct purposes. Whether you’re dealing with how to develop a report or simply trying to communicate complex information, how you present data influences how well your audience understands and engages with it. This extensive guide leads you through the different ways of data presentation.

Table of Contents

What is a Data Presentation?

What should a data presentation include, line graphs, treemap chart, scatter plot, how to choose a data presentation type, recommended data presentation templates, common mistakes done in data presentation.

A data presentation is a slide deck that aims to disclose quantitative information to an audience through the use of visual formats and narrative techniques derived from data analysis, making complex data understandable and actionable. This process requires a series of tools, such as charts, graphs, tables, infographics, dashboards, and so on, supported by concise textual explanations to improve understanding and boost retention rate.

Data presentations require us to cull data in a format that allows the presenter to highlight trends, patterns, and insights so that the audience can act upon the shared information. In a few words, the goal of data presentations is to enable viewers to grasp complicated concepts or trends quickly, facilitating informed decision-making or deeper analysis.

Data presentations go beyond the mere usage of graphical elements. Seasoned presenters encompass visuals with the art of data storytelling , so the speech skillfully connects the points through a narrative that resonates with the audience. Depending on the purpose – inspire, persuade, inform, support decision-making processes, etc. – is the data presentation format that is better suited to help us in this journey.

To nail your upcoming data presentation, ensure to count with the following elements:

  • Clear Objectives: Understand the intent of your presentation before selecting the graphical layout and metaphors to make content easier to grasp.
  • Engaging introduction: Use a powerful hook from the get-go. For instance, you can ask a big question or present a problem that your data will answer. Take a look at our guide on how to start a presentation for tips & insights.
  • Structured Narrative: Your data presentation must tell a coherent story. This means a beginning where you present the context, a middle section in which you present the data, and an ending that uses a call-to-action. Check our guide on presentation structure for further information.
  • Visual Elements: These are the charts, graphs, and other elements of visual communication we ought to use to present data. This article will cover one by one the different types of data representation methods we can use, and provide further guidance on choosing between them.
  • Insights and Analysis: This is not just showcasing a graph and letting people get an idea about it. A proper data presentation includes the interpretation of that data, the reason why it’s included, and why it matters to your research.
  • Conclusion & CTA: Ending your presentation with a call to action is necessary. Whether you intend to wow your audience into acquiring your services, inspire them to change the world, or whatever the purpose of your presentation, there must be a stage in which you convey all that you shared and show the path to staying in touch. Plan ahead whether you want to use a thank-you slide, a video presentation, or which method is apt and tailored to the kind of presentation you deliver.
  • Q&A Session: After your speech is concluded, allocate 3-5 minutes for the audience to raise any questions about the information you disclosed. This is an extra chance to establish your authority on the topic. Check our guide on questions and answer sessions in presentations here.

Bar charts are a graphical representation of data using rectangular bars to show quantities or frequencies in an established category. They make it easy for readers to spot patterns or trends. Bar charts can be horizontal or vertical, although the vertical format is commonly known as a column chart. They display categorical, discrete, or continuous variables grouped in class intervals [1] . They include an axis and a set of labeled bars horizontally or vertically. These bars represent the frequencies of variable values or the values themselves. Numbers on the y-axis of a vertical bar chart or the x-axis of a horizontal bar chart are called the scale.

Presentation of the data through bar charts

Real-Life Application of Bar Charts

Let’s say a sales manager is presenting sales to their audience. Using a bar chart, he follows these steps.

Step 1: Selecting Data

The first step is to identify the specific data you will present to your audience.

The sales manager has highlighted these products for the presentation.

  • Product A: Men’s Shoes
  • Product B: Women’s Apparel
  • Product C: Electronics
  • Product D: Home Decor

Step 2: Choosing Orientation

Opt for a vertical layout for simplicity. Vertical bar charts help compare different categories in case there are not too many categories [1] . They can also help show different trends. A vertical bar chart is used where each bar represents one of the four chosen products. After plotting the data, it is seen that the height of each bar directly represents the sales performance of the respective product.

It is visible that the tallest bar (Electronics – Product C) is showing the highest sales. However, the shorter bars (Women’s Apparel – Product B and Home Decor – Product D) need attention. It indicates areas that require further analysis or strategies for improvement.

Step 3: Colorful Insights

Different colors are used to differentiate each product. It is essential to show a color-coded chart where the audience can distinguish between products.

  • Men’s Shoes (Product A): Yellow
  • Women’s Apparel (Product B): Orange
  • Electronics (Product C): Violet
  • Home Decor (Product D): Blue

Accurate bar chart representation of data with a color coded legend

Bar charts are straightforward and easily understandable for presenting data. They are versatile when comparing products or any categorical data [2] . Bar charts adapt seamlessly to retail scenarios. Despite that, bar charts have a few shortcomings. They cannot illustrate data trends over time. Besides, overloading the chart with numerous products can lead to visual clutter, diminishing its effectiveness.

For more information, check our collection of bar chart templates for PowerPoint .

Line graphs help illustrate data trends, progressions, or fluctuations by connecting a series of data points called ‘markers’ with straight line segments. This provides a straightforward representation of how values change [5] . Their versatility makes them invaluable for scenarios requiring a visual understanding of continuous data. In addition, line graphs are also useful for comparing multiple datasets over the same timeline. Using multiple line graphs allows us to compare more than one data set. They simplify complex information so the audience can quickly grasp the ups and downs of values. From tracking stock prices to analyzing experimental results, you can use line graphs to show how data changes over a continuous timeline. They show trends with simplicity and clarity.

Real-life Application of Line Graphs

To understand line graphs thoroughly, we will use a real case. Imagine you’re a financial analyst presenting a tech company’s monthly sales for a licensed product over the past year. Investors want insights into sales behavior by month, how market trends may have influenced sales performance and reception to the new pricing strategy. To present data via a line graph, you will complete these steps.

First, you need to gather the data. In this case, your data will be the sales numbers. For example:

  • January: $45,000
  • February: $55,000
  • March: $45,000
  • April: $60,000
  • May: $ 70,000
  • June: $65,000
  • July: $62,000
  • August: $68,000
  • September: $81,000
  • October: $76,000
  • November: $87,000
  • December: $91,000

After choosing the data, the next step is to select the orientation. Like bar charts, you can use vertical or horizontal line graphs. However, we want to keep this simple, so we will keep the timeline (x-axis) horizontal while the sales numbers (y-axis) vertical.

Step 3: Connecting Trends

After adding the data to your preferred software, you will plot a line graph. In the graph, each month’s sales are represented by data points connected by a line.

Line graph in data presentation

Step 4: Adding Clarity with Color

If there are multiple lines, you can also add colors to highlight each one, making it easier to follow.

Line graphs excel at visually presenting trends over time. These presentation aids identify patterns, like upward or downward trends. However, too many data points can clutter the graph, making it harder to interpret. Line graphs work best with continuous data but are not suitable for categories.

For more information, check our collection of line chart templates for PowerPoint and our article about how to make a presentation graph .

A data dashboard is a visual tool for analyzing information. Different graphs, charts, and tables are consolidated in a layout to showcase the information required to achieve one or more objectives. Dashboards help quickly see Key Performance Indicators (KPIs). You don’t make new visuals in the dashboard; instead, you use it to display visuals you’ve already made in worksheets [3] .

Keeping the number of visuals on a dashboard to three or four is recommended. Adding too many can make it hard to see the main points [4]. Dashboards can be used for business analytics to analyze sales, revenue, and marketing metrics at a time. They are also used in the manufacturing industry, as they allow users to grasp the entire production scenario at the moment while tracking the core KPIs for each line.

Real-Life Application of a Dashboard

Consider a project manager presenting a software development project’s progress to a tech company’s leadership team. He follows the following steps.

Step 1: Defining Key Metrics

To effectively communicate the project’s status, identify key metrics such as completion status, budget, and bug resolution rates. Then, choose measurable metrics aligned with project objectives.

Step 2: Choosing Visualization Widgets

After finalizing the data, presentation aids that align with each metric are selected. For this project, the project manager chooses a progress bar for the completion status and uses bar charts for budget allocation. Likewise, he implements line charts for bug resolution rates.

Data analysis presentation example

Step 3: Dashboard Layout

Key metrics are prominently placed in the dashboard for easy visibility, and the manager ensures that it appears clean and organized.

Dashboards provide a comprehensive view of key project metrics. Users can interact with data, customize views, and drill down for detailed analysis. However, creating an effective dashboard requires careful planning to avoid clutter. Besides, dashboards rely on the availability and accuracy of underlying data sources.

For more information, check our article on how to design a dashboard presentation , and discover our collection of dashboard PowerPoint templates .

Treemap charts represent hierarchical data structured in a series of nested rectangles [6] . As each branch of the ‘tree’ is given a rectangle, smaller tiles can be seen representing sub-branches, meaning elements on a lower hierarchical level than the parent rectangle. Each one of those rectangular nodes is built by representing an area proportional to the specified data dimension.

Treemaps are useful for visualizing large datasets in compact space. It is easy to identify patterns, such as which categories are dominant. Common applications of the treemap chart are seen in the IT industry, such as resource allocation, disk space management, website analytics, etc. Also, they can be used in multiple industries like healthcare data analysis, market share across different product categories, or even in finance to visualize portfolios.

Real-Life Application of a Treemap Chart

Let’s consider a financial scenario where a financial team wants to represent the budget allocation of a company. There is a hierarchy in the process, so it is helpful to use a treemap chart. In the chart, the top-level rectangle could represent the total budget, and it would be subdivided into smaller rectangles, each denoting a specific department. Further subdivisions within these smaller rectangles might represent individual projects or cost categories.

Step 1: Define Your Data Hierarchy

While presenting data on the budget allocation, start by outlining the hierarchical structure. The sequence will be like the overall budget at the top, followed by departments, projects within each department, and finally, individual cost categories for each project.

  • Top-level rectangle: Total Budget
  • Second-level rectangles: Departments (Engineering, Marketing, Sales)
  • Third-level rectangles: Projects within each department
  • Fourth-level rectangles: Cost categories for each project (Personnel, Marketing Expenses, Equipment)

Step 2: Choose a Suitable Tool

It’s time to select a data visualization tool supporting Treemaps. Popular choices include Tableau, Microsoft Power BI, PowerPoint, or even coding with libraries like D3.js. It is vital to ensure that the chosen tool provides customization options for colors, labels, and hierarchical structures.

Here, the team uses PowerPoint for this guide because of its user-friendly interface and robust Treemap capabilities.

Step 3: Make a Treemap Chart with PowerPoint

After opening the PowerPoint presentation, they chose “SmartArt” to form the chart. The SmartArt Graphic window has a “Hierarchy” category on the left.  Here, you will see multiple options. You can choose any layout that resembles a Treemap. The “Table Hierarchy” or “Organization Chart” options can be adapted. The team selects the Table Hierarchy as it looks close to a Treemap.

Step 5: Input Your Data

After that, a new window will open with a basic structure. They add the data one by one by clicking on the text boxes. They start with the top-level rectangle, representing the total budget.  

Treemap used for presenting data

Step 6: Customize the Treemap

By clicking on each shape, they customize its color, size, and label. At the same time, they can adjust the font size, style, and color of labels by using the options in the “Format” tab in PowerPoint. Using different colors for each level enhances the visual difference.

Treemaps excel at illustrating hierarchical structures. These charts make it easy to understand relationships and dependencies. They efficiently use space, compactly displaying a large amount of data, reducing the need for excessive scrolling or navigation. Additionally, using colors enhances the understanding of data by representing different variables or categories.

In some cases, treemaps might become complex, especially with deep hierarchies.  It becomes challenging for some users to interpret the chart. At the same time, displaying detailed information within each rectangle might be constrained by space. It potentially limits the amount of data that can be shown clearly. Without proper labeling and color coding, there’s a risk of misinterpretation.

A heatmap is a data visualization tool that uses color coding to represent values across a two-dimensional surface. In these, colors replace numbers to indicate the magnitude of each cell. This color-shaded matrix display is valuable for summarizing and understanding data sets with a glance [7] . The intensity of the color corresponds to the value it represents, making it easy to identify patterns, trends, and variations in the data.

As a tool, heatmaps help businesses analyze website interactions, revealing user behavior patterns and preferences to enhance overall user experience. In addition, companies use heatmaps to assess content engagement, identifying popular sections and areas of improvement for more effective communication. They excel at highlighting patterns and trends in large datasets, making it easy to identify areas of interest.

We can implement heatmaps to express multiple data types, such as numerical values, percentages, or even categorical data. Heatmaps help us easily spot areas with lots of activity, making them helpful in figuring out clusters [8] . When making these maps, it is important to pick colors carefully. The colors need to show the differences between groups or levels of something. And it is good to use colors that people with colorblindness can easily see.

Check our detailed guide on how to create a heatmap here. Also discover our collection of heatmap PowerPoint templates .

Pie charts are circular statistical graphics divided into slices to illustrate numerical proportions. Each slice represents a proportionate part of the whole, making it easy to visualize the contribution of each component to the total.

The size of the pie charts is influenced by the value of data points within each pie. The total of all data points in a pie determines its size. The pie with the highest data points appears as the largest, whereas the others are proportionally smaller. However, you can present all pies of the same size if proportional representation is not required [9] . Sometimes, pie charts are difficult to read, or additional information is required. A variation of this tool can be used instead, known as the donut chart , which has the same structure but a blank center, creating a ring shape. Presenters can add extra information, and the ring shape helps to declutter the graph.

Pie charts are used in business to show percentage distribution, compare relative sizes of categories, or present straightforward data sets where visualizing ratios is essential.

Real-Life Application of Pie Charts

Consider a scenario where you want to represent the distribution of the data. Each slice of the pie chart would represent a different category, and the size of each slice would indicate the percentage of the total portion allocated to that category.

Step 1: Define Your Data Structure

Imagine you are presenting the distribution of a project budget among different expense categories.

  • Column A: Expense Categories (Personnel, Equipment, Marketing, Miscellaneous)
  • Column B: Budget Amounts ($40,000, $30,000, $20,000, $10,000) Column B represents the values of your categories in Column A.

Step 2: Insert a Pie Chart

Using any of the accessible tools, you can create a pie chart. The most convenient tools for forming a pie chart in a presentation are presentation tools such as PowerPoint or Google Slides.  You will notice that the pie chart assigns each expense category a percentage of the total budget by dividing it by the total budget.

For instance:

  • Personnel: $40,000 / ($40,000 + $30,000 + $20,000 + $10,000) = 40%
  • Equipment: $30,000 / ($40,000 + $30,000 + $20,000 + $10,000) = 30%
  • Marketing: $20,000 / ($40,000 + $30,000 + $20,000 + $10,000) = 20%
  • Miscellaneous: $10,000 / ($40,000 + $30,000 + $20,000 + $10,000) = 10%

You can make a chart out of this or just pull out the pie chart from the data.

Pie chart template in data presentation

3D pie charts and 3D donut charts are quite popular among the audience. They stand out as visual elements in any presentation slide, so let’s take a look at how our pie chart example would look in 3D pie chart format.

3D pie chart in data presentation

Step 03: Results Interpretation

The pie chart visually illustrates the distribution of the project budget among different expense categories. Personnel constitutes the largest portion at 40%, followed by equipment at 30%, marketing at 20%, and miscellaneous at 10%. This breakdown provides a clear overview of where the project funds are allocated, which helps in informed decision-making and resource management. It is evident that personnel are a significant investment, emphasizing their importance in the overall project budget.

Pie charts provide a straightforward way to represent proportions and percentages. They are easy to understand, even for individuals with limited data analysis experience. These charts work well for small datasets with a limited number of categories.

However, a pie chart can become cluttered and less effective in situations with many categories. Accurate interpretation may be challenging, especially when dealing with slight differences in slice sizes. In addition, these charts are static and do not effectively convey trends over time.

For more information, check our collection of pie chart templates for PowerPoint .

Histograms present the distribution of numerical variables. Unlike a bar chart that records each unique response separately, histograms organize numeric responses into bins and show the frequency of reactions within each bin [10] . The x-axis of a histogram shows the range of values for a numeric variable. At the same time, the y-axis indicates the relative frequencies (percentage of the total counts) for that range of values.

Whenever you want to understand the distribution of your data, check which values are more common, or identify outliers, histograms are your go-to. Think of them as a spotlight on the story your data is telling. A histogram can provide a quick and insightful overview if you’re curious about exam scores, sales figures, or any numerical data distribution.

Real-Life Application of a Histogram

In the histogram data analysis presentation example, imagine an instructor analyzing a class’s grades to identify the most common score range. A histogram could effectively display the distribution. It will show whether most students scored in the average range or if there are significant outliers.

Step 1: Gather Data

He begins by gathering the data. The scores of each student in class are gathered to analyze exam scores.

After arranging the scores in ascending order, bin ranges are set.

Step 2: Define Bins

Bins are like categories that group similar values. Think of them as buckets that organize your data. The presenter decides how wide each bin should be based on the range of the values. For instance, the instructor sets the bin ranges based on score intervals: 60-69, 70-79, 80-89, and 90-100.

Step 3: Count Frequency

Now, he counts how many data points fall into each bin. This step is crucial because it tells you how often specific ranges of values occur. The result is the frequency distribution, showing the occurrences of each group.

Here, the instructor counts the number of students in each category.

  • 60-69: 1 student (Kate)
  • 70-79: 4 students (David, Emma, Grace, Jack)
  • 80-89: 7 students (Alice, Bob, Frank, Isabel, Liam, Mia, Noah)
  • 90-100: 3 students (Clara, Henry, Olivia)

Step 4: Create the Histogram

It’s time to turn the data into a visual representation. Draw a bar for each bin on a graph. The width of the bar should correspond to the range of the bin, and the height should correspond to the frequency.  To make your histogram understandable, label the X and Y axes.

In this case, the X-axis should represent the bins (e.g., test score ranges), and the Y-axis represents the frequency.

Histogram in Data Presentation

The histogram of the class grades reveals insightful patterns in the distribution. Most students, with seven students, fall within the 80-89 score range. The histogram provides a clear visualization of the class’s performance. It showcases a concentration of grades in the upper-middle range with few outliers at both ends. This analysis helps in understanding the overall academic standing of the class. It also identifies the areas for potential improvement or recognition.

Thus, histograms provide a clear visual representation of data distribution. They are easy to interpret, even for those without a statistical background. They apply to various types of data, including continuous and discrete variables. One weak point is that histograms do not capture detailed patterns in students’ data, with seven compared to other visualization methods.

A scatter plot is a graphical representation of the relationship between two variables. It consists of individual data points on a two-dimensional plane. This plane plots one variable on the x-axis and the other on the y-axis. Each point represents a unique observation. It visualizes patterns, trends, or correlations between the two variables.

Scatter plots are also effective in revealing the strength and direction of relationships. They identify outliers and assess the overall distribution of data points. The points’ dispersion and clustering reflect the relationship’s nature, whether it is positive, negative, or lacks a discernible pattern. In business, scatter plots assess relationships between variables such as marketing cost and sales revenue. They help present data correlations and decision-making.

Real-Life Application of Scatter Plot

A group of scientists is conducting a study on the relationship between daily hours of screen time and sleep quality. After reviewing the data, they managed to create this table to help them build a scatter plot graph:

In the provided example, the x-axis represents Daily Hours of Screen Time, and the y-axis represents the Sleep Quality Rating.

Scatter plot in data presentation

The scientists observe a negative correlation between the amount of screen time and the quality of sleep. This is consistent with their hypothesis that blue light, especially before bedtime, has a significant impact on sleep quality and metabolic processes.

There are a few things to remember when using a scatter plot. Even when a scatter diagram indicates a relationship, it doesn’t mean one variable affects the other. A third factor can influence both variables. The more the plot resembles a straight line, the stronger the relationship is perceived [11] . If it suggests no ties, the observed pattern might be due to random fluctuations in data. When the scatter diagram depicts no correlation, whether the data might be stratified is worth considering.

Choosing the appropriate data presentation type is crucial when making a presentation . Understanding the nature of your data and the message you intend to convey will guide this selection process. For instance, when showcasing quantitative relationships, scatter plots become instrumental in revealing correlations between variables. If the focus is on emphasizing parts of a whole, pie charts offer a concise display of proportions. Histograms, on the other hand, prove valuable for illustrating distributions and frequency patterns. 

Bar charts provide a clear visual comparison of different categories. Likewise, line charts excel in showcasing trends over time, while tables are ideal for detailed data examination. Starting a presentation on data presentation types involves evaluating the specific information you want to communicate and selecting the format that aligns with your message. This ensures clarity and resonance with your audience from the beginning of your presentation.

1. Fact Sheet Dashboard for Data Presentation

what is not included in presentation and analysis of data

Convey all the data you need to present in this one-pager format, an ideal solution tailored for users looking for presentation aids. Global maps, donut chats, column graphs, and text neatly arranged in a clean layout presented in light and dark themes.

Use This Template

2. 3D Column Chart Infographic PPT Template

what is not included in presentation and analysis of data

Represent column charts in a highly visual 3D format with this PPT template. A creative way to present data, this template is entirely editable, and we can craft either a one-page infographic or a series of slides explaining what we intend to disclose point by point.

3. Data Circles Infographic PowerPoint Template

what is not included in presentation and analysis of data

An alternative to the pie chart and donut chart diagrams, this template features a series of curved shapes with bubble callouts as ways of presenting data. Expand the information for each arch in the text placeholder areas.

4. Colorful Metrics Dashboard for Data Presentation

what is not included in presentation and analysis of data

This versatile dashboard template helps us in the presentation of the data by offering several graphs and methods to convert numbers into graphics. Implement it for e-commerce projects, financial projections, project development, and more.

5. Animated Data Presentation Tools for PowerPoint & Google Slides

Canvas Shape Tree Diagram Template

A slide deck filled with most of the tools mentioned in this article, from bar charts, column charts, treemap graphs, pie charts, histogram, etc. Animated effects make each slide look dynamic when sharing data with stakeholders.

6. Statistics Waffle Charts PPT Template for Data Presentations

what is not included in presentation and analysis of data

This PPT template helps us how to present data beyond the typical pie chart representation. It is widely used for demographics, so it’s a great fit for marketing teams, data science professionals, HR personnel, and more.

7. Data Presentation Dashboard Template for Google Slides

what is not included in presentation and analysis of data

A compendium of tools in dashboard format featuring line graphs, bar charts, column charts, and neatly arranged placeholder text areas. 

8. Weather Dashboard for Data Presentation

what is not included in presentation and analysis of data

Share weather data for agricultural presentation topics, environmental studies, or any kind of presentation that requires a highly visual layout for weather forecasting on a single day. Two color themes are available.

9. Social Media Marketing Dashboard Data Presentation Template

what is not included in presentation and analysis of data

Intended for marketing professionals, this dashboard template for data presentation is a tool for presenting data analytics from social media channels. Two slide layouts featuring line graphs and column charts.

10. Project Management Summary Dashboard Template

what is not included in presentation and analysis of data

A tool crafted for project managers to deliver highly visual reports on a project’s completion, the profits it delivered for the company, and expenses/time required to execute it. 4 different color layouts are available.

11. Profit & Loss Dashboard for PowerPoint and Google Slides

what is not included in presentation and analysis of data

A must-have for finance professionals. This typical profit & loss dashboard includes progress bars, donut charts, column charts, line graphs, and everything that’s required to deliver a comprehensive report about a company’s financial situation.

Overwhelming visuals

One of the mistakes related to using data-presenting methods is including too much data or using overly complex visualizations. They can confuse the audience and dilute the key message.

Inappropriate chart types

Choosing the wrong type of chart for the data at hand can lead to misinterpretation. For example, using a pie chart for data that doesn’t represent parts of a whole is not right.

Lack of context

Failing to provide context or sufficient labeling can make it challenging for the audience to understand the significance of the presented data.

Inconsistency in design

Using inconsistent design elements and color schemes across different visualizations can create confusion and visual disarray.

Failure to provide details

Simply presenting raw data without offering clear insights or takeaways can leave the audience without a meaningful conclusion.

Lack of focus

Not having a clear focus on the key message or main takeaway can result in a presentation that lacks a central theme.

Visual accessibility issues

Overlooking the visual accessibility of charts and graphs can exclude certain audience members who may have difficulty interpreting visual information.

In order to avoid these mistakes in data presentation, presenters can benefit from using presentation templates . These templates provide a structured framework. They ensure consistency, clarity, and an aesthetically pleasing design, enhancing data communication’s overall impact.

Understanding and choosing data presentation types are pivotal in effective communication. Each method serves a unique purpose, so selecting the appropriate one depends on the nature of the data and the message to be conveyed. The diverse array of presentation types offers versatility in visually representing information, from bar charts showing values to pie charts illustrating proportions. 

Using the proper method enhances clarity, engages the audience, and ensures that data sets are not just presented but comprehensively understood. By appreciating the strengths and limitations of different presentation types, communicators can tailor their approach to convey information accurately, developing a deeper connection between data and audience understanding.

[1] Government of Canada, S.C. (2021) 5 Data Visualization 5.2 Bar Chart , 5.2 Bar chart .  https://www150.statcan.gc.ca/n1/edu/power-pouvoir/ch9/bargraph-diagrammeabarres/5214818-eng.htm

[2] Kosslyn, S.M., 1989. Understanding charts and graphs. Applied cognitive psychology, 3(3), pp.185-225. https://apps.dtic.mil/sti/pdfs/ADA183409.pdf

[3] Creating a Dashboard . https://it.tufts.edu/book/export/html/1870

[4] https://www.goldenwestcollege.edu/research/data-and-more/data-dashboards/index.html

[5] https://www.mit.edu/course/21/21.guide/grf-line.htm

[6] Jadeja, M. and Shah, K., 2015, January. Tree-Map: A Visualization Tool for Large Data. In GSB@ SIGIR (pp. 9-13). https://ceur-ws.org/Vol-1393/gsb15proceedings.pdf#page=15

[7] Heat Maps and Quilt Plots. https://www.publichealth.columbia.edu/research/population-health-methods/heat-maps-and-quilt-plots

[8] EIU QGIS WORKSHOP. https://www.eiu.edu/qgisworkshop/heatmaps.php

[9] About Pie Charts.  https://www.mit.edu/~mbarker/formula1/f1help/11-ch-c8.htm

[10] Histograms. https://sites.utexas.edu/sos/guided/descriptive/numericaldd/descriptiven2/histogram/ [11] https://asq.org/quality-resources/scatter-diagram

what is not included in presentation and analysis of data

Like this article? Please share

Data Analysis, Data Science, Data Visualization Filed under Design

Related Articles

How to Make a Presentation Graph

Filed under Design • March 27th, 2024

How to Make a Presentation Graph

Detailed step-by-step instructions to master the art of how to make a presentation graph in PowerPoint and Google Slides. Check it out!

All About Using Harvey Balls

Filed under Presentation Ideas • January 6th, 2024

All About Using Harvey Balls

Among the many tools in the arsenal of the modern presenter, Harvey Balls have a special place. In this article we will tell you all about using Harvey Balls.

How to Design a Dashboard Presentation: A Step-by-Step Guide

Filed under Business • December 8th, 2023

How to Design a Dashboard Presentation: A Step-by-Step Guide

Take a step further in your professional presentation skills by learning what a dashboard presentation is and how to properly design one in PowerPoint. A detailed step-by-step guide is here!

Leave a Reply

what is not included in presentation and analysis of data

Your Modern Business Guide To Data Analysis Methods And Techniques

Data analysis methods and techniques blog post by datapine

Table of Contents

1) What Is Data Analysis?

2) Why Is Data Analysis Important?

3) What Is The Data Analysis Process?

4) Types Of Data Analysis Methods

5) Top Data Analysis Techniques To Apply

6) Quality Criteria For Data Analysis

7) Data Analysis Limitations & Barriers

8) Data Analysis Skills

9) Data Analysis In The Big Data Environment

In our data-rich age, understanding how to analyze and extract true meaning from our business’s digital insights is one of the primary drivers of success.

Despite the colossal volume of data we create every day, a mere 0.5% is actually analyzed and used for data discovery , improvement, and intelligence. While that may not seem like much, considering the amount of digital information we have at our fingertips, half a percent still accounts for a vast amount of data.

With so much data and so little time, knowing how to collect, curate, organize, and make sense of all of this potentially business-boosting information can be a minefield – but online data analysis is the solution.

In science, data analysis uses a more complex approach with advanced techniques to explore and experiment with data. On the other hand, in a business context, data is used to make data-driven decisions that will enable the company to improve its overall performance. In this post, we will cover the analysis of data from an organizational point of view while still going through the scientific and statistical foundations that are fundamental to understanding the basics of data analysis. 

To put all of that into perspective, we will answer a host of important analytical questions, explore analytical methods and techniques, while demonstrating how to perform analysis in the real world with a 17-step blueprint for success.

What Is Data Analysis?

Data analysis is the process of collecting, modeling, and analyzing data using various statistical and logical methods and techniques. Businesses rely on analytics processes and tools to extract insights that support strategic and operational decision-making.

All these various methods are largely based on two core areas: quantitative and qualitative research.

To explain the key differences between qualitative and quantitative research, here’s a video for your viewing pleasure:

Gaining a better understanding of different techniques and methods in quantitative research as well as qualitative insights will give your analyzing efforts a more clearly defined direction, so it’s worth taking the time to allow this particular knowledge to sink in. Additionally, you will be able to create a comprehensive analytical report that will skyrocket your analysis.

Apart from qualitative and quantitative categories, there are also other types of data that you should be aware of before dividing into complex data analysis processes. These categories include: 

  • Big data: Refers to massive data sets that need to be analyzed using advanced software to reveal patterns and trends. It is considered to be one of the best analytical assets as it provides larger volumes of data at a faster rate. 
  • Metadata: Putting it simply, metadata is data that provides insights about other data. It summarizes key information about specific data that makes it easier to find and reuse for later purposes. 
  • Real time data: As its name suggests, real time data is presented as soon as it is acquired. From an organizational perspective, this is the most valuable data as it can help you make important decisions based on the latest developments. Our guide on real time analytics will tell you more about the topic. 
  • Machine data: This is more complex data that is generated solely by a machine such as phones, computers, or even websites and embedded systems, without previous human interaction.

Why Is Data Analysis Important?

Before we go into detail about the categories of analysis along with its methods and techniques, you must understand the potential that analyzing data can bring to your organization.

  • Informed decision-making : From a management perspective, you can benefit from analyzing your data as it helps you make decisions based on facts and not simple intuition. For instance, you can understand where to invest your capital, detect growth opportunities, predict your income, or tackle uncommon situations before they become problems. Through this, you can extract relevant insights from all areas in your organization, and with the help of dashboard software , present the data in a professional and interactive way to different stakeholders.
  • Reduce costs : Another great benefit is to reduce costs. With the help of advanced technologies such as predictive analytics, businesses can spot improvement opportunities, trends, and patterns in their data and plan their strategies accordingly. In time, this will help you save money and resources on implementing the wrong strategies. And not just that, by predicting different scenarios such as sales and demand you can also anticipate production and supply. 
  • Target customers better : Customers are arguably the most crucial element in any business. By using analytics to get a 360° vision of all aspects related to your customers, you can understand which channels they use to communicate with you, their demographics, interests, habits, purchasing behaviors, and more. In the long run, it will drive success to your marketing strategies, allow you to identify new potential customers, and avoid wasting resources on targeting the wrong people or sending the wrong message. You can also track customer satisfaction by analyzing your client’s reviews or your customer service department’s performance.

What Is The Data Analysis Process?

Data analysis process graphic

When we talk about analyzing data there is an order to follow in order to extract the needed conclusions. The analysis process consists of 5 key stages. We will cover each of them more in detail later in the post, but to start providing the needed context to understand what is coming next, here is a rundown of the 5 essential steps of data analysis. 

  • Identify: Before you get your hands dirty with data, you first need to identify why you need it in the first place. The identification is the stage in which you establish the questions you will need to answer. For example, what is the customer's perception of our brand? Or what type of packaging is more engaging to our potential customers? Once the questions are outlined you are ready for the next step. 
  • Collect: As its name suggests, this is the stage where you start collecting the needed data. Here, you define which sources of data you will use and how you will use them. The collection of data can come in different forms such as internal or external sources, surveys, interviews, questionnaires, and focus groups, among others.  An important note here is that the way you collect the data will be different in a quantitative and qualitative scenario. 
  • Clean: Once you have the necessary data it is time to clean it and leave it ready for analysis. Not all the data you collect will be useful, when collecting big amounts of data in different formats it is very likely that you will find yourself with duplicate or badly formatted data. To avoid this, before you start working with your data you need to make sure to erase any white spaces, duplicate records, or formatting errors. This way you avoid hurting your analysis with bad-quality data. 
  • Analyze : With the help of various techniques such as statistical analysis, regressions, neural networks, text analysis, and more, you can start analyzing and manipulating your data to extract relevant conclusions. At this stage, you find trends, correlations, variations, and patterns that can help you answer the questions you first thought of in the identify stage. Various technologies in the market assist researchers and average users with the management of their data. Some of them include business intelligence and visualization software, predictive analytics, and data mining, among others. 
  • Interpret: Last but not least you have one of the most important steps: it is time to interpret your results. This stage is where the researcher comes up with courses of action based on the findings. For example, here you would understand if your clients prefer packaging that is red or green, plastic or paper, etc. Additionally, at this stage, you can also find some limitations and work on them. 

Now that you have a basic understanding of the key data analysis steps, let’s look at the top 17 essential methods.

17 Essential Types Of Data Analysis Methods

Before diving into the 17 essential types of methods, it is important that we go over really fast through the main analysis categories. Starting with the category of descriptive up to prescriptive analysis, the complexity and effort of data evaluation increases, but also the added value for the company.

a) Descriptive analysis - What happened.

The descriptive analysis method is the starting point for any analytic reflection, and it aims to answer the question of what happened? It does this by ordering, manipulating, and interpreting raw data from various sources to turn it into valuable insights for your organization.

Performing descriptive analysis is essential, as it enables us to present our insights in a meaningful way. Although it is relevant to mention that this analysis on its own will not allow you to predict future outcomes or tell you the answer to questions like why something happened, it will leave your data organized and ready to conduct further investigations.

b) Exploratory analysis - How to explore data relationships.

As its name suggests, the main aim of the exploratory analysis is to explore. Prior to it, there is still no notion of the relationship between the data and the variables. Once the data is investigated, exploratory analysis helps you to find connections and generate hypotheses and solutions for specific problems. A typical area of ​​application for it is data mining.

c) Diagnostic analysis - Why it happened.

Diagnostic data analytics empowers analysts and executives by helping them gain a firm contextual understanding of why something happened. If you know why something happened as well as how it happened, you will be able to pinpoint the exact ways of tackling the issue or challenge.

Designed to provide direct and actionable answers to specific questions, this is one of the world’s most important methods in research, among its other key organizational functions such as retail analytics , e.g.

c) Predictive analysis - What will happen.

The predictive method allows you to look into the future to answer the question: what will happen? In order to do this, it uses the results of the previously mentioned descriptive, exploratory, and diagnostic analysis, in addition to machine learning (ML) and artificial intelligence (AI). Through this, you can uncover future trends, potential problems or inefficiencies, connections, and casualties in your data.

With predictive analysis, you can unfold and develop initiatives that will not only enhance your various operational processes but also help you gain an all-important edge over the competition. If you understand why a trend, pattern, or event happened through data, you will be able to develop an informed projection of how things may unfold in particular areas of the business.

e) Prescriptive analysis - How will it happen.

Another of the most effective types of analysis methods in research. Prescriptive data techniques cross over from predictive analysis in the way that it revolves around using patterns or trends to develop responsive, practical business strategies.

By drilling down into prescriptive analysis, you will play an active role in the data consumption process by taking well-arranged sets of visual data and using it as a powerful fix to emerging issues in a number of key areas, including marketing, sales, customer experience, HR, fulfillment, finance, logistics analytics , and others.

Top 17 data analysis methods

As mentioned at the beginning of the post, data analysis methods can be divided into two big categories: quantitative and qualitative. Each of these categories holds a powerful analytical value that changes depending on the scenario and type of data you are working with. Below, we will discuss 17 methods that are divided into qualitative and quantitative approaches. 

Without further ado, here are the 17 essential types of data analysis methods with some use cases in the business world: 

A. Quantitative Methods 

To put it simply, quantitative analysis refers to all methods that use numerical data or data that can be turned into numbers (e.g. category variables like gender, age, etc.) to extract valuable insights. It is used to extract valuable conclusions about relationships, differences, and test hypotheses. Below we discuss some of the key quantitative methods. 

1. Cluster analysis

The action of grouping a set of data elements in a way that said elements are more similar (in a particular sense) to each other than to those in other groups – hence the term ‘cluster.’ Since there is no target variable when clustering, the method is often used to find hidden patterns in the data. The approach is also used to provide additional context to a trend or dataset.

Let's look at it from an organizational perspective. In a perfect world, marketers would be able to analyze each customer separately and give them the best-personalized service, but let's face it, with a large customer base, it is timely impossible to do that. That's where clustering comes in. By grouping customers into clusters based on demographics, purchasing behaviors, monetary value, or any other factor that might be relevant for your company, you will be able to immediately optimize your efforts and give your customers the best experience based on their needs.

2. Cohort analysis

This type of data analysis approach uses historical data to examine and compare a determined segment of users' behavior, which can then be grouped with others with similar characteristics. By using this methodology, it's possible to gain a wealth of insight into consumer needs or a firm understanding of a broader target group.

Cohort analysis can be really useful for performing analysis in marketing as it will allow you to understand the impact of your campaigns on specific groups of customers. To exemplify, imagine you send an email campaign encouraging customers to sign up for your site. For this, you create two versions of the campaign with different designs, CTAs, and ad content. Later on, you can use cohort analysis to track the performance of the campaign for a longer period of time and understand which type of content is driving your customers to sign up, repurchase, or engage in other ways.  

A useful tool to start performing cohort analysis method is Google Analytics. You can learn more about the benefits and limitations of using cohorts in GA in this useful guide . In the bottom image, you see an example of how you visualize a cohort in this tool. The segments (devices traffic) are divided into date cohorts (usage of devices) and then analyzed week by week to extract insights into performance.

Cohort analysis chart example from google analytics

3. Regression analysis

Regression uses historical data to understand how a dependent variable's value is affected when one (linear regression) or more independent variables (multiple regression) change or stay the same. By understanding each variable's relationship and how it developed in the past, you can anticipate possible outcomes and make better decisions in the future.

Let's bring it down with an example. Imagine you did a regression analysis of your sales in 2019 and discovered that variables like product quality, store design, customer service, marketing campaigns, and sales channels affected the overall result. Now you want to use regression to analyze which of these variables changed or if any new ones appeared during 2020. For example, you couldn’t sell as much in your physical store due to COVID lockdowns. Therefore, your sales could’ve either dropped in general or increased in your online channels. Through this, you can understand which independent variables affected the overall performance of your dependent variable, annual sales.

If you want to go deeper into this type of analysis, check out this article and learn more about how you can benefit from regression.

4. Neural networks

The neural network forms the basis for the intelligent algorithms of machine learning. It is a form of analytics that attempts, with minimal intervention, to understand how the human brain would generate insights and predict values. Neural networks learn from each and every data transaction, meaning that they evolve and advance over time.

A typical area of application for neural networks is predictive analytics. There are BI reporting tools that have this feature implemented within them, such as the Predictive Analytics Tool from datapine. This tool enables users to quickly and easily generate all kinds of predictions. All you have to do is select the data to be processed based on your KPIs, and the software automatically calculates forecasts based on historical and current data. Thanks to its user-friendly interface, anyone in your organization can manage it; there’s no need to be an advanced scientist. 

Here is an example of how you can use the predictive analysis tool from datapine:

Example on how to use predictive analytics tool from datapine

**click to enlarge**

5. Factor analysis

The factor analysis also called “dimension reduction” is a type of data analysis used to describe variability among observed, correlated variables in terms of a potentially lower number of unobserved variables called factors. The aim here is to uncover independent latent variables, an ideal method for streamlining specific segments.

A good way to understand this data analysis method is a customer evaluation of a product. The initial assessment is based on different variables like color, shape, wearability, current trends, materials, comfort, the place where they bought the product, and frequency of usage. Like this, the list can be endless, depending on what you want to track. In this case, factor analysis comes into the picture by summarizing all of these variables into homogenous groups, for example, by grouping the variables color, materials, quality, and trends into a brother latent variable of design.

If you want to start analyzing data using factor analysis we recommend you take a look at this practical guide from UCLA.

6. Data mining

A method of data analysis that is the umbrella term for engineering metrics and insights for additional value, direction, and context. By using exploratory statistical evaluation, data mining aims to identify dependencies, relations, patterns, and trends to generate advanced knowledge.  When considering how to analyze data, adopting a data mining mindset is essential to success - as such, it’s an area that is worth exploring in greater detail.

An excellent use case of data mining is datapine intelligent data alerts . With the help of artificial intelligence and machine learning, they provide automated signals based on particular commands or occurrences within a dataset. For example, if you’re monitoring supply chain KPIs , you could set an intelligent alarm to trigger when invalid or low-quality data appears. By doing so, you will be able to drill down deep into the issue and fix it swiftly and effectively.

In the following picture, you can see how the intelligent alarms from datapine work. By setting up ranges on daily orders, sessions, and revenues, the alarms will notify you if the goal was not completed or if it exceeded expectations.

Example on how to use intelligent alerts from datapine

7. Time series analysis

As its name suggests, time series analysis is used to analyze a set of data points collected over a specified period of time. Although analysts use this method to monitor the data points in a specific interval of time rather than just monitoring them intermittently, the time series analysis is not uniquely used for the purpose of collecting data over time. Instead, it allows researchers to understand if variables changed during the duration of the study, how the different variables are dependent, and how did it reach the end result. 

In a business context, this method is used to understand the causes of different trends and patterns to extract valuable insights. Another way of using this method is with the help of time series forecasting. Powered by predictive technologies, businesses can analyze various data sets over a period of time and forecast different future events. 

A great use case to put time series analysis into perspective is seasonality effects on sales. By using time series forecasting to analyze sales data of a specific product over time, you can understand if sales rise over a specific period of time (e.g. swimwear during summertime, or candy during Halloween). These insights allow you to predict demand and prepare production accordingly.  

8. Decision Trees 

The decision tree analysis aims to act as a support tool to make smart and strategic decisions. By visually displaying potential outcomes, consequences, and costs in a tree-like model, researchers and company users can easily evaluate all factors involved and choose the best course of action. Decision trees are helpful to analyze quantitative data and they allow for an improved decision-making process by helping you spot improvement opportunities, reduce costs, and enhance operational efficiency and production.

But how does a decision tree actually works? This method works like a flowchart that starts with the main decision that you need to make and branches out based on the different outcomes and consequences of each decision. Each outcome will outline its own consequences, costs, and gains and, at the end of the analysis, you can compare each of them and make the smartest decision. 

Businesses can use them to understand which project is more cost-effective and will bring more earnings in the long run. For example, imagine you need to decide if you want to update your software app or build a new app entirely.  Here you would compare the total costs, the time needed to be invested, potential revenue, and any other factor that might affect your decision.  In the end, you would be able to see which of these two options is more realistic and attainable for your company or research.

9. Conjoint analysis 

Last but not least, we have the conjoint analysis. This approach is usually used in surveys to understand how individuals value different attributes of a product or service and it is one of the most effective methods to extract consumer preferences. When it comes to purchasing, some clients might be more price-focused, others more features-focused, and others might have a sustainable focus. Whatever your customer's preferences are, you can find them with conjoint analysis. Through this, companies can define pricing strategies, packaging options, subscription packages, and more. 

A great example of conjoint analysis is in marketing and sales. For instance, a cupcake brand might use conjoint analysis and find that its clients prefer gluten-free options and cupcakes with healthier toppings over super sugary ones. Thus, the cupcake brand can turn these insights into advertisements and promotions to increase sales of this particular type of product. And not just that, conjoint analysis can also help businesses segment their customers based on their interests. This allows them to send different messaging that will bring value to each of the segments. 

10. Correspondence Analysis

Also known as reciprocal averaging, correspondence analysis is a method used to analyze the relationship between categorical variables presented within a contingency table. A contingency table is a table that displays two (simple correspondence analysis) or more (multiple correspondence analysis) categorical variables across rows and columns that show the distribution of the data, which is usually answers to a survey or questionnaire on a specific topic. 

This method starts by calculating an “expected value” which is done by multiplying row and column averages and dividing it by the overall original value of the specific table cell. The “expected value” is then subtracted from the original value resulting in a “residual number” which is what allows you to extract conclusions about relationships and distribution. The results of this analysis are later displayed using a map that represents the relationship between the different values. The closest two values are in the map, the bigger the relationship. Let’s put it into perspective with an example. 

Imagine you are carrying out a market research analysis about outdoor clothing brands and how they are perceived by the public. For this analysis, you ask a group of people to match each brand with a certain attribute which can be durability, innovation, quality materials, etc. When calculating the residual numbers, you can see that brand A has a positive residual for innovation but a negative one for durability. This means that brand A is not positioned as a durable brand in the market, something that competitors could take advantage of. 

11. Multidimensional Scaling (MDS)

MDS is a method used to observe the similarities or disparities between objects which can be colors, brands, people, geographical coordinates, and more. The objects are plotted using an “MDS map” that positions similar objects together and disparate ones far apart. The (dis) similarities between objects are represented using one or more dimensions that can be observed using a numerical scale. For example, if you want to know how people feel about the COVID-19 vaccine, you can use 1 for “don’t believe in the vaccine at all”  and 10 for “firmly believe in the vaccine” and a scale of 2 to 9 for in between responses.  When analyzing an MDS map the only thing that matters is the distance between the objects, the orientation of the dimensions is arbitrary and has no meaning at all. 

Multidimensional scaling is a valuable technique for market research, especially when it comes to evaluating product or brand positioning. For instance, if a cupcake brand wants to know how they are positioned compared to competitors, it can define 2-3 dimensions such as taste, ingredients, shopping experience, or more, and do a multidimensional scaling analysis to find improvement opportunities as well as areas in which competitors are currently leading. 

Another business example is in procurement when deciding on different suppliers. Decision makers can generate an MDS map to see how the different prices, delivery times, technical services, and more of the different suppliers differ and pick the one that suits their needs the best. 

A final example proposed by a research paper on "An Improved Study of Multilevel Semantic Network Visualization for Analyzing Sentiment Word of Movie Review Data". Researchers picked a two-dimensional MDS map to display the distances and relationships between different sentiments in movie reviews. They used 36 sentiment words and distributed them based on their emotional distance as we can see in the image below where the words "outraged" and "sweet" are on opposite sides of the map, marking the distance between the two emotions very clearly.

Example of multidimensional scaling analysis

Aside from being a valuable technique to analyze dissimilarities, MDS also serves as a dimension-reduction technique for large dimensional data. 

B. Qualitative Methods

Qualitative data analysis methods are defined as the observation of non-numerical data that is gathered and produced using methods of observation such as interviews, focus groups, questionnaires, and more. As opposed to quantitative methods, qualitative data is more subjective and highly valuable in analyzing customer retention and product development.

12. Text analysis

Text analysis, also known in the industry as text mining, works by taking large sets of textual data and arranging them in a way that makes it easier to manage. By working through this cleansing process in stringent detail, you will be able to extract the data that is truly relevant to your organization and use it to develop actionable insights that will propel you forward.

Modern software accelerate the application of text analytics. Thanks to the combination of machine learning and intelligent algorithms, you can perform advanced analytical processes such as sentiment analysis. This technique allows you to understand the intentions and emotions of a text, for example, if it's positive, negative, or neutral, and then give it a score depending on certain factors and categories that are relevant to your brand. Sentiment analysis is often used to monitor brand and product reputation and to understand how successful your customer experience is. To learn more about the topic check out this insightful article .

By analyzing data from various word-based sources, including product reviews, articles, social media communications, and survey responses, you will gain invaluable insights into your audience, as well as their needs, preferences, and pain points. This will allow you to create campaigns, services, and communications that meet your prospects’ needs on a personal level, growing your audience while boosting customer retention. There are various other “sub-methods” that are an extension of text analysis. Each of them serves a more specific purpose and we will look at them in detail next. 

13. Content Analysis

This is a straightforward and very popular method that examines the presence and frequency of certain words, concepts, and subjects in different content formats such as text, image, audio, or video. For example, the number of times the name of a celebrity is mentioned on social media or online tabloids. It does this by coding text data that is later categorized and tabulated in a way that can provide valuable insights, making it the perfect mix of quantitative and qualitative analysis.

There are two types of content analysis. The first one is the conceptual analysis which focuses on explicit data, for instance, the number of times a concept or word is mentioned in a piece of content. The second one is relational analysis, which focuses on the relationship between different concepts or words and how they are connected within a specific context. 

Content analysis is often used by marketers to measure brand reputation and customer behavior. For example, by analyzing customer reviews. It can also be used to analyze customer interviews and find directions for new product development. It is also important to note, that in order to extract the maximum potential out of this analysis method, it is necessary to have a clearly defined research question. 

14. Thematic Analysis

Very similar to content analysis, thematic analysis also helps in identifying and interpreting patterns in qualitative data with the main difference being that the first one can also be applied to quantitative analysis. The thematic method analyzes large pieces of text data such as focus group transcripts or interviews and groups them into themes or categories that come up frequently within the text. It is a great method when trying to figure out peoples view’s and opinions about a certain topic. For example, if you are a brand that cares about sustainability, you can do a survey of your customers to analyze their views and opinions about sustainability and how they apply it to their lives. You can also analyze customer service calls transcripts to find common issues and improve your service. 

Thematic analysis is a very subjective technique that relies on the researcher’s judgment. Therefore,  to avoid biases, it has 6 steps that include familiarization, coding, generating themes, reviewing themes, defining and naming themes, and writing up. It is also important to note that, because it is a flexible approach, the data can be interpreted in multiple ways and it can be hard to select what data is more important to emphasize. 

15. Narrative Analysis 

A bit more complex in nature than the two previous ones, narrative analysis is used to explore the meaning behind the stories that people tell and most importantly, how they tell them. By looking into the words that people use to describe a situation you can extract valuable conclusions about their perspective on a specific topic. Common sources for narrative data include autobiographies, family stories, opinion pieces, and testimonials, among others. 

From a business perspective, narrative analysis can be useful to analyze customer behaviors and feelings towards a specific product, service, feature, or others. It provides unique and deep insights that can be extremely valuable. However, it has some drawbacks.  

The biggest weakness of this method is that the sample sizes are usually very small due to the complexity and time-consuming nature of the collection of narrative data. Plus, the way a subject tells a story will be significantly influenced by his or her specific experiences, making it very hard to replicate in a subsequent study. 

16. Discourse Analysis

Discourse analysis is used to understand the meaning behind any type of written, verbal, or symbolic discourse based on its political, social, or cultural context. It mixes the analysis of languages and situations together. This means that the way the content is constructed and the meaning behind it is significantly influenced by the culture and society it takes place in. For example, if you are analyzing political speeches you need to consider different context elements such as the politician's background, the current political context of the country, the audience to which the speech is directed, and so on. 

From a business point of view, discourse analysis is a great market research tool. It allows marketers to understand how the norms and ideas of the specific market work and how their customers relate to those ideas. It can be very useful to build a brand mission or develop a unique tone of voice. 

17. Grounded Theory Analysis

Traditionally, researchers decide on a method and hypothesis and start to collect the data to prove that hypothesis. The grounded theory is the only method that doesn’t require an initial research question or hypothesis as its value lies in the generation of new theories. With the grounded theory method, you can go into the analysis process with an open mind and explore the data to generate new theories through tests and revisions. In fact, it is not necessary to collect the data and then start to analyze it. Researchers usually start to find valuable insights as they are gathering the data. 

All of these elements make grounded theory a very valuable method as theories are fully backed by data instead of initial assumptions. It is a great technique to analyze poorly researched topics or find the causes behind specific company outcomes. For example, product managers and marketers might use the grounded theory to find the causes of high levels of customer churn and look into customer surveys and reviews to develop new theories about the causes. 

How To Analyze Data? Top 17 Data Analysis Techniques To Apply

17 top data analysis techniques by datapine

Now that we’ve answered the questions “what is data analysis’”, why is it important, and covered the different data analysis types, it’s time to dig deeper into how to perform your analysis by working through these 17 essential techniques.

1. Collaborate your needs

Before you begin analyzing or drilling down into any techniques, it’s crucial to sit down collaboratively with all key stakeholders within your organization, decide on your primary campaign or strategic goals, and gain a fundamental understanding of the types of insights that will best benefit your progress or provide you with the level of vision you need to evolve your organization.

2. Establish your questions

Once you’ve outlined your core objectives, you should consider which questions will need answering to help you achieve your mission. This is one of the most important techniques as it will shape the very foundations of your success.

To help you ask the right things and ensure your data works for you, you have to ask the right data analysis questions .

3. Data democratization

After giving your data analytics methodology some real direction, and knowing which questions need answering to extract optimum value from the information available to your organization, you should continue with democratization.

Data democratization is an action that aims to connect data from various sources efficiently and quickly so that anyone in your organization can access it at any given moment. You can extract data in text, images, videos, numbers, or any other format. And then perform cross-database analysis to achieve more advanced insights to share with the rest of the company interactively.  

Once you have decided on your most valuable sources, you need to take all of this into a structured format to start collecting your insights. For this purpose, datapine offers an easy all-in-one data connectors feature to integrate all your internal and external sources and manage them at your will. Additionally, datapine’s end-to-end solution automatically updates your data, allowing you to save time and focus on performing the right analysis to grow your company.

data connectors from datapine

4. Think of governance 

When collecting data in a business or research context you always need to think about security and privacy. With data breaches becoming a topic of concern for businesses, the need to protect your client's or subject’s sensitive information becomes critical. 

To ensure that all this is taken care of, you need to think of a data governance strategy. According to Gartner , this concept refers to “ the specification of decision rights and an accountability framework to ensure the appropriate behavior in the valuation, creation, consumption, and control of data and analytics .” In simpler words, data governance is a collection of processes, roles, and policies, that ensure the efficient use of data while still achieving the main company goals. It ensures that clear roles are in place for who can access the information and how they can access it. In time, this not only ensures that sensitive information is protected but also allows for an efficient analysis as a whole. 

5. Clean your data

After harvesting from so many sources you will be left with a vast amount of information that can be overwhelming to deal with. At the same time, you can be faced with incorrect data that can be misleading to your analysis. The smartest thing you can do to avoid dealing with this in the future is to clean the data. This is fundamental before visualizing it, as it will ensure that the insights you extract from it are correct.

There are many things that you need to look for in the cleaning process. The most important one is to eliminate any duplicate observations; this usually appears when using multiple internal and external sources of information. You can also add any missing codes, fix empty fields, and eliminate incorrectly formatted data.

Another usual form of cleaning is done with text data. As we mentioned earlier, most companies today analyze customer reviews, social media comments, questionnaires, and several other text inputs. In order for algorithms to detect patterns, text data needs to be revised to avoid invalid characters or any syntax or spelling errors. 

Most importantly, the aim of cleaning is to prevent you from arriving at false conclusions that can damage your company in the long run. By using clean data, you will also help BI solutions to interact better with your information and create better reports for your organization.

6. Set your KPIs

Once you’ve set your sources, cleaned your data, and established clear-cut questions you want your insights to answer, you need to set a host of key performance indicators (KPIs) that will help you track, measure, and shape your progress in a number of key areas.

KPIs are critical to both qualitative and quantitative analysis research. This is one of the primary methods of data analysis you certainly shouldn’t overlook.

To help you set the best possible KPIs for your initiatives and activities, here is an example of a relevant logistics KPI : transportation-related costs. If you want to see more go explore our collection of key performance indicator examples .

Transportation costs logistics KPIs

7. Omit useless data

Having bestowed your data analysis tools and techniques with true purpose and defined your mission, you should explore the raw data you’ve collected from all sources and use your KPIs as a reference for chopping out any information you deem to be useless.

Trimming the informational fat is one of the most crucial methods of analysis as it will allow you to focus your analytical efforts and squeeze every drop of value from the remaining ‘lean’ information.

Any stats, facts, figures, or metrics that don’t align with your business goals or fit with your KPI management strategies should be eliminated from the equation.

8. Build a data management roadmap

While, at this point, this particular step is optional (you will have already gained a wealth of insight and formed a fairly sound strategy by now), creating a data governance roadmap will help your data analysis methods and techniques become successful on a more sustainable basis. These roadmaps, if developed properly, are also built so they can be tweaked and scaled over time.

Invest ample time in developing a roadmap that will help you store, manage, and handle your data internally, and you will make your analysis techniques all the more fluid and functional – one of the most powerful types of data analysis methods available today.

9. Integrate technology

There are many ways to analyze data, but one of the most vital aspects of analytical success in a business context is integrating the right decision support software and technology.

Robust analysis platforms will not only allow you to pull critical data from your most valuable sources while working with dynamic KPIs that will offer you actionable insights; it will also present them in a digestible, visual, interactive format from one central, live dashboard . A data methodology you can count on.

By integrating the right technology within your data analysis methodology, you’ll avoid fragmenting your insights, saving you time and effort while allowing you to enjoy the maximum value from your business’s most valuable insights.

For a look at the power of software for the purpose of analysis and to enhance your methods of analyzing, glance over our selection of dashboard examples .

10. Answer your questions

By considering each of the above efforts, working with the right technology, and fostering a cohesive internal culture where everyone buys into the different ways to analyze data as well as the power of digital intelligence, you will swiftly start to answer your most burning business questions. Arguably, the best way to make your data concepts accessible across the organization is through data visualization.

11. Visualize your data

Online data visualization is a powerful tool as it lets you tell a story with your metrics, allowing users across the organization to extract meaningful insights that aid business evolution – and it covers all the different ways to analyze data.

The purpose of analyzing is to make your entire organization more informed and intelligent, and with the right platform or dashboard, this is simpler than you think, as demonstrated by our marketing dashboard .

An executive dashboard example showcasing high-level marketing KPIs such as cost per lead, MQL, SQL, and cost per customer.

This visual, dynamic, and interactive online dashboard is a data analysis example designed to give Chief Marketing Officers (CMO) an overview of relevant metrics to help them understand if they achieved their monthly goals.

In detail, this example generated with a modern dashboard creator displays interactive charts for monthly revenues, costs, net income, and net income per customer; all of them are compared with the previous month so that you can understand how the data fluctuated. In addition, it shows a detailed summary of the number of users, customers, SQLs, and MQLs per month to visualize the whole picture and extract relevant insights or trends for your marketing reports .

The CMO dashboard is perfect for c-level management as it can help them monitor the strategic outcome of their marketing efforts and make data-driven decisions that can benefit the company exponentially.

12. Be careful with the interpretation

We already dedicated an entire post to data interpretation as it is a fundamental part of the process of data analysis. It gives meaning to the analytical information and aims to drive a concise conclusion from the analysis results. Since most of the time companies are dealing with data from many different sources, the interpretation stage needs to be done carefully and properly in order to avoid misinterpretations. 

To help you through the process, here we list three common practices that you need to avoid at all costs when looking at your data:

  • Correlation vs. causation: The human brain is formatted to find patterns. This behavior leads to one of the most common mistakes when performing interpretation: confusing correlation with causation. Although these two aspects can exist simultaneously, it is not correct to assume that because two things happened together, one provoked the other. A piece of advice to avoid falling into this mistake is never to trust just intuition, trust the data. If there is no objective evidence of causation, then always stick to correlation. 
  • Confirmation bias: This phenomenon describes the tendency to select and interpret only the data necessary to prove one hypothesis, often ignoring the elements that might disprove it. Even if it's not done on purpose, confirmation bias can represent a real problem, as excluding relevant information can lead to false conclusions and, therefore, bad business decisions. To avoid it, always try to disprove your hypothesis instead of proving it, share your analysis with other team members, and avoid drawing any conclusions before the entire analytical project is finalized.
  • Statistical significance: To put it in short words, statistical significance helps analysts understand if a result is actually accurate or if it happened because of a sampling error or pure chance. The level of statistical significance needed might depend on the sample size and the industry being analyzed. In any case, ignoring the significance of a result when it might influence decision-making can be a huge mistake.

13. Build a narrative

Now, we’re going to look at how you can bring all of these elements together in a way that will benefit your business - starting with a little something called data storytelling.

The human brain responds incredibly well to strong stories or narratives. Once you’ve cleansed, shaped, and visualized your most invaluable data using various BI dashboard tools , you should strive to tell a story - one with a clear-cut beginning, middle, and end.

By doing so, you will make your analytical efforts more accessible, digestible, and universal, empowering more people within your organization to use your discoveries to their actionable advantage.

14. Consider autonomous technology

Autonomous technologies, such as artificial intelligence (AI) and machine learning (ML), play a significant role in the advancement of understanding how to analyze data more effectively.

Gartner predicts that by the end of this year, 80% of emerging technologies will be developed with AI foundations. This is a testament to the ever-growing power and value of autonomous technologies.

At the moment, these technologies are revolutionizing the analysis industry. Some examples that we mentioned earlier are neural networks, intelligent alarms, and sentiment analysis.

15. Share the load

If you work with the right tools and dashboards, you will be able to present your metrics in a digestible, value-driven format, allowing almost everyone in the organization to connect with and use relevant data to their advantage.

Modern dashboards consolidate data from various sources, providing access to a wealth of insights in one centralized location, no matter if you need to monitor recruitment metrics or generate reports that need to be sent across numerous departments. Moreover, these cutting-edge tools offer access to dashboards from a multitude of devices, meaning that everyone within the business can connect with practical insights remotely - and share the load.

Once everyone is able to work with a data-driven mindset, you will catalyze the success of your business in ways you never thought possible. And when it comes to knowing how to analyze data, this kind of collaborative approach is essential.

16. Data analysis tools

In order to perform high-quality analysis of data, it is fundamental to use tools and software that will ensure the best results. Here we leave you a small summary of four fundamental categories of data analysis tools for your organization.

  • Business Intelligence: BI tools allow you to process significant amounts of data from several sources in any format. Through this, you can not only analyze and monitor your data to extract relevant insights but also create interactive reports and dashboards to visualize your KPIs and use them for your company's good. datapine is an amazing online BI software that is focused on delivering powerful online analysis features that are accessible to beginner and advanced users. Like this, it offers a full-service solution that includes cutting-edge analysis of data, KPIs visualization, live dashboards, reporting, and artificial intelligence technologies to predict trends and minimize risk.
  • Statistical analysis: These tools are usually designed for scientists, statisticians, market researchers, and mathematicians, as they allow them to perform complex statistical analyses with methods like regression analysis, predictive analysis, and statistical modeling. A good tool to perform this type of analysis is R-Studio as it offers a powerful data modeling and hypothesis testing feature that can cover both academic and general data analysis. This tool is one of the favorite ones in the industry, due to its capability for data cleaning, data reduction, and performing advanced analysis with several statistical methods. Another relevant tool to mention is SPSS from IBM. The software offers advanced statistical analysis for users of all skill levels. Thanks to a vast library of machine learning algorithms, text analysis, and a hypothesis testing approach it can help your company find relevant insights to drive better decisions. SPSS also works as a cloud service that enables you to run it anywhere.
  • SQL Consoles: SQL is a programming language often used to handle structured data in relational databases. Tools like these are popular among data scientists as they are extremely effective in unlocking these databases' value. Undoubtedly, one of the most used SQL software in the market is MySQL Workbench . This tool offers several features such as a visual tool for database modeling and monitoring, complete SQL optimization, administration tools, and visual performance dashboards to keep track of KPIs.
  • Data Visualization: These tools are used to represent your data through charts, graphs, and maps that allow you to find patterns and trends in the data. datapine's already mentioned BI platform also offers a wealth of powerful online data visualization tools with several benefits. Some of them include: delivering compelling data-driven presentations to share with your entire company, the ability to see your data online with any device wherever you are, an interactive dashboard design feature that enables you to showcase your results in an interactive and understandable way, and to perform online self-service reports that can be used simultaneously with several other people to enhance team productivity.

17. Refine your process constantly 

Last is a step that might seem obvious to some people, but it can be easily ignored if you think you are done. Once you have extracted the needed results, you should always take a retrospective look at your project and think about what you can improve. As you saw throughout this long list of techniques, data analysis is a complex process that requires constant refinement. For this reason, you should always go one step further and keep improving. 

Quality Criteria For Data Analysis

So far we’ve covered a list of methods and techniques that should help you perform efficient data analysis. But how do you measure the quality and validity of your results? This is done with the help of some science quality criteria. Here we will go into a more theoretical area that is critical to understanding the fundamentals of statistical analysis in science. However, you should also be aware of these steps in a business context, as they will allow you to assess the quality of your results in the correct way. Let’s dig in. 

  • Internal validity: The results of a survey are internally valid if they measure what they are supposed to measure and thus provide credible results. In other words , internal validity measures the trustworthiness of the results and how they can be affected by factors such as the research design, operational definitions, how the variables are measured, and more. For instance, imagine you are doing an interview to ask people if they brush their teeth two times a day. While most of them will answer yes, you can still notice that their answers correspond to what is socially acceptable, which is to brush your teeth at least twice a day. In this case, you can’t be 100% sure if respondents actually brush their teeth twice a day or if they just say that they do, therefore, the internal validity of this interview is very low. 
  • External validity: Essentially, external validity refers to the extent to which the results of your research can be applied to a broader context. It basically aims to prove that the findings of a study can be applied in the real world. If the research can be applied to other settings, individuals, and times, then the external validity is high. 
  • Reliability : If your research is reliable, it means that it can be reproduced. If your measurement were repeated under the same conditions, it would produce similar results. This means that your measuring instrument consistently produces reliable results. For example, imagine a doctor building a symptoms questionnaire to detect a specific disease in a patient. Then, various other doctors use this questionnaire but end up diagnosing the same patient with a different condition. This means the questionnaire is not reliable in detecting the initial disease. Another important note here is that in order for your research to be reliable, it also needs to be objective. If the results of a study are the same, independent of who assesses them or interprets them, the study can be considered reliable. Let’s see the objectivity criteria in more detail now. 
  • Objectivity: In data science, objectivity means that the researcher needs to stay fully objective when it comes to its analysis. The results of a study need to be affected by objective criteria and not by the beliefs, personality, or values of the researcher. Objectivity needs to be ensured when you are gathering the data, for example, when interviewing individuals, the questions need to be asked in a way that doesn't influence the results. Paired with this, objectivity also needs to be thought of when interpreting the data. If different researchers reach the same conclusions, then the study is objective. For this last point, you can set predefined criteria to interpret the results to ensure all researchers follow the same steps. 

The discussed quality criteria cover mostly potential influences in a quantitative context. Analysis in qualitative research has by default additional subjective influences that must be controlled in a different way. Therefore, there are other quality criteria for this kind of research such as credibility, transferability, dependability, and confirmability. You can see each of them more in detail on this resource . 

Data Analysis Limitations & Barriers

Analyzing data is not an easy task. As you’ve seen throughout this post, there are many steps and techniques that you need to apply in order to extract useful information from your research. While a well-performed analysis can bring various benefits to your organization it doesn't come without limitations. In this section, we will discuss some of the main barriers you might encounter when conducting an analysis. Let’s see them more in detail. 

  • Lack of clear goals: No matter how good your data or analysis might be if you don’t have clear goals or a hypothesis the process might be worthless. While we mentioned some methods that don’t require a predefined hypothesis, it is always better to enter the analytical process with some clear guidelines of what you are expecting to get out of it, especially in a business context in which data is utilized to support important strategic decisions. 
  • Objectivity: Arguably one of the biggest barriers when it comes to data analysis in research is to stay objective. When trying to prove a hypothesis, researchers might find themselves, intentionally or unintentionally, directing the results toward an outcome that they want. To avoid this, always question your assumptions and avoid confusing facts with opinions. You can also show your findings to a research partner or external person to confirm that your results are objective. 
  • Data representation: A fundamental part of the analytical procedure is the way you represent your data. You can use various graphs and charts to represent your findings, but not all of them will work for all purposes. Choosing the wrong visual can not only damage your analysis but can mislead your audience, therefore, it is important to understand when to use each type of data depending on your analytical goals. Our complete guide on the types of graphs and charts lists 20 different visuals with examples of when to use them. 
  • Flawed correlation : Misleading statistics can significantly damage your research. We’ve already pointed out a few interpretation issues previously in the post, but it is an important barrier that we can't avoid addressing here as well. Flawed correlations occur when two variables appear related to each other but they are not. Confusing correlations with causation can lead to a wrong interpretation of results which can lead to building wrong strategies and loss of resources, therefore, it is very important to identify the different interpretation mistakes and avoid them. 
  • Sample size: A very common barrier to a reliable and efficient analysis process is the sample size. In order for the results to be trustworthy, the sample size should be representative of what you are analyzing. For example, imagine you have a company of 1000 employees and you ask the question “do you like working here?” to 50 employees of which 49 say yes, which means 95%. Now, imagine you ask the same question to the 1000 employees and 950 say yes, which also means 95%. Saying that 95% of employees like working in the company when the sample size was only 50 is not a representative or trustworthy conclusion. The significance of the results is way more accurate when surveying a bigger sample size.   
  • Privacy concerns: In some cases, data collection can be subjected to privacy regulations. Businesses gather all kinds of information from their customers from purchasing behaviors to addresses and phone numbers. If this falls into the wrong hands due to a breach, it can affect the security and confidentiality of your clients. To avoid this issue, you need to collect only the data that is needed for your research and, if you are using sensitive facts, make it anonymous so customers are protected. The misuse of customer data can severely damage a business's reputation, so it is important to keep an eye on privacy. 
  • Lack of communication between teams : When it comes to performing data analysis on a business level, it is very likely that each department and team will have different goals and strategies. However, they are all working for the same common goal of helping the business run smoothly and keep growing. When teams are not connected and communicating with each other, it can directly affect the way general strategies are built. To avoid these issues, tools such as data dashboards enable teams to stay connected through data in a visually appealing way. 
  • Innumeracy : Businesses are working with data more and more every day. While there are many BI tools available to perform effective analysis, data literacy is still a constant barrier. Not all employees know how to apply analysis techniques or extract insights from them. To prevent this from happening, you can implement different training opportunities that will prepare every relevant user to deal with data. 

Key Data Analysis Skills

As you've learned throughout this lengthy guide, analyzing data is a complex task that requires a lot of knowledge and skills. That said, thanks to the rise of self-service tools the process is way more accessible and agile than it once was. Regardless, there are still some key skills that are valuable to have when working with data, we list the most important ones below.

  • Critical and statistical thinking: To successfully analyze data you need to be creative and think out of the box. Yes, that might sound like a weird statement considering that data is often tight to facts. However, a great level of critical thinking is required to uncover connections, come up with a valuable hypothesis, and extract conclusions that go a step further from the surface. This, of course, needs to be complemented by statistical thinking and an understanding of numbers. 
  • Data cleaning: Anyone who has ever worked with data before will tell you that the cleaning and preparation process accounts for 80% of a data analyst's work, therefore, the skill is fundamental. But not just that, not cleaning the data adequately can also significantly damage the analysis which can lead to poor decision-making in a business scenario. While there are multiple tools that automate the cleaning process and eliminate the possibility of human error, it is still a valuable skill to dominate. 
  • Data visualization: Visuals make the information easier to understand and analyze, not only for professional users but especially for non-technical ones. Having the necessary skills to not only choose the right chart type but know when to apply it correctly is key. This also means being able to design visually compelling charts that make the data exploration process more efficient. 
  • SQL: The Structured Query Language or SQL is a programming language used to communicate with databases. It is fundamental knowledge as it enables you to update, manipulate, and organize data from relational databases which are the most common databases used by companies. It is fairly easy to learn and one of the most valuable skills when it comes to data analysis. 
  • Communication skills: This is a skill that is especially valuable in a business environment. Being able to clearly communicate analytical outcomes to colleagues is incredibly important, especially when the information you are trying to convey is complex for non-technical people. This applies to in-person communication as well as written format, for example, when generating a dashboard or report. While this might be considered a “soft” skill compared to the other ones we mentioned, it should not be ignored as you most likely will need to share analytical findings with others no matter the context. 

Data Analysis In The Big Data Environment

Big data is invaluable to today’s businesses, and by using different methods for data analysis, it’s possible to view your data in a way that can help you turn insight into positive action.

To inspire your efforts and put the importance of big data into context, here are some insights that you should know:

  • By 2026 the industry of big data is expected to be worth approximately $273.4 billion.
  • 94% of enterprises say that analyzing data is important for their growth and digital transformation. 
  • Companies that exploit the full potential of their data can increase their operating margins by 60% .
  • We already told you the benefits of Artificial Intelligence through this article. This industry's financial impact is expected to grow up to $40 billion by 2025.

Data analysis concepts may come in many forms, but fundamentally, any solid methodology will help to make your business more streamlined, cohesive, insightful, and successful than ever before.

Key Takeaways From Data Analysis 

As we reach the end of our data analysis journey, we leave a small summary of the main methods and techniques to perform excellent analysis and grow your business.

17 Essential Types of Data Analysis Methods:

  • Cluster analysis
  • Cohort analysis
  • Regression analysis
  • Factor analysis
  • Neural Networks
  • Data Mining
  • Text analysis
  • Time series analysis
  • Decision trees
  • Conjoint analysis 
  • Correspondence Analysis
  • Multidimensional Scaling 
  • Content analysis 
  • Thematic analysis
  • Narrative analysis 
  • Grounded theory analysis
  • Discourse analysis 

Top 17 Data Analysis Techniques:

  • Collaborate your needs
  • Establish your questions
  • Data democratization
  • Think of data governance 
  • Clean your data
  • Set your KPIs
  • Omit useless data
  • Build a data management roadmap
  • Integrate technology
  • Answer your questions
  • Visualize your data
  • Interpretation of data
  • Consider autonomous technology
  • Build a narrative
  • Share the load
  • Data Analysis tools
  • Refine your process constantly 

We’ve pondered the data analysis definition and drilled down into the practical applications of data-centric analytics, and one thing is clear: by taking measures to arrange your data and making your metrics work for you, it’s possible to transform raw information into action - the kind of that will push your business to the next level.

Yes, good data analytics techniques result in enhanced business intelligence (BI). To help you understand this notion in more detail, read our exploration of business intelligence reporting .

And, if you’re ready to perform your own analysis, drill down into your facts and figures while interacting with your data on astonishing visuals, you can try our software for a free, 14-day trial .

Cart

  • SUGGESTED TOPICS
  • The Magazine
  • Newsletters
  • Managing Yourself
  • Managing Teams
  • Work-life Balance
  • The Big Idea
  • Data & Visuals
  • Reading Lists
  • Case Selections
  • HBR Learning
  • Topic Feeds
  • Account Settings
  • Email Preferences

Present Your Data Like a Pro

  • Joel Schwartzberg

what is not included in presentation and analysis of data

Demystify the numbers. Your audience will thank you.

While a good presentation has data, data alone doesn’t guarantee a good presentation. It’s all about how that data is presented. The quickest way to confuse your audience is by sharing too many details at once. The only data points you should share are those that significantly support your point — and ideally, one point per chart. To avoid the debacle of sheepishly translating hard-to-see numbers and labels, rehearse your presentation with colleagues sitting as far away as the actual audience would. While you’ve been working with the same chart for weeks or months, your audience will be exposed to it for mere seconds. Give them the best chance of comprehending your data by using simple, clear, and complete language to identify X and Y axes, pie pieces, bars, and other diagrammatic elements. Try to avoid abbreviations that aren’t obvious, and don’t assume labeled components on one slide will be remembered on subsequent slides. Every valuable chart or pie graph has an “Aha!” zone — a number or range of data that reveals something crucial to your point. Make sure you visually highlight the “Aha!” zone, reinforcing the moment by explaining it to your audience.

With so many ways to spin and distort information these days, a presentation needs to do more than simply share great ideas — it needs to support those ideas with credible data. That’s true whether you’re an executive pitching new business clients, a vendor selling her services, or a CEO making a case for change.

what is not included in presentation and analysis of data

  • JS Joel Schwartzberg oversees executive communications for a major national nonprofit, is a professional presentation coach, and is the author of Get to the Point! Sharpen Your Message and Make Your Words Matter and The Language of Leadership: How to Engage and Inspire Your Team . You can find him on LinkedIn and X. TheJoelTruth

Partner Center

Data presentation: A comprehensive guide

Learn how to create data presentation effectively and communicate your insights in a way that is clear, concise, and engaging.

Raja Bothra

Building presentations

team preparing data presentation

Hey there, fellow data enthusiast!

Welcome to our comprehensive guide on data presentation.

Whether you're an experienced presenter or just starting, this guide will help you present your data like a pro.

We'll dive deep into what data presentation is, why it's crucial, and how to master it. So, let's embark on this data-driven journey together.

What is data presentation?

Data presentation is the art of transforming raw data into a visual format that's easy to understand and interpret. It's like turning numbers and statistics into a captivating story that your audience can quickly grasp. When done right, data presentation can be a game-changer, enabling you to convey complex information effectively.

Why are data presentations important?

Imagine drowning in a sea of numbers and figures. That's how your audience might feel without proper data presentation. Here's why it's essential:

  • Clarity : Data presentations make complex information clear and concise.
  • Engagement : Visuals, such as charts and graphs, grab your audience's attention.
  • Comprehension : Visual data is easier to understand than long, numerical reports.
  • Decision-making : Well-presented data aids informed decision-making.
  • Impact : It leaves a lasting impression on your audience.

Types of data presentation

Now, let's delve into the diverse array of data presentation methods, each with its own unique strengths and applications. We have three primary types of data presentation, and within these categories, numerous specific visualization techniques can be employed to effectively convey your data.

1. Textual presentation

Textual presentation harnesses the power of words and sentences to elucidate and contextualize your data. This method is commonly used to provide a narrative framework for the data, offering explanations, insights, and the broader implications of your findings. It serves as a foundation for a deeper understanding of the data's significance.

2. Tabular presentation

Tabular presentation employs tables to arrange and structure your data systematically. These tables are invaluable for comparing various data groups or illustrating how data evolves over time. They present information in a neat and organized format, facilitating straightforward comparisons and reference points.

3. Graphical presentation

Graphical presentation harnesses the visual impact of charts and graphs to breathe life into your data. Charts and graphs are powerful tools for spotlighting trends, patterns, and relationships hidden within the data. Let's explore some common graphical presentation methods:

  • Bar charts: They are ideal for comparing different categories of data. In this method, each category is represented by a distinct bar, and the height of the bar corresponds to the value it represents. Bar charts provide a clear and intuitive way to discern differences between categories.
  • Pie charts: It excel at illustrating the relative proportions of different data categories. Each category is depicted as a slice of the pie, with the size of each slice corresponding to the percentage of the total value it represents. Pie charts are particularly effective for showcasing the distribution of data.
  • Line graphs: They are the go-to choice when showcasing how data evolves over time. Each point on the line represents a specific value at a particular time period. This method enables viewers to track trends and fluctuations effortlessly, making it perfect for visualizing data with temporal dimensions.
  • Scatter plots: They are the tool of choice when exploring the relationship between two variables. In this method, each point on the plot represents a pair of values for the two variables in question. Scatter plots help identify correlations, outliers, and patterns within data pairs.

The selection of the most suitable data presentation method hinges on the specific dataset and the presentation's objectives. For instance, when comparing sales figures of different products, a bar chart shines in its simplicity and clarity. On the other hand, if your aim is to display how a product's sales have changed over time, a line graph provides the ideal visual narrative.

Additionally, it's crucial to factor in your audience's level of familiarity with data presentations. For a technical audience, more intricate visualization methods may be appropriate. However, when presenting to a general audience, opting for straightforward and easily understandable visuals is often the wisest choice.

In the world of data presentation, choosing the right method is akin to selecting the perfect brush for a masterpiece. Each tool has its place, and understanding when and how to use them is key to crafting compelling and insightful presentations. So, consider your data carefully, align your purpose, and paint a vivid picture that resonates with your audience.

What to include in data presentation

When creating your data presentation, remember these key components:

  • Data points : Clearly state the data points you're presenting.
  • Comparison : Highlight comparisons and trends in your data.
  • Graphical methods : Choose the right chart or graph for your data.
  • Infographics : Use visuals like infographics to make information more digestible.
  • Numerical values : Include numerical values to support your visuals.
  • Qualitative information : Explain the significance of the data.
  • Source citation : Always cite your data sources.

How to structure an effective data presentation

Creating a well-structured data presentation is not just important; it's the backbone of a successful presentation. Here's a step-by-step guide to help you craft a compelling and organized presentation that captivates your audience:

1. Know your audience

Understanding your audience is paramount. Consider their needs, interests, and existing knowledge about your topic. Tailor your presentation to their level of understanding, ensuring that it resonates with them on a personal level. Relevance is the key.

2. Have a clear message

Every effective data presentation should convey a clear and concise message. Determine what you want your audience to learn or take away from your presentation, and make sure your message is the guiding light throughout your presentation. Ensure that all your data points align with and support this central message.

3. Tell a compelling story

Human beings are naturally wired to remember stories. Incorporate storytelling techniques into your presentation to make your data more relatable and memorable. Your data can be the backbone of a captivating narrative, whether it's about a trend, a problem, or a solution. Take your audience on a journey through your data.

4. Leverage visuals

Visuals are a powerful tool in data presentation. They make complex information accessible and engaging. Utilize charts, graphs, and images to illustrate your points and enhance the visual appeal of your presentation. Visuals should not just be an accessory; they should be an integral part of your storytelling.

5. Be clear and concise

Avoid jargon or technical language that your audience may not comprehend. Use plain language and explain your data points clearly. Remember, clarity is king. Each piece of information should be easy for your audience to digest.

6. Practice your delivery

Practice makes perfect. Rehearse your presentation multiple times before the actual delivery. This will help you deliver it smoothly and confidently, reducing the chances of stumbling over your words or losing track of your message.

A basic structure for an effective data presentation

Armed with a comprehensive comprehension of how to construct a compelling data presentation, you can now utilize this fundamental template for guidance:

In the introduction, initiate your presentation by introducing both yourself and the topic at hand. Clearly articulate your main message or the fundamental concept you intend to communicate.

Moving on to the body of your presentation, organize your data in a coherent and easily understandable sequence. Employ visuals generously to elucidate your points and weave a narrative that enhances the overall story. Ensure that the arrangement of your data aligns with and reinforces your central message.

As you approach the conclusion, succinctly recapitulate your key points and emphasize your core message once more. Conclude by leaving your audience with a distinct and memorable takeaway, ensuring that your presentation has a lasting impact.

Additional tips for enhancing your data presentation

To take your data presentation to the next level, consider these additional tips:

  • Consistent design : Maintain a uniform design throughout your presentation. This not only enhances visual appeal but also aids in seamless comprehension.
  • High-quality visuals : Ensure that your visuals are of high quality, easy to read, and directly relevant to your topic.
  • Concise text : Avoid overwhelming your slides with excessive text. Focus on the most critical points, using visuals to support and elaborate.
  • Anticipate questions : Think ahead about the questions your audience might pose. Be prepared with well-thought-out answers to foster productive discussions.

By following these guidelines, you can structure an effective data presentation that not only informs but also engages and inspires your audience. Remember, a well-structured presentation is the bridge that connects your data to your audience's understanding and appreciation.

Do’s and don'ts on a data presentation

  • Use visuals : Incorporate charts and graphs to enhance understanding.
  • Keep it simple : Avoid clutter and complexity.
  • Highlight key points : Emphasize crucial data.
  • Engage the audience : Encourage questions and discussions.
  • Practice : Rehearse your presentation.

Don'ts:

  • Overload with data : Less is often more; don't overwhelm your audience.
  • Fit Unrelated data : Stay on topic; don't include irrelevant information.
  • Neglect the audience : Ensure your presentation suits your audience's level of expertise.
  • Read word-for-word : Avoid reading directly from slides.
  • Lose focus : Stick to your presentation's purpose.

Summarizing key takeaways

  • Definition : Data presentation is the art of visualizing complex data for better understanding.
  • Importance : Data presentations enhance clarity, engage the audience, aid decision-making, and leave a lasting impact.
  • Types : Textual, Tabular, and Graphical presentations offer various ways to present data.
  • Choosing methods : Select the right method based on data, audience, and purpose.
  • Components : Include data points, comparisons, visuals, infographics, numerical values, and source citations.
  • Structure : Know your audience, have a clear message, tell a compelling story, use visuals, be concise, and practice.
  • Do's and don'ts : Do use visuals, keep it simple, highlight key points, engage the audience, and practice. Don't overload with data, include unrelated information, neglect the audience's expertise, read word-for-word, or lose focus.

1. What is data presentation, and why is it important in 2023?

Data presentation is the process of visually representing data sets to convey information effectively to an audience. In an era where the amount of data generated is vast, visually presenting data using methods such as diagrams, graphs, and charts has become crucial. By simplifying complex data sets, presentation of the data may helps your audience quickly grasp much information without drowning in a sea of chart's, analytics, facts and figures.

2. What are some common methods of data presentation?

There are various methods of data presentation, including graphs and charts, histograms, and cumulative frequency polygons. Each method has its strengths and is often used depending on the type of data you're using and the message you want to convey. For instance, if you want to show data over time, try using a line graph. If you're presenting geographical data, consider to use a heat map.

3. How can I ensure that my data presentation is clear and readable?

To ensure that your data presentation is clear and readable, pay attention to the design and labeling of your charts. Don't forget to label the axes appropriately, as they are critical for understanding the values they represent. Don't fit all the information in one slide or in a single paragraph. Presentation software like Prezent and PowerPoint can help you simplify your vertical axis, charts and tables, making them much easier to understand.

4. What are some common mistakes presenters make when presenting data?

One common mistake is trying to fit too much data into a single chart, which can distort the information and confuse the audience. Another mistake is not considering the needs of the audience. Remember that your audience won't have the same level of familiarity with the data as you do, so it's essential to present the data effectively and respond to questions during a Q&A session.

5. How can I use data visualization to present important data effectively on platforms like LinkedIn?

When presenting data on platforms like LinkedIn, consider using eye-catching visuals like bar graphs or charts. Use concise captions and e.g., examples to highlight the single most important information in your data report. Visuals, such as graphs and tables, can help you stand out in the sea of textual content, making your data presentation more engaging and shareable among your LinkedIn connections.

Create your data presentation with prezent

Prezent can be a valuable tool for creating data presentations. Here's how Prezent can help you in this regard:

  • Time savings : Prezent saves up to 70% of presentation creation time, allowing you to focus on data analysis and insights.
  • On-brand consistency : Ensure 100% brand alignment with Prezent's brand-approved designs for professional-looking data presentations.
  • Effortless collaboration : Real-time sharing and collaboration features make it easy for teams to work together on data presentations.
  • Data storytelling : Choose from 50+ storylines to effectively communicate data insights and engage your audience.
  • Personalization : Create tailored data presentations that resonate with your audience's preferences, enhancing the impact of your data.

In summary, Prezent streamlines the process of creating data presentations by offering time-saving features, ensuring brand consistency, promoting collaboration, and providing tools for effective data storytelling. Whether you need to present data to clients, stakeholders, or within your organization, Prezent can significantly enhance your presentation-making process.

So, go ahead, present your data with confidence, and watch your audience be wowed by your expertise.

Thank you for joining us on this data-driven journey. Stay tuned for more insights, and remember, data presentation is your ticket to making numbers come alive!

Sign up for our free trial or book a demo !

Get the latest from Prezent community

Join thousands of subscribers who receive our best practices on communication, storytelling, presentation design, and more. New tips weekly. (No spam, we promise!)

websights

  • Online Degree Explore Bachelor’s & Master’s degrees
  • MasterTrack™ Earn credit towards a Master’s degree
  • University Certificates Advance your career with graduate-level learning
  • Top Courses
  • Join for Free

Data Analytics: Definition, Uses, Examples, and More

Learn about data analytics, how it's used, common skills, and careers that implement analytical concepts.

[Featured image] Two coworkers sit at a desk and analyze data on a computer screen. On the wall behind them is a large screen presenting more data sequences.

Data analytics is the collection, transformation, and organization of data in order to draw conclusions, make predictions, and drive informed decision making. 

Data analytics is often confused with data analysis . While these are related terms, they aren’t exactly the same. In fact, data analysis is a subcategory of data analytics that deals specifically with extracting meaning from data. Data analytics, as a whole, includes processes beyond analysis, including data science (using data to theorize and forecast) and data engineering (building data systems).

In this article, you'll learn more about what data analytics is, how its used, and its key concepts. You'll also explore data analytics skills, jobs, and cost-effective specializations that can help you get started today.

Beginner-friendly data analysis courses

Interested in building your knowledge of data analysis today? Consider enrolling in one of these popular courses on Coursera:

In Google's Foundations: Data, Data, Everywhere course, you'll explore key data analysis concepts, tools, and jobs.

In Duke University's Data Analysis and Visualization course, you'll learn how to identify key components for data analytics projects, explore data visualization, and find out how to create a compelling data story.

What is data analytics?

Data analytics is a multidisciplinary field that employs a wide range of analysis techniques, including math, statistics, and computer science, to draw insights from data sets. Data analytics is a broad term that includes everything from simply analyzing data to theorizing ways of collecting data and creating the frameworks needed to store it.

How is data analytics used? Data analytics examples

Data is everywhere, and people use data every day, whether they realize it or not. Daily tasks such as measuring coffee beans to make your morning cup, checking the weather report before deciding what to wear, or tracking your steps throughout the day with a fitness tracker can all be forms of analyzing and using data.

Data analytics is important across many industries, as many business leaders use data to make informed decisions. A sneaker manufacturer might look at sales data to determine which designs to continue and which to retire, or a health care administrator may look at inventory data to determine the medical supplies they should order. At Coursera, we may look at enrollment data to determine what kind of courses to add to our offerings.

Organizations that use data to drive business strategies often find that they are more confident, proactive, and financially savvy.

Learn more about how data is used in the real world in this lecture from Google's Data Analytics Professional Certificate :

Read more: Health Care Analytics: Definition, Impact, and More

Data analytics: Key concepts

There are four key types of data analytics: descriptive, diagnostic, predictive, and prescriptive.  Together, these four types of data analytics can help an organization make data-driven decisions. At a glance, each of them tells us the following:

Descriptive analytics tell us what happened.

Diagnostic analytics tell us why something happened.

Predictive analytics tell us what will likely happen in the future.

Prescriptive analytics tell us how to act.

People who work with data analytics will typically explore each of these four areas using the data analysis process, which includes identifying the question, collecting raw data, cleaning data, analyzing data, and interpreting the results.

Read more: What Is Data Analysis? (With Examples)

Data analytics skills

Data analytics requires a wide range of skills to be performed effectively. According to search and enrollment data among Coursera’s community of 87 million global learners, these are the top in-demand data science skills, as of December 2021:

Structured Query Language (SQL) , a programming language commonly used for databases

Statistical programming languages , such as R and Python , commonly used to create advanced data analysis programs

Machine learning , a branch of artificial intelligence that involves using algorithms to spot data patterns

Probability and statistics , in order to better analyze and interpret data trends

Data management , or the practices around collecting, organizing and storing data

Data visualization , or the ability to use charts and graphs to tell a story with data

Econometrics , or the ability to use data trends to create mathematical models that forecast future trends based

While careers in data analytics require a certain amount of technical knowledge, approaching the above skills methodically—for example by learning a little bit each day or learning from your mistakes—can help lead to mastery, and it’s never too late to get started. 

Read more: Is Data Analytics Hard? Tips for Rising to the Challenge

Build your programming skills today

Start building your programming skills today by enrolling in one of these courses on Coursera today:

In the University of Michigan's Python for Everybody Specialization , you'll learn how to program and analyze data with Python.

In Duke University's Data Analysis with R Specialization , you'll learn how to analyze and visualize data in the R programming language.

The University of Michigan's PostgreSQL for Everybody Specialization teaches SQL skills you can use in an actual, real-world environment.

Data analytics jobs

Typically, data analytics professionals make higher-than-average salaries and are in high demand within the labor market. The US Bureau of Labor Statistics (BLS) projects that careers in data analytics fields will grow by 23 percent between 2022 and 2032—much faster than average—and are estimated to pay a higher-than-average annual income of $85,720 [ 1 ]. But, according to the Anaconda 2022 State of Data Science report, 63% of commercial organizations surveyed expressed concern over a talent shortage in the face of such rapid growth [ 2 ].

Entry-level careers in data analytics include roles such as:

Junior data analyst

Associate data analyst

Junior data scientist

You can practice statistical analysis, data management, and programming using SQL, Tableau, and Python in Meta's beginner-friendly Data Analyst Professional Certificate . Designed to prepare you for an entry-level role , this self-paced program can be completed in just 5 months .

As you gain more experience in the field, you may qualify for mid- to upper-level roles like:

Data analyst

Data scientist

Data architect

Data engineer

Business analyst

Marketing analyst

Click through the links above to learn more about each career path, including what the roles entail as well as average salary and job growth.

Read more : How Much Do Data Analysts Make? Salary Guide

Have career questions? We have answers.

Subscribe to Coursera Career Chat on LinkedIn to receive our weekly, bite-sized newsletter for more work insights, tips, and updates from our in-house team.

Learn more about data analytics

Data analytics is all about using data to gain insights and make better, more informed decisions. Learn from the best in Google's Data Analytics Professional Certificate , which will have you job-ready for an entry-level data analytics position in approximately six months. There, you’ll learn key skills like data cleaning and visualization and get hands-on experience with common data analytics tools through video instruction and an applied learning project. 

Article sources

US Bureau of Labor Statistics. " Occupational Outlook Handbook: Operations Research Analysts , https://www.bls.gov/ooh/math/operations-research-analysts.htm." Accessed March 19, 2024.

Anaconda. " 2022 State of Data Science report , https://know.anaconda.com/rs/387-XNW-688/images/ANA_2022SODSReport.pdf." Accessed March 19, 2024.

Keep reading

Coursera staff.

Editorial Team

Coursera’s editorial team is comprised of highly experienced professional editors, writers, and fact...

This content has been made available for informational purposes only. Learners are advised to conduct additional research to ensure that courses and other credentials pursued meet their personal, professional, and financial goals.

Leeds Beckett University

Skills for Learning : Research Skills

Data analysis is an ongoing process that should occur throughout your research project. Suitable data-analysis methods must be selected when you write your research proposal. The nature of your data (i.e. quantitative or qualitative) will be influenced by your research design and purpose. The data will also influence the analysis methods selected.

We run interactive workshops to help you develop skills related to doing research, such as data analysis, writing literature reviews and preparing for dissertations. Find out more on the Skills for Learning Workshops page.

We have online academic skills modules within MyBeckett for all levels of university study. These modules will help your academic development and support your success at LBU. You can work through the modules at your own pace, revisiting them as required. Find out more from our FAQ What academic skills modules are available?

Quantitative data analysis

Broadly speaking, 'statistics' refers to methods, tools and techniques used to collect, organise and interpret data. The goal of statistics is to gain understanding from data. Therefore, you need to know how to:

  • Produce data – for example, by handing out a questionnaire or doing an experiment.
  • Organise, summarise, present and analyse data.
  • Draw valid conclusions from findings.

There are a number of statistical methods you can use to analyse data. Choosing an appropriate statistical method should follow naturally, however, from your research design. Therefore, you should think about data analysis at the early stages of your study design. You may need to consult a statistician for help with this.

Tips for working with statistical data

  • Plan so that the data you get has a good chance of successfully tackling the research problem. This will involve reading literature on your subject, as well as on what makes a good study.
  • To reach useful conclusions, you need to reduce uncertainties or 'noise'. Thus, you will need a sufficiently large data sample. A large sample will improve precision. However, this must be balanced against the 'costs' (time and money) of collection.
  • Consider the logistics. Will there be problems in obtaining sufficient high-quality data? Think about accuracy, trustworthiness and completeness.
  • Statistics are based on random samples. Consider whether your sample will be suited to this sort of analysis. Might there be biases to think about?
  • How will you deal with missing values (any data that is not recorded for some reason)? These can result from gaps in a record or whole records being missed out.
  • When analysing data, start by looking at each variable separately. Conduct initial/exploratory data analysis using graphical displays. Do this before looking at variables in conjunction or anything more complicated. This process can help locate errors in the data and also gives you a 'feel' for the data.
  • Look out for patterns of 'missingness'. They are likely to alert you if there’s a problem. If the 'missingness' is not random, then it will have an impact on the results.
  • Be vigilant and think through what you are doing at all times. Think critically. Statistics are not just mathematical tricks that a computer sorts out. Rather, analysing statistical data is a process that the human mind must interpret!

Top tips! Try inventing or generating the sort of data you might get and see if you can analyse it. Make sure that your process works before gathering actual data. Think what the output of an analytic procedure will look like before doing it for real.

(Note: it is actually difficult to generate realistic data. There are fraud-detection methods in place to identify data that has been fabricated. So, remember to get rid of your practice data before analysing the real stuff!)

Statistical software packages

Software packages can be used to analyse and present data. The most widely used ones are SPSS and NVivo.

SPSS is a statistical-analysis and data-management package for quantitative data analysis. Click on ‘ How do I install SPSS? ’ to learn how to download SPSS to your personal device. SPSS can perform a wide variety of statistical procedures. Some examples are:

  • Data management (i.e. creating subsets of data or transforming data).
  • Summarising, describing or presenting data (i.e. mean, median and frequency).
  • Looking at the distribution of data (i.e. standard deviation).
  • Comparing groups for significant differences using parametric (i.e. t-test) and non-parametric (i.e. Chi-square) tests.
  • Identifying significant relationships between variables (i.e. correlation).

NVivo can be used for qualitative data analysis. It is suitable for use with a wide range of methodologies. Click on ‘ How do I access NVivo ’ to learn how to download NVivo to your personal device. NVivo supports grounded theory, survey data, case studies, focus groups, phenomenology, field research and action research.

  • Process data such as interview transcripts, literature or media extracts, and historical documents.
  • Code data on screen and explore all coding and documents interactively.
  • Rearrange, restructure, extend and edit text, coding and coding relationships.
  • Search imported text for words, phrases or patterns, and automatically code the results.

Qualitative data analysis

Miles and Huberman (1994) point out that there are diverse approaches to qualitative research and analysis. They suggest, however, that it is possible to identify 'a fairly classic set of analytic moves arranged in sequence'. This involves:

  • Affixing codes to a set of field notes drawn from observation or interviews.
  • Noting reflections or other remarks in the margins.
  • Sorting/sifting through these materials to identify: a) similar phrases, relationships between variables, patterns and themes and b) distinct differences between subgroups and common sequences.
  • Isolating these patterns/processes and commonalties/differences. Then, taking them out to the field in the next wave of data collection.
  • Highlighting generalisations and relating them to your original research themes.
  • Taking the generalisations and analysing them in relation to theoretical perspectives.

        (Miles and Huberman, 1994.)

Patterns and generalisations are usually arrived at through a process of analytic induction (see above points 5 and 6). Qualitative analysis rarely involves statistical analysis of relationships between variables. Qualitative analysis aims to gain in-depth understanding of concepts, opinions or experiences.

Presenting information

There are a number of different ways of presenting and communicating information. The particular format you use is dependent upon the type of data generated from the methods you have employed.

Here are some appropriate ways of presenting information for different types of data:

Bar charts: These   may be useful for comparing relative sizes. However, they tend to use a large amount of ink to display a relatively small amount of information. Consider a simple line chart as an alternative.

Pie charts: These have the benefit of indicating that the data must add up to 100%. However, they make it difficult for viewers to distinguish relative sizes, especially if two slices have a difference of less than 10%.

Other examples of presenting data in graphical form include line charts and  scatter plots .

Qualitative data is more likely to be presented in text form. For example, using quotations from interviews or field diaries.

  • Plan ahead, thinking carefully about how you will analyse and present your data.
  • Think through possible restrictions to resources you may encounter and plan accordingly.
  • Find out about the different IT packages available for analysing your data and select the most appropriate.
  • If necessary, allow time to attend an introductory course on a particular computer package. You can book SPSS and NVivo workshops via MyHub .
  • Code your data appropriately, assigning conceptual or numerical codes as suitable.
  • Organise your data so it can be analysed and presented easily.
  • Choose the most suitable way of presenting your information, according to the type of data collected. This will allow your information to be understood and interpreted better.

Primary, secondary and tertiary sources

Information sources are sometimes categorised as primary, secondary or tertiary sources depending on whether or not they are ‘original’ materials or data. For some research projects, you may need to use primary sources as well as secondary or tertiary sources. However the distinction between primary and secondary sources is not always clear and depends on the context. For example, a newspaper article might usually be categorised as a secondary source. But it could also be regarded as a primary source if it were an article giving a first-hand account of a historical event written close to the time it occurred.

  • Primary sources
  • Secondary sources
  • Tertiary sources
  • Grey literature

Primary sources are original sources of information that provide first-hand accounts of what is being experienced or researched. They enable you to get as close to the actual event or research as possible. They are useful for getting the most contemporary information about a topic.

Examples include diary entries, newspaper articles, census data, journal articles with original reports of research, letters, email or other correspondence, original manuscripts and archives, interviews, research data and reports, statistics, autobiographies, exhibitions, films, and artists' writings.

Some information will be available on an Open Access basis, freely accessible online. However, many academic sources are paywalled, and you may need to login as a Leeds Beckett student to access them. Where Leeds Beckett does not have access to a source, you can use our  Request It! Service .

Secondary sources interpret, evaluate or analyse primary sources. They're useful for providing background information on a topic, or for looking back at an event from a current perspective. The majority of your literature searching will probably be done to find secondary sources on your topic.

Examples include journal articles which review or interpret original findings, popular magazine articles commenting on more serious research, textbooks and biographies.

The term tertiary sources isn't used a great deal. There's overlap between what might be considered a secondary source and a tertiary source. One definition is that a tertiary source brings together secondary sources.

Examples include almanacs, fact books, bibliographies, dictionaries and encyclopaedias, directories, indexes and abstracts. They can be useful for introductory information or an overview of a topic in the early stages of research.

Depending on your subject of study, grey literature may be another source you need to use. Grey literature includes technical or research reports, theses and dissertations, conference papers, government documents, white papers, and so on.

Artificial intelligence tools

Before using any generative artificial intelligence or paraphrasing tools in your assessments, you should check if this is permitted on your course.

If their use is permitted on your course, you must  acknowledge any use of generative artificial intelligence tools  such as ChatGPT or paraphrasing tools (e.g., Grammarly, Quillbot, etc.), even if you have only used them to generate ideas for your assessments or for proofreading.

  • Academic Integrity Module in MyBeckett
  • Assignment Calculator
  • Building on Feedback
  • Disability Advice
  • Essay X-ray tool
  • International Students' Academic Introduction
  • Manchester Academic Phrasebank
  • Quote, Unquote
  • Skills and Subject Suppor t
  • Turnitin Grammar Checker

{{You can add more boxes below for links specific to this page [this note will not appear on user pages] }}

  • Research Methods Checklist
  • Sampling Checklist

Skills for Learning FAQs

Library & Student Services

0113 812 1000

  • University Disclaimer
  • Accessibility

Data Collection, Presentation and Analysis

  • First Online: 25 May 2023

Cite this chapter

Book cover

  • Uche M. Mbanaso 4 ,
  • Lucienne Abrahams 5 &
  • Kennedy Chinedu Okafor 6  

506 Accesses

This chapter covers the topics of data collection, data presentation and data analysis. It gives attention to data collection for studies based on experiments, on data derived from existing published or unpublished data sets, on observation, on simulation and digital twins, on surveys, on interviews and on focus group discussions. One of the interesting features of this chapter is the section dealing with using measurement scales in quantitative research, including nominal scales, ordinal scales, interval scales and ratio scales. It explains key facets of qualitative research including ethical clearance requirements. The chapter discusses the importance of data visualization as key to effective presentation of data, including tabular forms, graphical forms and visual charts such as those generated by Atlas.ti analytical software.

This is a preview of subscription content, log in via an institution to check access.

Access this chapter

  • Available as PDF
  • Read on any device
  • Instant download
  • Own it forever
  • Available as EPUB and PDF
  • Durable hardcover edition
  • Dispatched in 3 to 5 business days
  • Free shipping worldwide - see info

Tax calculation will be finalised at checkout

Purchases are for personal use only

Institutional subscriptions

Bibliography

Abdullah, M. F., & Ahmad, K. (2013). The mapping process of unstructured data to structured data. Proceedings of the 2013 International Conference on Research and Innovation in Information Systems (ICRIIS) , Malaysia , 151–155. https://doi.org/10.1109/ICRIIS.2013.6716700

Adnan, K., & Akbar, R. (2019). An analytical study of information extraction from unstructured and multidimensional big data. Journal of Big Data, 6 , 91. https://doi.org/10.1186/s40537-019-0254-8

Article   Google Scholar  

Alsheref, F. K., & Fattoh, I. E. (2020). Medical text annotation tool based on IBM Watson Platform. Proceedings of the 2020 6th international conference on advanced computing and communication systems (ICACCS) , India , 1312–1316. https://doi.org/10.1109/ICACCS48705.2020.9074309

Cinque, M., Cotroneo, D., Della Corte, R., & Pecchia, A. (2014). What logs should you look at when an application fails? Insights from an industrial case study. Proceedings of the 2014 44th Annual IEEE/IFIP International Conference on Dependable Systems and Networks , USA , 690–695. https://doi.org/10.1109/DSN.2014.69

Gideon, L. (Ed.). (2012). Handbook of survey methodology for the social sciences . Springer.

Google Scholar  

Leedy, P., & Ormrod, J. (2015). Practical research planning and design (12th ed.). Pearson Education.

Madaan, A., Wang, X., Hall, W., & Tiropanis, T. (2018). Observing data in IoT worlds: What and how to observe? In Living in the Internet of Things: Cybersecurity of the IoT – 2018 (pp. 1–7). https://doi.org/10.1049/cp.2018.0032

Chapter   Google Scholar  

Mahajan, P., & Naik, C. (2019). Development of integrated IoT and machine learning based data collection and analysis system for the effective prediction of agricultural residue/biomass availability to regenerate clean energy. Proceedings of the 2019 9th International Conference on Emerging Trends in Engineering and Technology – Signal and Information Processing (ICETET-SIP-19) , India , 1–5. https://doi.org/10.1109/ICETET-SIP-1946815.2019.9092156 .

Mahmud, M. S., Huang, J. Z., Salloum, S., Emara, T. Z., & Sadatdiynov, K. (2020). A survey of data partitioning and sampling methods to support big data analysis. Big Data Mining and Analytics, 3 (2), 85–101. https://doi.org/10.26599/BDMA.2019.9020015

Miswar, S., & Kurniawan, N. B. (2018). A systematic literature review on survey data collection system. Proceedings of the 2018 International Conference on Information Technology Systems and Innovation (ICITSI) , Indonesia , 177–181. https://doi.org/10.1109/ICITSI.2018.8696036

Mosina, C. (2020). Understanding the diffusion of the internet: Redesigning the global diffusion of the internet framework (Research report, Master of Arts in ICT Policy and Regulation). LINK Centre, University of the Witwatersrand. https://hdl.handle.net/10539/30723

Nkamisa, S. (2021). Investigating the integration of drone management systems to create an enabling remote piloted aircraft regulatory environment in South Africa (Research report, Master of Arts in ICT Policy and Regulation). LINK Centre, University of the Witwatersrand. https://hdl.handle.net/10539/33883

QuestionPro. (2020). Survey research: Definition, examples and methods . https://www.questionpro.com/article/survey-research.html

Rajanikanth, J. & Kanth, T. V. R. (2017). An explorative data analysis on Bangalore City Weather with hybrid data mining techniques using R. Proceedings of the 2017 International Conference on Current Trends in Computer, Electrical, Electronics and Communication (CTCEEC) , India , 1121-1125. https://doi/10.1109/CTCEEC.2017.8455008

Rao, R. (2003). From unstructured data to actionable intelligence. IT Professional, 5 , 29–35. https://www.researchgate.net/publication/3426648_From_Unstructured_Data_to_Actionable_Intelligence

Schulze, P. (2009). Design of the research instrument. In P. Schulze (Ed.), Balancing exploitation and exploration: Organizational antecedents and performance effects of innovation strategies (pp. 116–141). Gabler. https://doi.org/10.1007/978-3-8349-8397-8_6

Usanov, A. (2015). Assessing cybersecurity: A meta-analysis of threats, trends and responses to cyber attacks . The Hague Centre for Strategic Studies. https://www.researchgate.net/publication/319677972_Assessing_Cyber_Security_A_Meta-analysis_of_Threats_Trends_and_Responses_to_Cyber_Attacks

Van de Kaa, G., De Vries, H. J., van Heck, E., & van den Ende, J. (2007). The emergence of standards: A meta-analysis. Proceedings of the 2007 40th Annual Hawaii International Conference on Systems Science (HICSS’07) , USA , 173a–173a. https://doi.org/10.1109/HICSS.2007.529

Download references

Author information

Authors and affiliations.

Centre for Cybersecurity Studies, Nasarawa State University, Keffi, Nigeria

Uche M. Mbanaso

LINK Centre, University of the Witwatersrand, Johannesburg, South Africa

Lucienne Abrahams

Department of Mechatronics Engineering, Federal University of Technology, Owerri, Nigeria

Kennedy Chinedu Okafor

You can also search for this author in PubMed   Google Scholar

Rights and permissions

Reprints and permissions

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Mbanaso, U.M., Abrahams, L., Okafor, K.C. (2023). Data Collection, Presentation and Analysis. In: Research Techniques for Computer Science, Information Systems and Cybersecurity. Springer, Cham. https://doi.org/10.1007/978-3-031-30031-8_7

Download citation

DOI : https://doi.org/10.1007/978-3-031-30031-8_7

Published : 25 May 2023

Publisher Name : Springer, Cham

Print ISBN : 978-3-031-30030-1

Online ISBN : 978-3-031-30031-8

eBook Packages : Engineering Engineering (R0)

Share this chapter

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Publish with us

Policies and ethics

  • Find a journal
  • Track your research

Logo for Rhode Island College Digital Publishing

Want to create or adapt books like this? Learn more about how Pressbooks supports open publishing practices.

Qualitative Data Analysis

23 Presenting the Results of Qualitative Analysis

Mikaila Mariel Lemonik Arthur

Qualitative research is not finished just because you have determined the main findings or conclusions of your study. Indeed, disseminating the results is an essential part of the research process. By sharing your results with others, whether in written form as scholarly paper or an applied report or in some alternative format like an oral presentation, an infographic, or a video, you ensure that your findings become part of the ongoing conversation of scholarship in your field, forming part of the foundation for future researchers. This chapter provides an introduction to writing about qualitative research findings. It will outline how writing continues to contribute to the analysis process, what concerns researchers should keep in mind as they draft their presentations of findings, and how best to organize qualitative research writing

As you move through the research process, it is essential to keep yourself organized. Organizing your data, memos, and notes aids both the analytical and the writing processes. Whether you use electronic or physical, real-world filing and organizational systems, these systems help make sense of the mountains of data you have and assure you focus your attention on the themes and ideas you have determined are important (Warren and Karner 2015). Be sure that you have kept detailed notes on all of the decisions you have made and procedures you have followed in carrying out research design, data collection, and analysis, as these will guide your ultimate write-up.

First and foremost, researchers should keep in mind that writing is in fact a form of thinking. Writing is an excellent way to discover ideas and arguments and to further develop an analysis. As you write, more ideas will occur to you, things that were previously confusing will start to make sense, and arguments will take a clear shape rather than being amorphous and poorly-organized. However, writing-as-thinking cannot be the final version that you share with others. Good-quality writing does not display the workings of your thought process. It is reorganized and revised (more on that later) to present the data and arguments important in a particular piece. And revision is totally normal! No one expects the first draft of a piece of writing to be ready for prime time. So write rough drafts and memos and notes to yourself and use them to think, and then revise them until the piece is the way you want it to be for sharing.

Bergin (2018) lays out a set of key concerns for appropriate writing about research. First, present your results accurately, without exaggerating or misrepresenting. It is very easy to overstate your findings by accident if you are enthusiastic about what you have found, so it is important to take care and use appropriate cautions about the limitations of the research. You also need to work to ensure that you communicate your findings in a way people can understand, using clear and appropriate language that is adjusted to the level of those you are communicating with. And you must be clear and transparent about the methodological strategies employed in the research. Remember, the goal is, as much as possible, to describe your research in a way that would permit others to replicate the study. There are a variety of other concerns and decision points that qualitative researchers must keep in mind, including the extent to which to include quantification in their presentation of results, ethics, considerations of audience and voice, and how to bring the richness of qualitative data to life.

Quantification, as you have learned, refers to the process of turning data into numbers. It can indeed be very useful to count and tabulate quantitative data drawn from qualitative research. For instance, if you were doing a study of dual-earner households and wanted to know how many had an equal division of household labor and how many did not, you might want to count those numbers up and include them as part of the final write-up. However, researchers need to take care when they are writing about quantified qualitative data. Qualitative data is not as generalizable as quantitative data, so quantification can be very misleading. Thus, qualitative researchers should strive to use raw numbers instead of the percentages that are more appropriate for quantitative research. Writing, for instance, “15 of the 20 people I interviewed prefer pancakes to waffles” is a simple description of the data; writing “75% of people prefer pancakes” suggests a generalizable claim that is not likely supported by the data. Note that mixing numbers with qualitative data is really a type of mixed-methods approach. Mixed-methods approaches are good, but sometimes they seduce researchers into focusing on the persuasive power of numbers and tables rather than capitalizing on the inherent richness of their qualitative data.

A variety of issues of scholarly ethics and research integrity are raised by the writing process. Some of these are unique to qualitative research, while others are more universal concerns for all academic and professional writing. For example, it is essential to avoid plagiarism and misuse of sources. All quotations that appear in a text must be properly cited, whether with in-text and bibliographic citations to the source or with an attribution to the research participant (or the participant’s pseudonym or description in order to protect confidentiality) who said those words. Where writers will paraphrase a text or a participant’s words, they need to make sure that the paraphrase they develop accurately reflects the meaning of the original words. Thus, some scholars suggest that participants should have the opportunity to read (or to have read to them, if they cannot read the text themselves) all sections of the text in which they, their words, or their ideas are presented to ensure accuracy and enable participants to maintain control over their lives.

Audience and Voice

When writing, researchers must consider their audience(s) and the effects they want their writing to have on these audiences. The designated audience will dictate the voice used in the writing, or the individual style and personality of a piece of text. Keep in mind that the potential audience for qualitative research is often much more diverse than that for quantitative research because of the accessibility of the data and the extent to which the writing can be accessible and interesting. Yet individual pieces of writing are typically pitched to a more specific subset of the audience.

Let us consider one potential research study, an ethnography involving participant-observation of the same children both when they are at daycare facility and when they are at home with their families to try to understand how daycare might impact behavior and social development. The findings of this study might be of interest to a wide variety of potential audiences: academic peers, whether at your own academic institution, in your broader discipline, or multidisciplinary; people responsible for creating laws and policies; practitioners who run or teach at day care centers; and the general public, including both people who are interested in child development more generally and those who are themselves parents making decisions about child care for their own children. And the way you write for each of these audiences will be somewhat different. Take a moment and think through what some of these differences might look like.

If you are writing to academic audiences, using specialized academic language and working within the typical constraints of scholarly genres, as will be discussed below, can be an important part of convincing others that your work is legitimate and should be taken seriously. Your writing will be formal. Even if you are writing for students and faculty you already know—your classmates, for instance—you are often asked to imitate the style of academic writing that is used in publications, as this is part of learning to become part of the scholarly conversation. When speaking to academic audiences outside your discipline, you may need to be more careful about jargon and specialized language, as disciplines do not always share the same key terms. For instance, in sociology, scholars use the term diffusion to refer to the way new ideas or practices spread from organization to organization. In the field of international relations, scholars often used the term cascade to refer to the way ideas or practices spread from nation to nation. These terms are describing what is fundamentally the same concept, but they are different terms—and a scholar from one field might have no idea what a scholar from a different field is talking about! Therefore, while the formality and academic structure of the text would stay the same, a writer with a multidisciplinary audience might need to pay more attention to defining their terms in the body of the text.

It is not only other academic scholars who expect to see formal writing. Policymakers tend to expect formality when ideas are presented to them, as well. However, the content and style of the writing will be different. Much less academic jargon should be used, and the most important findings and policy implications should be emphasized right from the start rather than initially focusing on prior literature and theoretical models as you might for an academic audience. Long discussions of research methods should also be minimized. Similarly, when you write for practitioners, the findings and implications for practice should be highlighted. The reading level of the text will vary depending on the typical background of the practitioners to whom you are writing—you can make very different assumptions about the general knowledge and reading abilities of a group of hospital medical directors with MDs than you can about a group of case workers who have a post-high-school certificate. Consider the primary language of your audience as well. The fact that someone can get by in spoken English does not mean they have the vocabulary or English reading skills to digest a complex report. But the fact that someone’s vocabulary is limited says little about their intellectual abilities, so try your best to convey the important complexity of the ideas and findings from your research without dumbing them down—even if you must limit your vocabulary usage.

When writing for the general public, you will want to move even further towards emphasizing key findings and policy implications, but you also want to draw on the most interesting aspects of your data. General readers will read sociological texts that are rich with ethnographic or other kinds of detail—it is almost like reality television on a page! And this is a contrast to busy policymakers and practitioners, who probably want to learn the main findings as quickly as possible so they can go about their busy lives. But also keep in mind that there is a wide variation in reading levels. Journalists at publications pegged to the general public are often advised to write at about a tenth-grade reading level, which would leave most of the specialized terminology we develop in our research fields out of reach. If you want to be accessible to even more people, your vocabulary must be even more limited. The excellent exercise of trying to write using the 1,000 most common English words, available at the Up-Goer Five website ( https://www.splasho.com/upgoer5/ ) does a good job of illustrating this challenge (Sanderson n.d.).

Another element of voice is whether to write in the first person. While many students are instructed to avoid the use of the first person in academic writing, this advice needs to be taken with a grain of salt. There are indeed many contexts in which the first person is best avoided, at least as long as writers can find ways to build strong, comprehensible sentences without its use, including most quantitative research writing. However, if the alternative to using the first person is crafting a sentence like “it is proposed that the researcher will conduct interviews,” it is preferable to write “I propose to conduct interviews.” In qualitative research, in fact, the use of the first person is far more common. This is because the researcher is central to the research project. Qualitative researchers can themselves be understood as research instruments, and thus eliminating the use of the first person in writing is in a sense eliminating information about the conduct of the researchers themselves.

But the question really extends beyond the issue of first-person or third-person. Qualitative researchers have choices about how and whether to foreground themselves in their writing, not just in terms of using the first person, but also in terms of whether to emphasize their own subjectivity and reflexivity, their impressions and ideas, and their role in the setting. In contrast, conventional quantitative research in the positivist tradition really tries to eliminate the author from the study—which indeed is exactly why typical quantitative research avoids the use of the first person. Keep in mind that emphasizing researchers’ roles and reflexivity and using the first person does not mean crafting articles that provide overwhelming detail about the author’s thoughts and practices. Readers do not need to hear, and should not be told, which database you used to search for journal articles, how many hours you spent transcribing, or whether the research process was stressful—save these things for the memos you write to yourself. Rather, readers need to hear how you interacted with research participants, how your standpoint may have shaped the findings, and what analytical procedures you carried out.

Making Data Come Alive

One of the most important parts of writing about qualitative research is presenting the data in a way that makes its richness and value accessible to readers. As the discussion of analysis in the prior chapter suggests, there are a variety of ways to do this. Researchers may select key quotes or images to illustrate points, write up specific case studies that exemplify their argument, or develop vignettes (little stories) that illustrate ideas and themes, all drawing directly on the research data. Researchers can also write more lengthy summaries, narratives, and thick descriptions.

Nearly all qualitative work includes quotes from research participants or documents to some extent, though ethnographic work may focus more on thick description than on relaying participants’ own words. When quotes are presented, they must be explained and interpreted—they cannot stand on their own. This is one of the ways in which qualitative research can be distinguished from journalism. Journalism presents what happened, but social science needs to present the “why,” and the why is best explained by the researcher.

So how do authors go about integrating quotes into their written work? Julie Posselt (2017), a sociologist who studies graduate education, provides a set of instructions. First of all, authors need to remain focused on the core questions of their research, and avoid getting distracted by quotes that are interesting or attention-grabbing but not so relevant to the research question. Selecting the right quotes, those that illustrate the ideas and arguments of the paper, is an important part of the writing process. Second, not all quotes should be the same length (just like not all sentences or paragraphs in a paper should be the same length). Include some quotes that are just phrases, others that are a sentence or so, and others that are longer. We call longer quotes, generally those more than about three lines long, block quotes , and they are typically indented on both sides to set them off from the surrounding text. For all quotes, be sure to summarize what the quote should be telling or showing the reader, connect this quote to other quotes that are similar or different, and provide transitions in the discussion to move from quote to quote and from topic to topic. Especially for longer quotes, it is helpful to do some of this writing before the quote to preview what is coming and other writing after the quote to make clear what readers should have come to understand. Remember, it is always the author’s job to interpret the data. Presenting excerpts of the data, like quotes, in a form the reader can access does not minimize the importance of this job. Be sure that you are explaining the meaning of the data you present.

A few more notes about writing with quotes: avoid patchwriting, whether in your literature review or the section of your paper in which quotes from respondents are presented. Patchwriting is a writing practice wherein the author lightly paraphrases original texts but stays so close to those texts that there is little the author has added. Sometimes, this even takes the form of presenting a series of quotes, properly documented, with nothing much in the way of text generated by the author. A patchwriting approach does not build the scholarly conversation forward, as it does not represent any kind of new contribution on the part of the author. It is of course fine to paraphrase quotes, as long as the meaning is not changed. But if you use direct quotes, do not edit the text of the quotes unless how you edit them does not change the meaning and you have made clear through the use of ellipses (…) and brackets ([])what kinds of edits have been made. For example, consider this exchange from Matthew Desmond’s (2012:1317) research on evictions:

The thing was, I wasn’t never gonna let Crystal come and stay with me from the get go. I just told her that to throw her off. And she wasn’t fittin’ to come stay with me with no money…No. Nope. You might as well stay in that shelter.

A paraphrase of this exchange might read “She said that she was going to let Crystal stay with her if Crystal did not have any money.” Paraphrases like that are fine. What is not fine is rewording the statement but treating it like a quote, for instance writing:

The thing was, I was not going to let Crystal come and stay with me from beginning. I just told her that to throw her off. And it was not proper for her to come stay with me without any money…No. Nope. You might as well stay in that shelter.

But as you can see, the change in language and style removes some of the distinct meaning of the original quote. Instead, writers should leave as much of the original language as possible. If some text in the middle of the quote needs to be removed, as in this example, ellipses are used to show that this has occurred. And if a word needs to be added to clarify, it is placed in square brackets to show that it was not part of the original quote.

Data can also be presented through the use of data displays like tables, charts, graphs, diagrams, and infographics created for publication or presentation, as well as through the use of visual material collected during the research process. Note that if visuals are used, the author must have the legal right to use them. Photographs or diagrams created by the author themselves—or by research participants who have signed consent forms for their work to be used, are fine. But photographs, and sometimes even excerpts from archival documents, may be owned by others from whom researchers must get permission in order to use them.

A large percentage of qualitative research does not include any data displays or visualizations. Therefore, researchers should carefully consider whether the use of data displays will help the reader understand the data. One of the most common types of data displays used by qualitative researchers are simple tables. These might include tables summarizing key data about cases included in the study; tables laying out the characteristics of different taxonomic elements or types developed as part of the analysis; tables counting the incidence of various elements; and 2×2 tables (two columns and two rows) illuminating a theory. Basic network or process diagrams are also commonly included. If data displays are used, it is essential that researchers include context and analysis alongside data displays rather than letting them stand by themselves, and it is preferable to continue to present excerpts and examples from the data rather than just relying on summaries in the tables.

If you will be using graphs, infographics, or other data visualizations, it is important that you attend to making them useful and accurate (Bergin 2018). Think about the viewer or user as your audience and ensure the data visualizations will be comprehensible. You may need to include more detail or labels than you might think. Ensure that data visualizations are laid out and labeled clearly and that you make visual choices that enhance viewers’ ability to understand the points you intend to communicate using the visual in question. Finally, given the ease with which it is possible to design visuals that are deceptive or misleading, it is essential to make ethical and responsible choices in the construction of visualization so that viewers will interpret them in accurate ways.

The Genre of Research Writing

As discussed above, the style and format in which results are presented depends on the audience they are intended for. These differences in styles and format are part of the genre of writing. Genre is a term referring to the rules of a specific form of creative or productive work. Thus, the academic journal article—and student papers based on this form—is one genre. A report or policy paper is another. The discussion below will focus on the academic journal article, but note that reports and policy papers follow somewhat different formats. They might begin with an executive summary of one or a few pages, include minimal background, focus on key findings, and conclude with policy implications, shifting methods and details about the data to an appendix. But both academic journal articles and policy papers share some things in common, for instance the necessity for clear writing, a well-organized structure, and the use of headings.

So what factors make up the genre of the academic journal article in sociology? While there is some flexibility, particularly for ethnographic work, academic journal articles tend to follow a fairly standard format. They begin with a “title page” that includes the article title (often witty and involving scholarly inside jokes, but more importantly clearly describing the content of the article); the authors’ names and institutional affiliations, an abstract , and sometimes keywords designed to help others find the article in databases. An abstract is a short summary of the article that appears both at the very beginning of the article and in search databases. Abstracts are designed to aid readers by giving them the opportunity to learn enough about an article that they can determine whether it is worth their time to read the complete text. They are written about the article, and thus not in the first person, and clearly summarize the research question, methodological approach, main findings, and often the implications of the research.

After the abstract comes an “introduction” of a page or two that details the research question, why it matters, and what approach the paper will take. This is followed by a literature review of about a quarter to a third the length of the entire paper. The literature review is often divided, with headings, into topical subsections, and is designed to provide a clear, thorough overview of the prior research literature on which a paper has built—including prior literature the new paper contradicts. At the end of the literature review it should be made clear what researchers know about the research topic and question, what they do not know, and what this new paper aims to do to address what is not known.

The next major section of the paper is the section that describes research design, data collection, and data analysis, often referred to as “research methods” or “methodology.” This section is an essential part of any written or oral presentation of your research. Here, you tell your readers or listeners “how you collected and interpreted your data” (Taylor, Bogdan, and DeVault 2016:215). Taylor, Bogdan, and DeVault suggest that the discussion of your research methods include the following:

  • The particular approach to data collection used in the study;
  • Any theoretical perspective(s) that shaped your data collection and analytical approach;
  • When the study occurred, over how long, and where (concealing identifiable details as needed);
  • A description of the setting and participants, including sampling and selection criteria (if an interview-based study, the number of participants should be clearly stated);
  • The researcher’s perspective in carrying out the study, including relevant elements of their identity and standpoint, as well as their role (if any) in research settings; and
  • The approach to analyzing the data.

After the methods section comes a section, variously titled but often called “data,” that takes readers through the analysis. This section is where the thick description narrative; the quotes, broken up by theme or topic, with their interpretation; the discussions of case studies; most data displays (other than perhaps those outlining a theoretical model or summarizing descriptive data about cases); and other similar material appears. The idea of the data section is to give readers the ability to see the data for themselves and to understand how this data supports the ultimate conclusions. Note that all tables and figures included in formal publications should be titled and numbered.

At the end of the paper come one or two summary sections, often called “discussion” and/or “conclusion.” If there is a separate discussion section, it will focus on exploring the overall themes and findings of the paper. The conclusion clearly and succinctly summarizes the findings and conclusions of the paper, the limitations of the research and analysis, any suggestions for future research building on the paper or addressing these limitations, and implications, be they for scholarship and theory or policy and practice.

After the end of the textual material in the paper comes the bibliography, typically called “works cited” or “references.” The references should appear in a consistent citation style—in sociology, we often use the American Sociological Association format (American Sociological Association 2019), but other formats may be used depending on where the piece will eventually be published. Care should be taken to ensure that in-text citations also reflect the chosen citation style. In some papers, there may be an appendix containing supplemental information such as a list of interview questions or an additional data visualization.

Note that when researchers give presentations to scholarly audiences, the presentations typically follow a format similar to that of scholarly papers, though given time limitations they are compressed. Abstracts and works cited are often not part of the presentation, though in-text citations are still used. The literature review presented will be shortened to only focus on the most important aspects of the prior literature, and only key examples from the discussion of data will be included. For long or complex papers, sometimes only one of several findings is the focus of the presentation. Of course, presentations for other audiences may be constructed differently, with greater attention to interesting elements of the data and findings as well as implications and less to the literature review and methods.

Concluding Your Work

After you have written a complete draft of the paper, be sure you take the time to revise and edit your work. There are several important strategies for revision. First, put your work away for a little while. Even waiting a day to revise is better than nothing, but it is best, if possible, to take much more time away from the text. This helps you forget what your writing looks like and makes it easier to find errors, mistakes, and omissions. Second, show your work to others. Ask them to read your work and critique it, pointing out places where the argument is weak, where you may have overlooked alternative explanations, where the writing could be improved, and what else you need to work on. Finally, read your work out loud to yourself (or, if you really need an audience, try reading to some stuffed animals). Reading out loud helps you catch wrong words, tricky sentences, and many other issues. But as important as revision is, try to avoid perfectionism in writing (Warren and Karner 2015). Writing can always be improved, no matter how much time you spend on it. Those improvements, however, have diminishing returns, and at some point the writing process needs to conclude so the writing can be shared with the world.

Of course, the main goal of writing up the results of a research project is to share with others. Thus, researchers should be considering how they intend to disseminate their results. What conferences might be appropriate? Where can the paper be submitted? Note that if you are an undergraduate student, there are a wide variety of journals that accept and publish research conducted by undergraduates. Some publish across disciplines, while others are specific to disciplines. Other work, such as reports, may be best disseminated by publication online on relevant organizational websites.

After a project is completed, be sure to take some time to organize your research materials and archive them for longer-term storage. Some Institutional Review Board (IRB) protocols require that original data, such as interview recordings, transcripts, and field notes, be preserved for a specific number of years in a protected (locked for paper or password-protected for digital) form and then destroyed, so be sure that your plans adhere to the IRB requirements. Be sure you keep any materials that might be relevant for future related research or for answering questions people may ask later about your project.

And then what? Well, then it is time to move on to your next research project. Research is a long-term endeavor, not a one-time-only activity. We build our skills and our expertise as we continue to pursue research. So keep at it.

  • Find a short article that uses qualitative methods. The sociological magazine Contexts is a good place to find such pieces. Write an abstract of the article.
  • Choose a sociological journal article on a topic you are interested in that uses some form of qualitative methods and is at least 20 pages long. Rewrite the article as a five-page research summary accessible to non-scholarly audiences.
  • Choose a concept or idea you have learned in this course and write an explanation of it using the Up-Goer Five Text Editor ( https://www.splasho.com/upgoer5/ ), a website that restricts your writing to the 1,000 most common English words. What was this experience like? What did it teach you about communicating with people who have a more limited English-language vocabulary—and what did it teach you about the utility of having access to complex academic language?
  • Select five or more sociological journal articles that all use the same basic type of qualitative methods (interviewing, ethnography, documents, or visual sociology). Using what you have learned about coding, code the methods sections of each article, and use your coding to figure out what is common in how such articles discuss their research design, data collection, and analysis methods.
  • Return to an exercise you completed earlier in this course and revise your work. What did you change? How did revising impact the final product?
  • Find a quote from the transcript of an interview, a social media post, or elsewhere that has not yet been interpreted or explained. Write a paragraph that includes the quote along with an explanation of its sociological meaning or significance.

The style or personality of a piece of writing, including such elements as tone, word choice, syntax, and rhythm.

A quotation, usually one of some length, which is set off from the main text by being indented on both sides rather than being placed in quotation marks.

A classification of written or artistic work based on form, content, and style.

A short summary of a text written from the perspective of a reader rather than from the perspective of an author.

Social Data Analysis Copyright © 2021 by Mikaila Mariel Lemonik Arthur is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.

tableau.com is not available in your region.

skillfine

  • Certifications

Home

Data Analysis 101: How to Make Your Presentations Practical and Effective

  • December 27, 2022
  • 53 Comments

what is not included in presentation and analysis of data

Understanding Importance of Data Analysis

The results of data analysis can give business the vital insights they need to turn in to successful and profitable ventures. It could be the difference between a successful business operation and a business operation that is in trouble.

Data analysis, though one of the most in-demand job roles globally, doesn’t require a degree in statistics or mathematics to do well, and employers from a wide variety of industries are very keen to recruit data analysts.

Businesses hire data analysts in the field of finance, marketing, administration, HR, IT and procurement, to name just a few.  Understand the big picture and provide answers. By engaging in data analysis, you can actually delve deep and discover hidden truths that most business people would never be able to do.

What skills you should master to be a data analyst?

While Data Analyst roles are on the rise, there are certain skills that are vital for anyone who wants to become a data analyst . Before the job, a candidate needs to have either a degree in statistics, business or computer science or a related subject, or work experience in these areas. 

If you’re interested in becoming a data analyst, you’ll need to know: 

  • Programming and algorithms
  • Data Visualization 
  • Open-source and cloud technologies 
  • No coding experience is required. 

How much is a data analyst worth?  Data analysts earn an average salary of £32,403 per annum, according to jobs site Glassdoor. This pays for a salary, with benefits such as medical insurance and paid leave included in the starting salary.  If you think you have the right skills, there are plenty of roles on offer.

What data analysis entails

Data analysis is an analytical process which involves recording and tabulating (recording and entering, entering and tabulating) the quantities of a product, such as numbers of units produced, costs of materials and expenses.

While data analyst can take different forms, for example in databases, in other structures such as spreadsheets, numbers are the main means of data entry. This involves entering and entering the required data in a data analysis system such as Excel.

For example, although a database doesn’t require a data analyst, it can still benefit from data analysis techniques such as binomial testing, ANOVA and Fisher’s exact tests.  Where is the data analysis courses in IT?  Given the ever-increasing reliance on technology in business, data analysis courses are vital skills.

What are the types of data analysis methods?

  • Cluster analysis 

The act of grouping a specific set of data in a manner that those elements are more similar to one another than to those in other groups – hence the term ‘cluster.’ Since there is no special target variable while doing clustering, the method is often used to find hidden patterns in the data. The approach is purposely used to offer additional context to a particular trend or dataset.  

  • Cohort analysis 

This type of data analysis method uses historical data to examine and compare a determined segment of users’ behavior, which can then be grouped with others with similar characteristics. By using this data analysis methodology, it’s possible to gain a wealth of insight into consumer needs or a firm understanding of a broader target group.

A dependent variable is an element of a complex system that is assumed to have a single cause, but it’s affected by multiple factors, thus giving researchers an indication as to how a complex system function.  

  • Regression analysis

The regression analysis is used to predict how the value of a dependent variable changes when one or more independent variables change, stay the same or the dependent variable is not moved. Regression is a sophisticated statistical method that includes mathematical functions that are typically called “segmentation,” “distribution,” and “intercept” functions.

Regression is a type of regression analysis that only contains linear and quadratic functions. You can change the types of factors (or the independent variables) that are selected in regression analysis (it’s typically called “nonlinear regression analysis”) by changing the order in which the models are constructed.To begin, let’s explain how regression analysis works.  

Examples in business world

The Oracle Corporation is one of the first multinational companies to adopt this type of analysis method, based on which the company was able to develop predictive modelling systems for marketing purposes.

In a more specific sense, a Regression analysis is a popular type of data analysis used for analyzing the likelihood that a random variable will move up or down a range of parameters in response to a change in a specific control variable.

Companies who use this type of analysis are looking for trends and patterned performance over time. For example, how a company may respond to a rising cost of labor and its effect on its business bottom line, a weather-related issue like an earthquake, a new advertising campaign, or even a surge in customer demand in some areas.

What are basic pointers to consider while presenting data

Recognize that presentation matters.

Too often, analysts make the mistake of presenting information in order to show an abstracted version of it.  For instance, say a B2B company has 4 ways to improve their sales funnel:

  • More Visually Engaging 
  • More Easily Transacted 
  • More Cost Effective 

Then, “informative” would mean that a B2B company needs to optimize their sales funnel to each of these to be more “convenient, faster, easier, more visually engaging, or most cost effective.” Sure, it would be nice if they all improved – they would all provide a competitive advantage in some way. But that’s not what the data tells us.

Don’t scare people with numbers

When you’re presenting data, show as many as possible, in as many charts as possible. Then, try to talk through the implications of the data, rather than overwhelming people with an overwhelming amount of data.

Why? Research suggests that when a number is presented in a visual, people become more likely to process it and learn from it.  I recommend using video, text, graphs, and pictures to represent your numbers. This creates a more visually appealing data set. The number of followers on Twitter is visually appealing. The number of followers on Facebook is visually appealing. But nobody looks at their Twitter followers. If you don’t know what your numbers mean, how will your audience?  That doesn’t mean numbers aren’t important.

Maximize the data pixel ratio

The more data you show to a critical stakeholder, the more likely they are to get lost and distracted from what you’re actually trying to communicate. This is especially important in the case of people in the sales and marketing function.

Do you have a sales person out in the field who is trying to close a deal? It would be a shame if that person got lost in your Excel analytics and lost out on the sale.  This problem also occurs on the web.

Consider how web visitors respond to large, colorful charts and graphs. If we’re talking about visualizations that depict web performance, a visual might be helpful. But how often do we see this done?  Research shows that people respond better to web-based data in a simplified, less complex format.

Save 3-D for the movies

There are great stories in the universe. This is an oversimplification, but if you look at history, humans only understand stories. We are great storytellers. We develop, through trial and error, our own intuition about the “right” way to tell stories.

 One of the most powerful and effective ways to present data is to go beyond the visual to the audible, that is, to tell stories in a way that people can relate to. Everything you hear about computers being a series of numbers is wrong. We visualize numbers in a precise, quantitative way. But the numbers are not a collection of isolated events. To understand them, we need to understand the broader context.

Friends don’t let friends use pie charts

Businesses and analysts have done this since pie charts first appeared on Microsoft Excel sheets. When presenting data, break down your pie chart into its component segments.

 As opposed to an equal-sized circle for the average earnings for all the employees, share a pie chart where the percentages for each individual segment are different, with a link to the corresponding chart.

 Pair with explanatory text, show their correlation, and make your choice based on your audience, not on whether you want to scare or “educate” them. The majority of audiences will see the same image, regardless of whether it’s presented in a bar chart, bar chart, line chart, or something else.

Choose the appropriate chart

Does the data make logical sense? Check your assumptions against the data.  Are the graphs charting only part of the story? Include other variables in the graphs.  Avoid using axis labels to mislead. Never rely on axes to infer, “logical” conclusions.  Trust your eyes: you know what information your brain can process.

Think of numbers like music — they are pleasing, but not overwhelming.  Save 3D for the movies. When everyone is enjoying 4K, 8K, and beyond, it’s hard to envision your audience without the new stuff. I remember the first time I got to see HDTV. At home, I sat behind a chair and kept turning around to watch the TV. But at the theatre, I didn’t need a chair. All I had to do was look up, and see the giant screen, the contrast, and the detail.

Don’t mix chart types for no reason

Excel chart s with colored areas help people focus. Arrows give us scale. Assume your audience doesn’t understand what you’re saying, even if they do. Nobody wants to open a recipe book to learn how to cook soup. Instead, we start with a recipe.

Use a formula to communicate your analysis with as few words as possible. Keep it simple.  Resist the urge to over-complicate your presentation. A word cloud is not a word cloud. A bar chart is not a bar chart. If you use a word cloud to illustrate a chart, consider replacing a few words with a gif. A bar chart doesn’t need clouds. And a bar chart doesn’t need clouds.  If there’s one thing that’s sure to confuse your audience, it’s bar charts.

Use color with intention

Use color with intention. It’s not about pretty. When it comes to presenting data clearly, “informative” is more important than “beautiful.” 

However, visualizations like maps, axes, or snapshots can help visual communication to avoid this pitfall. If you are going to show a few locations on a map, make sure each location has a voice and uses a distinct color. Avoid repeating colors from the map or bottom bar in all the visuals. Be consistent with how you present the data .  A pie chart is not very interesting if all it shows is a bunch of varying sizes of the pie.

Data analysis in the workplace, and how it will impact the future of business

Business leaders are taking note of the importance of data analysis skills in their organisation, as it can make an enormous impact on business.

 Larger organisations such as Google, Amazon and Facebook employ huge teams of analysts to create their data and statistics. We are already seeing the rise of the next generation of big data analysts – those who can write code that analyses and visualizes the data and report back information to a company to help it improve efficiency and increase revenue. 

The increasing need for high-level understanding of data analysis has already led to the role of data analyst becoming available at university level. It is no longer a mandatory business qualification but one that can enhance your CV.

By understanding the importance of each variable, you can improve your business by managing your time and creating more effective systems and processes for running your business. The focus shifts from just providing services to providing value to your customers, creating a better, more intuitive experience for them so they can work with your company for the long-term. 

Adopting these small steps will allow you to be more effective in your business and go from being an employee to an entrepreneur.

Share This Post:

53 thoughts on “data analysis 101: how to make your presentations practical and effective”.

what is not included in presentation and analysis of data

Buy Zyvox Online – Special offer: Save up to $498 – buy antibiotics online and get discount for all purchased!

what is not included in presentation and analysis of data

Thanks again for the post.Thanks Again. Cool.

what is not included in presentation and analysis of data

Thanks for great information. What trips can you recommend in 2024? Astro tourism, eco diving, home swapping, train stations are the new food destinations,sports tourism, coolcationing, gig tripping, private group travel?

what is not included in presentation and analysis of data

Enjoyed every bit of your article.Really looking forward to read more. Great.

what is not included in presentation and analysis of data

Really appreciate you sharing this post. Want more.

what is not included in presentation and analysis of data

I am so grateful for your blog. Really Great.

what is not included in presentation and analysis of data

Hey, thanks for the blog post.

what is not included in presentation and analysis of data

Great, thanks for sharing this blog.Thanks Again. Will read on…

what is not included in presentation and analysis of data

Thanks for sharing, this is a fantastic blog.Thanks Again. Great.

what is not included in presentation and analysis of data

Thanks for sharing, this is a fantastic article.Really thank you! Fantastic.

what is not included in presentation and analysis of data

Major thankies for the article post.Thanks Again. Will read on…

what is not included in presentation and analysis of data

Hey, thanks for the blog post. Cool.

what is not included in presentation and analysis of data

A round of applause for your blog post.Really looking forward to read more. Cool.

what is not included in presentation and analysis of data

Appreciate you sharing, great article post. Great.

what is not included in presentation and analysis of data

wow, awesome blog.Much thanks again. Cool.

what is not included in presentation and analysis of data

Say, you got a nice blog.Really thank you! Cool.

what is not included in presentation and analysis of data

Enjoyed every bit of your blog article.Much thanks again. Want more.

what is not included in presentation and analysis of data

A round of applause for your blog post.Much thanks again. Cool.

what is not included in presentation and analysis of data

I’m not sure where you’re getting your info, but good topic. I needs to spend some time learning more or understanding more. Thanks for wonderful info I was looking for this info for my mission.

what is not included in presentation and analysis of data

Im thankful for the blog post.Much thanks again.

what is not included in presentation and analysis of data

A big thank you for your article.Really thank you!

what is not included in presentation and analysis of data

I truly appreciate this article post.Really looking forward to read more. Really Cool.

what is not included in presentation and analysis of data

I really enjoy the article post.Much thanks again.

what is not included in presentation and analysis of data

wow, awesome blog.Thanks Again. Will read on…

what is not included in presentation and analysis of data

Awesome blog post.Much thanks again. Much obliged.

what is not included in presentation and analysis of data

Im thankful for the blog.Much thanks again. Want more.

what is not included in presentation and analysis of data

Thanks a lot for the post.Much thanks again. Want more.

what is not included in presentation and analysis of data

I really liked your article.Really thank you! Really Cool.

what is not included in presentation and analysis of data

A round of applause for your post.Thanks Again. Much obliged.

what is not included in presentation and analysis of data

Say, you got a nice article.Really thank you! Fantastic.

what is not included in presentation and analysis of data

I value the blog article. Really Cool.

what is not included in presentation and analysis of data

A round of applause for your blog article.Really looking forward to read more. Great.

what is not included in presentation and analysis of data

Really appreciate you sharing this blog article. Really Cool.

what is not included in presentation and analysis of data

Really informative article. Really Great.

what is not included in presentation and analysis of data

Fantastic article post.Really looking forward to read more. Really Great.

what is not included in presentation and analysis of data

I really liked your post.Much thanks again. Much obliged.

what is not included in presentation and analysis of data

Beneficial document helps make frequent advance, appreciate it write about, this pile-up connected with expertise is usually to hold finding out, focus is usually the beginning of money.

what is not included in presentation and analysis of data

I really liked your blog article.Much thanks again. Really Great.

what is not included in presentation and analysis of data

I really liked your article post.Really thank you! Fantastic.

what is not included in presentation and analysis of data

Great, thanks for sharing this blog.Thanks Again. Awesome.

what is not included in presentation and analysis of data

Enjoyed every bit of your blog article. Cool.

what is not included in presentation and analysis of data

I really enjoy the blog. Want more.

what is not included in presentation and analysis of data

Im grateful for the blog.Thanks Again. Awesome.

what is not included in presentation and analysis of data

Very neat blog.Thanks Again. Cool.

what is not included in presentation and analysis of data

Muchos Gracias for your blog article.

what is not included in presentation and analysis of data

Im grateful for the post.Really looking forward to read more. Keep writing.

what is not included in presentation and analysis of data

Great, thanks for sharing this post.Really looking forward to read more. Cool.

what is not included in presentation and analysis of data

Thank you ever so for you article post.Really looking forward to read more. Will read on…

what is not included in presentation and analysis of data

Great article post.Much thanks again. Great.

what is not included in presentation and analysis of data

I really like and appreciate your blog post.Thanks Again. Awesome.

what is not included in presentation and analysis of data

Really appreciate you sharing this blog. Really Cool.

what is not included in presentation and analysis of data

Greetings! Incredibly helpful suggestions within this short article! It’s the very little adjustments that make the greatest modifications. Quite a few many thanks for sharing!

what is not included in presentation and analysis of data

Awesome blog.Really looking forward to read more. Much obliged.

Add a Comment Cancel reply

Save my name, email, and website in this browser for the next time I comment.

Get A 5X Raise In Salary

what is not included in presentation and analysis of data

Reset Password

Insert/edit link.

Enter the destination URL

Or link to existing content

Logo for Open Educational Resources

Chapter 20. Presentations

Introduction.

If a tree falls in a forest, and no one is around to hear it, does it make a sound? If a qualitative study is conducted, but it is not presented (in words or text), did it really happen? Perhaps not. Findings from qualitative research are inextricably tied up with the way those findings are presented. These presentations do not always need to be in writing, but they need to happen. Think of ethnographies, for example, and their thick descriptions of a particular culture. Witnessing a culture, taking fieldnotes, talking to people—none of those things in and of themselves convey the culture. Or think about an interview-based phenomenological study. Boxes of interview transcripts might be interesting to read through, but they are not a completed study without the intervention of hours of analysis and careful selection of exemplary quotes to illustrate key themes and final arguments and theories. And unlike much quantitative research in the social sciences, where the final write-up neatly reports the results of analyses, the way the “write-up” happens is an integral part of the analysis in qualitative research. Once again, we come back to the messiness and stubborn unlinearity of qualitative research. From the very beginning, when designing the study, imagining the form of its ultimate presentation is helpful.

Because qualitative researchers are motivated by understanding and conveying meaning, effective communication is not only an essential skill but a fundamental facet of the entire research project. Ethnographers must be able to convey a certain sense of verisimilitude, the appearance of true reality. Those employing interviews must faithfully depict the key meanings of the people they interviewed in a way that rings true to those people, even if the end result surprises them. And all researchers must strive for clarity in their publications so that various audiences can understand what was found and why it is important. This chapter will address how to organize various kinds of presentations for different audiences so that your results can be appreciated and understood.

In the world of academic science, social or otherwise, the primary audience for a study’s results is usually the academic community, and the primary venue for communicating to this audience is the academic journal. Journal articles are typically fifteen to thirty pages in length (8,000 to 12,000 words). Although qualitative researchers often write and publish journal articles—indeed, there are several journals dedicated entirely to qualitative research [1] —the best writing by qualitative researchers often shows up in books. This is because books, running from 80,000 to 150,000 words in length, allow the researcher to develop the material fully. You have probably read some of these in various courses you have taken, not realizing what they are. I have used examples of such books throughout this text, beginning with the three profiles in the introductory chapter. In some instances, the chapters in these books began as articles in academic journals (another indication that the journal article format somewhat limits what can be said about the study overall).

While the article and the book are “final” products of qualitative research, there are actually a few other presentation formats that are used along the way. At the very beginning of a research study, it is often important to have a written research proposal not just to clarify to yourself what you will be doing and when but also to justify your research to an outside agency, such as an institutional review board (IRB; see chapter 12), or to a potential funder, which might be your home institution, a government funder (such as the National Science Foundation, or NSF), or a private foundation (such as the Gates Foundation). As you get your research underway, opportunities will arise to present preliminary findings to audiences, usually through presentations at academic conferences. These presentations can provide important feedback as you complete your analyses. Finally, if you are completing a degree and looking to find an academic job, you will be asked to provide a “job talk,” usually about your research. These job talks are similar to conference presentations but can run significantly longer.

All the presentations mentioned so far are (mostly) for academic audiences. But qualitative research is also unique in that many of its practitioners don’t want to confine their presentation only to other academics. Qualitative researchers who study particular contexts or cultures might want to report back to the people and places they observed. Those working in the critical tradition might want to raise awareness of a particular issue to as large an audience as possible. Many others simply want everyday, nonacademic people to read their work, because they think it is interesting and important. To reach a wide audience, the final product can look like almost anything—it can be a poem, a blog, a podcast, even a science fiction short story. And if you are very lucky, it can even be a national or international bestseller.

In this chapter, we are going to stick with the more basic quotidian presentations—the academic paper / research proposal, the conference slideshow presentation / job talk, and the conference poster. We’ll also spend a bit of time on incorporating universal design into your presentations and how to create some especially attractive and impactful visual displays.

Researcher Note

What is the best piece of advice you’ve ever been given about conducting qualitative research?

The best advice I’ve received came from my adviser, Alford Young Jr. He told me to find the “Jessi Streib” answer to my research question, not the “Pierre Bourdieu” answer to my research question. In other words, don’t just say how a famous theorist would answer your question; say something original, something coming from you.

—Jessi Streib, author of The Power of the Past and Privilege Lost 

Writing about Your Research

The journal article and the research proposal.

Although the research proposal is written before you have actually done your research and the article is written after all data collection and analysis is complete, there are actually many similarities between the two in terms of organization and purpose. The final article will (probably—depends on how much the research question and focus have shifted during the research itself) incorporate a great deal of what was included in a preliminary research proposal. The average lengths of both a proposal and an article are quite similar, with the “front sections” of the article abbreviated to make space for the findings, discussion of findings, and conclusion.

Figure 20.1 shows one model for what to include in an article or research proposal, comparing the elements of each with a default word count for each section. Please note that you will want to follow whatever specific guidelines you have been provided by the venue you are submitting the article/proposal to: the IRB, the NSF, the Journal of Qualitative Research . In fact, I encourage you to adapt the default model as needed by swapping out expected word counts for each section and adding or varying the sections to match expectations for your particular publication venue. [2]

You will notice a few things about the default model guidelines. First, while half of the proposal is spent discussing the research design, this section is shortened (but still included) for the article. There are a few elements that only show up in the proposal (e.g., the limitations section is in the introductory section here—it will be more fully developed in the conclusory section in the article). Obviously, you don’t have findings in the proposal, so this is an entirely new section for the article. Note that the article does not include a data management plan or a timeline—two aspects that most proposals require.

It might be helpful to find and maintain examples of successfully written sections that you can use as models for your own writing. I have included a few of these throughout the textbook and have included a few more at the end of this chapter.

Make an Argument

Some qualitative researchers, particularly those engaged in deep ethnographic research, focus their attention primarily if not exclusively on describing the data. They might even eschew the notion that they should make an “argument” about the data, preferring instead to use thick descriptions to convey interpretations. Bracketing the contrast between interpretation and argument for the moment, most readers will expect you to provide an argument about your data, and this argument will be in answer to whatever research question you eventually articulate (remember, research questions are allowed to shift as you get further into data collection and analysis). It can be frustrating to read a well-developed study with clear and elegant descriptions and no argument. The argument is the point of the research, and if you do not have one, 99 percent of the time, you are not finished with your analysis. Calarco ( 2020 ) suggests you imagine a pyramid, with all of your data forming the basis and all of your findings forming the middle section; the top/point of the pyramid is your argument, “what the patterns in your data tell us about how the world works or ought to work” ( 181 ).

The academic community to which you belong will be looking for an argument that relates to or develops theory. This is the theoretical generalizability promise of qualitative research. An academic audience will want to know how your findings relate to previous findings, theories, and concepts (the literature review; see chapter 9). It is thus vitally important that you go back to your literature review (or develop a new one) and draw those connections in your discussion and/or conclusion. When writing to other audiences, you will still want an argument, although it may not be written as a theoretical one. What do I mean by that? Even if you are not referring to previous literature or developing new theories or adapting older ones, a simple description of your findings is like dumping a lot of leaves in the lap of your audience. They still deserve to know about the shape of the forest. Maybe provide them a road map through it. Do this by telling a clear and cogent story about the data. What is the primary theme, and why is it important? What is the point of your research? [3]

A beautifully written piece of research based on participant observation [and/or] interviews brings people to life, and helps the reader understand the challenges people face. You are trying to use vivid, detailed and compelling words to help the reader really understand the lives of the people you studied. And you are trying to connect the lived experiences of these people to a broader conceptual point—so that the reader can understand why it matters. ( Lareau 2021:259 )

Do not hide your argument. Make it the focal point of your introductory section, and repeat it as often as needed to ensure the reader remembers it. I am always impressed when I see researchers do this well (see, e.g., Zelizer 1996 ).

Here are a few other suggestions for writing your article: Be brief. Do not overwhelm the reader with too many words; make every word count. Academics are particularly prone to “overwriting” as a way of demonstrating proficiency. Don’t. When writing your methods section, think about it as a “recipe for your work” that allows other researchers to replicate if they so wish ( Calarco 2020:186 ). Convey all the necessary information clearly, succinctly, and accurately. No more, no less. [4] Do not try to write from “beginning to end” in that order. Certain sections, like the introductory section, may be the last ones you write. I find the methods section the easiest, so I often begin there. Calarco ( 2020 ) begins with an outline of the analysis and results section and then works backward from there to outline the contribution she is making, then the full introduction that serves as a road map for the writing of all sections. She leaves the abstract for the very end. Find what order best works for you.

Presenting at Conferences and Job Talks

Students and faculty are primarily called upon to publicly present their research in two distinct contexts—the academic conference and the “job talk.” By convention, conference presentations usually run about fifteen minutes and, at least in sociology and other social sciences, rely primarily on the use of a slideshow (PowerPoint Presentation or PPT) presentation. You are usually one of three or four presenters scheduled on the same “panel,” so it is an important point of etiquette to ensure that your presentation falls within the allotted time and does not crowd into that of the other presenters. Job talks, on the other hand, conventionally require a forty- to forty-five-minute presentation with a fifteen- to twenty-minute question and answer (Q&A) session following it. You are the only person presenting, so if you run over your allotted time, it means less time for the Q&A, which can disturb some audience members who have been waiting for a chance to ask you something. It is sometimes possible to incorporate questions during your presentation, which allows you to take the entire hour, but you might end up shorting your presentation this way if the questions are numerous. It’s best for beginners to stick to the “ask me at the end” format (unless there is a simple clarifying question that can easily be addressed and makes the presentation run more smoothly, as in the case where you simply forgot to include information on the number of interviews you conducted).

For slideshows, you should allot two or even three minutes for each slide, never less than one minute. And those slides should be clear, concise, and limited. Most of what you say should not be on those slides at all. The slides are simply the main points or a clear image of what you are speaking about. Include bulleted points (words, short phrases), not full sentences. The exception is illustrative quotations from transcripts or fieldnotes. In those cases, keep to one illustrative quote per slide, and if it is long, bold or otherwise, highlight the words or passages that are most important for the audience to notice. [5]

Figure 20.2 provides a possible model for sections to include in either a conference presentation or a job talk, with approximate times and approximate numbers of slides. Note the importance (in amount of time spent) of both the research design and the findings/results sections, both of which have been helpfully starred for you. Although you don’t want to short any of the sections, these two sections are the heart of your presentation.

Fig 20.2. Suggested Slideshow Times and Number of Slides

Should you write out your script to read along with your presentation? I have seen this work well, as it prevents presenters from straying off topic and keeps them to the time allotted. On the other hand, these presentations can seem stiff and wooden. Personally, although I have a general script in advance, I like to speak a little more informally and engagingly with each slide, sometimes making connections with previous panelists if I am at a conference. This means I have to pay attention to the time, and I sometimes end up breezing through one section more quickly than I would like. Whatever approach you take, practice in advance. Many times. With an audience. Ask for feedback, and pay attention to any presentation issues that arise (e.g., Do you speak too fast? Are you hard to hear? Do you stumble over a particular word or name?).

Even though there are rules and guidelines for what to include, you will still want to make your presentation as engaging as possible in the little amount of time you have. Calarco ( 2020:274 ) recommends trying one of three story structures to frame your presentation: (1) the uncertain explanation , where you introduce a phenomenon that has not yet been fully explained and then describe how your research is tackling this; (2) the uncertain outcome , where you introduce a phenomenon where the consequences have been unclear and then you reveal those consequences with your research; and (3) the evocative example , where you start with some interesting example from your research (a quote from the interview transcripts, for example) or the real world and then explain how that example illustrates the larger patterns you found in your research. Notice that each of these is a framing story. Framing stories are essential regardless of format!

A Word on Universal Design

Please consider accessibility issues during your presentation, and incorporate elements of universal design into your slideshow. The basic idea behind universal design in presentations is that to the greatest extent possible, all people should be able to view, hear, or otherwise take in your presentation without needing special individual adaptations. If you can make your presentation accessible to people with visual impairment or hearing loss, why not do so? For example, one in twelve men is color-blind, unable to differentiate between certain colors, red/green being the most common problem. So if you design a graphic that relies on red and green bars, some of your audience members may not be able to properly identify which bar means what. Simple contrasts of black and white are much more likely to be visible to all members of your audience. There are many other elements of good universal design, but the basic foundation of all of them is that you consider how to make your presentation as accessible as possible at the outset. For example, include captions whenever possible, both as descriptions on slides and as images on slides and for any audio or video clips you are including; keep font sizes large enough to read from the back of the room; and face the audience when you are.

Poster Design

Undergraduate students who present at conferences are often encouraged to present at “poster sessions.” This usually means setting up a poster version of your research in a large hall or convention space at a set period of time—ninety minutes is common. Your poster will be one of dozens, and conference-goers will wander through the space, stopping intermittently at posters that attract them. Those who stop by might ask you questions about your research, and you are expected to be able to talk intelligently for two or three minutes. It’s a fairly easy way to practice presenting at conferences, which is why so many organizations hold these special poster sessions.

Null

A good poster design will be immediately attractive to passersby and clearly and succinctly describe your research methods, findings, and conclusions. Some students have simply shrunk down their research papers to manageable sizes and then pasted them on a poster, all twelve to fifteen pages of them. Don’t do that! Here are some better suggestions: State the main conclusion of your research in large bold print at the top of your poster, on brightly colored (contrasting) paper, and paste in a QR code that links to your full paper online ( Calarco 2020:280 ). Use the rest of the poster board to provide a couple of highlights and details of the study. For an interview-based study, for example, you will want to put in some details about your sample (including number of interviews) and setting and then perhaps one or two key quotes, also distinguished by contrasting color background.

Incorporating Visual Design in Your Presentations

In addition to ensuring that your presentation is accessible to as large an audience as possible, you also want to think about how to display your data in general, particularly how to use charts and graphs and figures. [6] The first piece of advice is, use them! As the saying goes, a picture is worth a thousand words. If you can cut to the chase with a visually stunning display, do so. But there are visual displays that are stunning, and then there are the tired, hard-to-see visual displays that predominate at conferences. You can do better than most presenters by simply paying attention here and committing yourself to a good design. As with model section passages, keep a file of visual displays that work as models for your own presentations. Find a good guidebook to presenting data effectively (Evergreen 2018 , 2019 ; Schwabisch 2021) , and refer to it often.

Let me make a few suggestions here to get you started. First, test every visual display on a friend or colleague to find out how quickly they can understand the point you are trying to convey. As with reading passages aloud to ensure that your writing works, showing someone your display is the quickest way to find out if it works. Second, put the point in the title of the display! When writing for an academic journal, there will be specific conventions of what to include in the title (full description including methods of analysis, sample, dates), but in a public presentation, there are no limiting rules. So you are free to write as your title “Working-Class College Students Are Three Times as Likely as Their Peers to Drop Out of College,” if that is the point of the graphic display. It certainly helps the communicative aspect. Third, use the themes available to you in Excel for creating graphic displays, but alter them to better fit your needs . Consider adding dark borders to bars and columns, for example, so that they appear crisper for your audience. Include data callouts and labels, and enlarge them so they are clearly visible. When duplicative or otherwise unnecessary, drop distracting gridlines and labels on the y-axis (the vertical one). Don’t go crazy adding different fonts, however—keep things simple and clear. Sans serif fonts (those without the little hooks on the ends of letters) read better from a distance. Try to use the same color scheme throughout, even if this means manually changing the colors of bars and columns. For example, when reporting on working-class college students, I use blue bars, while I reserve green bars for wealthy students and yellow bars for students in the middle. I repeat these colors throughout my presentations and incorporate different colors when talking about other items or factors. You can also try using simple grayscale throughout, with pops of color to indicate a bar or column or line that is of the most interest. These are just some suggestions. The point is to take presentation seriously and to pay attention to visual displays you are using to ensure they effectively communicate what you want them to communicate. I’ve included a data visualization checklist from Evergreen ( 2018 ) here.

Ethics of Presentation and Reliability

Until now, all the data you have collected have been yours alone. Once you present the data, however, you are sharing sometimes very intimate information about people with a broader public. You will find yourself balancing between protecting the privacy of those you’ve interviewed and observed and needing to demonstrate the reliability of the study. The more information you provide to your audience, the more they can understand and appreciate what you have found, but this also may pose risks to your participants. There is no one correct way to go about finding the right balance. As always, you have a duty to consider what you are doing and must make some hard decisions.

Null

The most obvious place we see this paradox emerge is when you mask your data to protect the privacy of your participants. It is standard practice to provide pseudonyms, for example. It is such standard practice that you should always assume you are being given a pseudonym when reading a book or article based on qualitative research. When I was a graduate student, I tried to find information on how best to construct pseudonyms but found little guidance. There are some ethical issues here, I think. [7] Do you create a name that has the same kind of resonance as the original name? If the person goes by a nickname, should you use a nickname as a pseudonym? What about names that are ethnically marked (as in, almost all of them)? Is there something unethical about reracializing a person? (Yes!) In her study of adolescent subcultures, Wilkins ( 2008 ) noted, “Because many of the goths used creative, alternative names rather than their given names, I did my best to reproduce the spirit of their chosen names” ( 24 ).

Your reader or audience will want to know all the details about your participants so that they can gauge both your credibility and the reliability of your findings. But how many details are too many? What if you change the name but otherwise retain all the personal pieces of information about where they grew up, and how old they were when they got married, and how many children they have, and whether they made a splash in the news cycle that time they were stalked by their ex-boyfriend? At some point, those details are going to tip over into the zone of potential unmasking. When you are doing research at one particular field site that may be easily ascertained (as when you interview college students, probably at the institution at which you are a student yourself), it is even more important to be wary of providing too many details. You also need to think that your participants might read what you have written, know things about the site or the population from which you drew your interviews, and figure out whom you are talking about. This can all get very messy if you don’t do more than simply pseudonymize the people you interviewed or observed.

There are some ways to do this. One, you can design a study with all of these risks in mind. That might mean choosing to conduct interviews or observations at multiple sites so that no one person can be easily identified. Another is to alter some basic details about your participants to protect their identity or to refuse to provide all the information when selecting quotes . Let’s say you have an interviewee named “Anna” (a pseudonym), and she is a twenty-four-year-old Latina studying to be an engineer. You want to use a quote from Anna about racial discrimination in her graduate program. Instead of attributing the quote to Anna (whom your reader knows, because you’ve already told them, is a twenty-four-year-old Latina studying engineering), you might simply attribute the quote to “Latina student in STEM.” Taking this a step further, you might leave the quote unattributed, providing a list of quotes about racial discrimination by “various students.”

The problem with masking all the identifiers, of course, is that you lose some of the analytical heft of those attributes. If it mattered that Anna was twenty-four (not thirty-four) and that she was a Latina and that she was studying engineering, taking out any of those aspects of her identity might weaken your analysis. This is one of those “hard choices” you will be called on to make! A rather radical and controversial solution to this dilemma is to create composite characters , characters based on the reality of the interviews but fully masked because they are not identifiable with any one person. My students are often very queasy about this when I explain it to them. The more positivistic your approach and the more you see individuals rather than social relationships/structure as the “object” of your study, the more employing composites will seem like a really bad idea. But composites “allow researchers to present complex, situated accounts from individuals” without disclosing personal identities ( Willis 2019 ), and they can be effective ways of presenting theory narratively ( Hurst 2019 ). Ironically, composites permit you more latitude when including “dirty laundry” or stories that could harm individuals if their identities became known. Rather than squeezing out details that could identify a participant, the identities are permanently removed from the details. Great difficulty remains, however, in clearly explaining the theoretical use of composites to your audience and providing sufficient information on the reliability of the underlying data.

There are a host of other ethical issues that emerge as you write and present your data. This is where being reflective throughout the process will help. How and what you share of what you have learned will depend on the social relationships you have built, the audiences you are writing or speaking to, and the underlying animating goals of your study. Be conscious about all of your decisions, and then be able to explain them fully, both to yourself and to those who ask.

Our research is often close to us. As a Black woman who is a first-generation college student and a professional with a poverty/working-class origin, each of these pieces of my identity creates nuances in how I engage in my research, including how I share it out. Because of this, it’s important for us to have people in our lives who we trust who can help us, particularly, when we are trying to share our findings. As researchers, we have been steeped in our work, so we know all the details and nuances. Sometimes we take this for granted, and we might not have shared those nuances in conversation or writing or taken some of this information for granted. As I share my research with trusted friends and colleagues, I pay attention to the questions they ask me or the feedback they give when we talk or when they read drafts.

—Kim McAloney, PhD, College Student Services Administration Ecampus coordinator and instructor

Final Comments: Preparing for Being Challenged

Once you put your work out there, you must be ready to be challenged. Science is a collective enterprise and depends on a healthy give and take among researchers. This can be both novel and difficult as you get started, but the more you understand the importance of these challenges, the easier it will be to develop the kind of thick skin necessary for success in academia. Scientists’ authority rests on both the inherent strength of their findings and their ability to convince other scientists of the reliability and validity and value of those findings. So be prepared to be challenged, and recognize this as simply another important aspect of conducting research!

Considering what challenges might be made as you design and conduct your study will help you when you get to the writing and presentation stage. Address probable challenges in your final article, and have a planned response to probable questions in a conference presentation or job talk. The following is a list of common challenges of qualitative research and how you might best address them:

  • Questions about generalizability . Although qualitative research is not statistically generalizable (and be prepared to explain why), qualitative research is theoretically generalizable. Discuss why your findings here might tell us something about related phenomena or contexts.
  • Questions about reliability . You probably took steps to ensure the reliability of your findings. Discuss them! This includes explaining the use and value of multiple data sources and defending your sampling and case selections. It also means being transparent about your own position as researcher and explaining steps you took to ensure that what you were seeing was really there.
  • Questions about replicability. Although qualitative research cannot strictly be replicated because the circumstances and contexts will necessarily be different (if only because the point in time is different), you should be able to provide as much detail as possible about how the study was conducted so that another researcher could attempt to confirm or disconfirm your findings. Also, be very clear about the limitations of your study, as this allows other researchers insight into what future research might be warranted.

None of this is easy, of course. Writing beautifully and presenting clearly and cogently require skill and practice. If you take anything from this chapter, it is to remember that presentation is an important and essential part of the research process and to allocate time for this as you plan your research.

Data Visualization Checklist for Slideshow (PPT) Presentations

Adapted from Evergreen ( 2018 )

Text checklist

  • Short catchy, descriptive titles (e.g., “Working-class students are three times as likely to drop out of college”) summarize the point of the visual display
  • Subtitled and annotations provide additional information (e.g., “note: male students also more likely to drop out”)
  • Text size is hierarchical and readable (titles are largest; axes labels smallest, which should be at least 20points)
  • Text is horizontal. Audience members cannot read vertical text!
  • All data labeled directly and clearly: get rid of those “legends” and embed the data in your graphic display
  • Labels are used sparingly; avoid redundancy (e.g., do not include both a number axis and a number label)

Arrangement checklist

  • Proportions are accurate; bar charts should always start at zero; don’t mislead the audience!
  • Data are intentionally ordered (e.g., by frequency counts). Do not leave ragged alphabetized bar graphs!
  • Axis intervals are equidistant: spaces between axis intervals should be the same unit
  • Graph is two-dimensional. Three-dimensional and “bevelled” displays are confusing
  • There is no unwanted decoration (especially the kind that comes automatically through the PPT “theme”). This wastes your space and confuses.

Color checklist

  • There is an intentional color scheme (do not use default theme)
  • Color is used to identify key patterns (e.g., highlight one bar in red against six others in greyscale if this is the bar you want the audience to notice)
  • Color is still legible when printed in black and white
  • Color is legible for people with color blindness (do not use red/green or yellow/blue combinations)
  • There is sufficient contrast between text and background (black text on white background works best; be careful of white on dark!)

Lines checklist

  • Be wary of using gridlines; if you do, mute them (grey, not black)
  • Allow graph to bleed into surroundings (don’t use border lines)
  • Remove axis lines unless absolutely necessary (better to label directly)

Overall design checklist

  • The display highlights a significant finding or conclusion that your audience can ‘”see” relatively quickly
  • The type of graph (e.g., bar chart, pie chart, line graph) is appropriate for the data. Avoid pie charts with more than three slices!
  • Graph has appropriate level of precision; if you don’t need decimal places
  • All the chart elements work together to reinforce the main message

Universal Design Checklist for Slideshow (PPT) Presentations

  • Include both verbal and written descriptions (e.g., captions on slides); consider providing a hand-out to accompany the presentation
  • Microphone available (ask audience in back if they can clearly hear)
  • Face audience; allow people to read your lips
  • Turn on captions when presenting audio or video clips
  • Adjust light settings for visibility
  • Speak slowly and clearly; practice articulation; don’t mutter or speak under your breath (even if you have something humorous to say – say it loud!)
  • Use Black/White contrasts for easy visibility; or use color contrasts that are real contrasts (do not rely on people being able to differentiate red from green, for example)
  • Use easy to read font styles and avoid too small font sizes: think about what an audience member in the back row will be able to see and read.
  • Keep your slides simple: do not overclutter them; if you are including quotes from your interviews, take short evocative snippets only, and bold key words and passages. You should also read aloud each passage, preferably with feeling!

Supplement: Models of Written Sections for Future Reference

Data collection section example.

Interviews were semi structured, lasted between one and three hours, and took place at a location chosen by the interviewee. Discussions centered on four general topics: (1) knowledge of their parent’s immigration experiences; (2) relationship with their parents; (3) understanding of family labor, including language-brokering experiences; and (4) experiences with school and peers, including any future life plans. While conducting interviews, I paid close attention to respondents’ nonverbal cues, as well as their use of metaphors and jokes. I conducted interviews until I reached a point of saturation, as indicated by encountering repeated themes in new interviews (Glaser and Strauss 1967). Interviews were audio recorded, transcribed with each interviewee’s permission, and conducted in accordance with IRB protocols. Minors received permission from their parents before participation in the interview. ( Kwon 2022:1832 )

Justification of Case Selection / Sample Description Section Example

Looking at one profession within one organization and in one geographic area does impose limitations on the generalizability of our findings. However, it also has advantages. We eliminate the problem of interorganizational heterogeneity. If multiple organizations are studied simultaneously, it can make it difficult to discern the mechanisms that contribute to racial inequalities. Even with a single occupation there is considerable heterogeneity, which may make understanding how organizational structure impacts worker outcomes difficult. By using the case of one group of professionals in one religious denomination in one geographic region of the United States, we clarify how individuals’ perceptions and experiences of occupational inequality unfold in relation to a variety of observed and unobserved occupational and contextual factors that might be obscured in a larger-scale study. Focusing on a specific group of professionals allows us to explore and identify ways that formal organizational rules combine with informal processes to contribute to the persistence of racial inequality. ( Eagle and Mueller 2022:1510–1511 )

Ethics Section Example

I asked everyone who was willing to sit for a formal interview to speak only for themselves and offered each of them a prepaid Visa Card worth $25–40. I also offered everyone the opportunity to keep the card and erase the tape completely at any time they were dissatisfied with the interview in any way. No one asked for the tape to be erased; rather, people remarked on the interview being a really good experience because they felt heard. Each interview was professionally transcribed and for the most part the excerpts are literal transcriptions. In a few places, the excerpts have been edited to reduce colloquial features of speech (e.g., you know, like, um) and some recursive elements common to spoken language. A few excerpts were placed into standard English for clarity. I made this choice for the benefit of readers who might otherwise find the insights and ideas harder to parse in the original. However, I have to acknowledge this as an act of class-based violence. I tried to keep the original phrasing whenever possible. ( Pascale 2021:235 )

Further Readings

Calarco, Jessica McCrory. 2020. A Field Guide to Grad School: Uncovering the Hidden Curriculum . Princeton, NJ: Princeton University Press. Don’t let the unassuming title mislead you—there is a wealth of helpful information on writing and presenting data included here in a highly accessible manner. Every graduate student should have a copy of this book.

Edwards, Mark. 2012. Writing in Sociology . Thousand Oaks, CA: SAGE. An excellent guide to writing and presenting sociological research by an Oregon State University professor. Geared toward undergraduates and useful for writing about either quantitative or qualitative research or both.

Evergreen, Stephanie D. H. 2018. Presenting Data Effectively: Communicating Your Findings for Maximum Impact . Thousand Oaks, CA: SAGE. This is one of my very favorite books, and I recommend it highly for everyone who wants their presentations and publications to communicate more effectively than the boring black-and-white, ragged-edge tables and figures academics are used to seeing.

Evergreen, Stephanie D. H. 2019. Effective Data Visualization 2 . Thousand Oaks, CA: SAGE. This is an advanced primer for presenting clean and clear data using graphs, tables, color, font, and so on. Start with Evergreen (2018), and if you graduate from that text, move on to this one.

Schwabisch, Jonathan. 2021. Better Data Visualizations: A Guide for Scholars, Researchers, and Wonks . New York: Columbia University Press. Where Evergreen’s (2018, 2019) focus is on how to make the best visual displays possible for effective communication, this book is specifically geared toward visual displays of academic data, both quantitative and qualitative. If you want to know when it is appropriate to use a pie chart instead of a stacked bar chart, this is the reference to use.

  • Some examples: Qualitative Inquiry , Qualitative Research , American Journal of Qualitative Research , Ethnography , Journal of Ethnographic and Qualitative Research , Qualitative Report , Qualitative Sociology , and Qualitative Studies . ↵
  • This is something I do with every article I write: using Excel, I write each element of the expected article in a separate row, with one column for “expected word count” and another column for “actual word count.” I fill in the actual word count as I write. I add a third column for “comments to myself”—how things are progressing, what I still need to do, and so on. I then use the “sum” function below each of the first two columns to keep a running count of my progress relative to the final word count. ↵
  • And this is true, I would argue, even when your primary goal is to leave space for the voices of those who don’t usually get a chance to be part of the conversation. You will still want to put those voices in some kind of choir, with a clear direction (song) to be sung. The worst thing you can do is overwhelm your audience with random quotes or long passages with no key to understanding them. Yes, a lot of metaphors—qualitative researchers love metaphors! ↵
  • To take Calarco’s recipe analogy further, do not write like those food bloggers who spend more time discussing the color of their kitchen or the experiences they had at the market than they do the actual cooking; similarly, do not write recipes that omit crucial details like the amount of flour or the size of the baking pan used or the temperature of the oven. ↵
  • The exception is the “compare and contrast” of two or more quotes, but use caution here. None of the quotes should be very long at all (a sentence or two each). ↵
  • Although this section is geared toward presentations, many of the suggestions could also be useful when writing about your data. Don’t be afraid to use charts and graphs and figures when writing your proposal, article, thesis, or dissertation. At the very least, you should incorporate a tabular display of the participants, sites, or documents used. ↵
  • I was so puzzled by these kinds of questions that I wrote one of my very first articles on it ( Hurst 2008 ). ↵

The visual presentation of data or information through graphics such as charts, graphs, plots, infographics, maps, and animation.  Recall the best documentary you ever viewed, and there were probably excellent examples of good data visualization there (for me, this was An Inconvenient Truth , Al Gore’s film about climate change).  Good data visualization allows more effective communication of findings of research, particularly in public presentations (e.g., slideshows).

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.

  • Skip to main content
  • Skip to "About this site"

Language selection

  • Français
  • Search and menus

Publications

Statistics canada quality guidelines.

  • Introduction
  • Survey steps
  • More information

Data analysis and presentation

Archived content.

Information identified as archived is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please " contact us " to request a format other than those available.

This page has been archived on the Web.

Scope and purpose Principles Guidelines Quality indicators References

Scope and purpose

Data analysis is the process of developing answers to questions through the examination and interpretation of data.  The basic steps in the analytic process consist of identifying issues, determining the availability of suitable data, deciding on which methods are appropriate for answering the questions of interest, applying the methods and evaluating, summarizing and communicating the results.  

Analytical results underscore the usefulness of data sources by shedding light on relevant issues. Some Statistics Canada programs depend on analytical output as a major data product because, for confidentiality reasons, it is not possible to release the microdata to the public. Data analysis also plays a key role in data quality assessment by pointing to data quality problems in a given survey. Analysis can thus influence future improvements to the survey process.

Data analysis is essential for understanding results from surveys, administrative sources and pilot studies; for providing information on data gaps; for designing and redesigning surveys; for planning new statistical activities; and for formulating quality objectives.

Results of data analysis are often published or summarized in official Statistics Canada releases. 

A statistical agency is concerned with the relevance and usefulness to users of the information contained in its data. Analysis is the principal tool for obtaining information from the data.

Data from a survey can be used for descriptive or analytic studies. Descriptive studies are directed at the estimation of summary measures of a target population, for example, the average profits of owner-operated businesses in 2005 or the proportion of 2007 high school graduates who went on to higher education in the next twelve months.  Analytical studies may be used to explain the behaviour of and relationships among characteristics; for example, a study of risk factors for obesity in children would be analytic. 

To be effective, the analyst needs to understand the relevant issues both current and those likely to emerge in the future and how to present the results to the audience. The study of background information allows the analyst to choose suitable data sources and appropriate statistical methods. Any conclusions presented in an analysis, including those that can impact public policy, must be supported by the data being analyzed.

Initial preparation

Prior to conducting an analytical study the following questions should be addressed:

Objectives. What are the objectives of this analysis? What issue am I addressing? What question(s) will I answer?

Justification. Why is this issue interesting?  How will these answers contribute to existing knowledge? How is this study relevant?

Data. What data am I using? Why it is the best source for this analysis? Are there any limitations?

Analytical methods. What statistical techniques are appropriate? Will they satisfy the objectives?

Audience. Who is interested in this issue and why?

  Suitable data

Ensure that the data are appropriate for the analysis to be carried out.  This requires investigation of a wide range of details such as whether the target population of the data source is sufficiently related to the target population of the analysis, whether the source variables and their concepts and definitions are relevant to the study, whether the longitudinal or cross-sectional nature of the data source is appropriate for the analysis, whether the sample size in the study domain is sufficient to obtain meaningful results and whether the quality of the data, as outlined in the survey documentation or assessed through analysis is sufficient.

 If more than one data source is being used for the analysis, investigate whether the sources are consistent and how they may be appropriately integrated into the analysis.

Appropriate methods and tools

Choose an analytical approach that is appropriate for the question being investigated and the data to be analyzed. 

When analyzing data from a probability sample, analytical methods that ignore the survey design can be appropriate, provided that sufficient model conditions for analysis are met. (See Binder and Roberts, 2003.) However, methods that incorporate the sample design information will generally be effective even when some aspects of the model are incorrectly specified.

Assess whether the survey design information can be incorporated into the analysis and if so how this should be done such as using design-based methods.  See Binder and Roberts (2009) and Thompson (1997) for discussion of approaches to inferences on data from a probability sample.

See Chambers and Skinner (2003), Korn and Graubard (1999), Lehtonen and Pahkinen (1995), Lohr (1999), and Skinner, Holt and Smith (1989) for a number of examples illustrating design-based analytical methods.

For a design-based analysis consult the survey documentation about the recommended approach for variance estimation for the survey. If the data from more than one survey are included in the same analysis, determine whether or not the different samples were independently selected and how this would impact the appropriate approach to variance estimation.

The data files for probability surveys frequently contain more than one weight variable, particularly if the survey is longitudinal or if it has both cross-sectional and longitudinal purposes. Consult the survey documentation and survey experts if it is not obvious as to which might be the best weight to be used in any particular design-based analysis.

When analyzing data from a probability survey, there may be insufficient design information available to carry out analyses using a full design-based approach.  Assess the alternatives.

Consult with experts on the subject matter, on the data source and on the statistical methods if any of these is unfamiliar to you.

Having determined the appropriate analytical method for the data, investigate the software choices that are available to apply the method. If analyzing data from a probability sample by design-based methods, use software specifically for survey data since standard analytical software packages that can produce weighted point estimates do not correctly calculate variances for survey-weighted estimates.

It is advisable to use commercial software, if suitable, for implementing the chosen analyses, since these software packages have usually undergone more testing than non-commercial software.

Determine whether it is necessary to reformat your data in order to use the selected software.

Include a variety of diagnostics among your analytical methods if you are fitting any models to your data.

Refer to the documentation about the data source to determine the degree and types of missing data and the processing of missing data that has been performed.  This information will be a starting point for what further work may be required.

Consider how unit and/or item nonresponse could be handled in the analysis, taking into consideration the degree and types of missing data in the data sources being used.

Consider whether imputed values should be included in the analysis and if so, how they should be handled.  If imputed values are not used, consideration must be given to what other methods may be used to properly account for the effect of nonresponse in the analysis.

If the analysis includes modelling, it could be appropriate to include some aspects of nonresponse in the analytical model.

Report any caveats about how the approaches used to handle missing data could have impact on results

Interpretation of results

Since most analyses are based on observational studies rather than on the results of a controlled experiment, avoid drawing conclusions concerning causality.

When studying changes over time, beware of focusing on short-term trends without inspecting them in light of medium-and long-term trends. Frequently, short-term trends are merely minor fluctuations around a more important medium- and/or long-term trend.

Where possible, avoid arbitrary time reference points. Instead, use meaningful points of reference, such as the last major turning point for economic data, generation-to-generation differences for demographic statistics, and legislative changes for social statistics.

Presentation of results

Focus the article on the important variables and topics. Trying to be too comprehensive will often interfere with a strong story line.

Arrange ideas in a logical order and in order of relevance or importance. Use headings, subheadings and sidebars to strengthen the organization of the article.

Keep the language as simple as the subject permits. Depending on the targeted audience for the article, some loss of precision may sometimes be an acceptable trade-off for more readable text.

Use graphs in addition to text and tables to communicate the message. Use headings that capture the meaning ( e.g. "Women's earnings still trail men's") in preference to traditional chart titles ( e.g. "Income by age and sex"). Always help readers understand the information in the tables and charts by discussing it in the text.

When tables are used, take care that the overall format contributes to the clarity of the data in the tables and prevents misinterpretation.  This includes spacing; the wording, placement and appearance of titles; row and column headings and other labeling. 

Explain rounding practices or procedures. In the presentation of rounded data, do not use more significant digits than are consistent with the accuracy of the data.

Satisfy any confidentiality requirements ( e.g. minimum cell sizes) imposed by the surveys or administrative sources whose data are being analysed.

Include information about the data sources used and any shortcomings in the data that may have affected the analysis.  Either have a section in the paper about the data or a reference to where the reader can get the details.

Include information about the analytical methods and tools used.  Either have a section on methods or a reference to where the reader can get the details.

Include information regarding the quality of the results. Standard errors, confidence intervals and/or coefficients of variation provide the reader important information about data quality. The choice of indicator may vary depending on where the article is published.

Ensure that all references are accurate, consistent and are referenced in the text.

Check for errors in the article. Check details such as the consistency of figures used in the text, tables and charts, the accuracy of external data, and simple arithmetic.

Ensure that the intentions stated in the introduction are fulfilled by the rest of the article. Make sure that the conclusions are consistent with the evidence.

Have the article reviewed by others for relevance, accuracy and comprehensibility, regardless of where it is to be disseminated.  As a good practice, ask someone from the data providing division to review how the data were used.  If the article is to be disseminated outside of Statistics Canada, it must undergo institutional and peer review as specified in the Policy on the Review of Information Products (Statistics Canada, 2003). 

If the article is to be disseminated in a Statistics Canada publication make sure that it complies with the current Statistics Canada Publishing Standards. These standards affect graphs, tables and style, among other things.

As a good practice, consider presenting the results to peers prior to finalizing the text. This is another kind of peer review that can help improve the article. Always do a dry run of presentations involving external audiences.

Refer to available documents that could provide further guidance for improvement of your article, such as Guidelines on Writing Analytical Articles (Statistics Canada 2008 ) and the Style Guide (Statistics Canada 2004)

Quality indicators

Main quality elements:  relevance, interpretability, accuracy, accessibility

An analytical product is relevant if there is an audience who is (or will be) interested in the results of the study.

For the interpretability of an analytical article to be high, the style of writing must suit the intended audience. As well, sufficient details must be provided that another person, if allowed access to the data, could replicate the results.

For an analytical product to be accurate, appropriate methods and tools need to be used to produce the results.

For an analytical product to be accessible, it must be available to people for whom the research results would be useful.

Binder, D.A. and G.R. Roberts. 2003. "Design-based methods for estimating model parameters."  In Analysis of Survey Data. R.L. Chambers and C.J. Skinner ( eds. ) Chichester. Wiley. p. 29-48.

Binder, D.A. and G. Roberts. 2009. "Design and Model Based Inference for Model Parameters." In Handbook of Statistics 29B: Sample Surveys: Inference and Analysis. Pfeffermann, D. and Rao, C.R. ( eds. ) Vol. 29B. Chapter 24. Amsterdam.Elsevier. 666 p.

Chambers, R.L. and C.J. Skinner ( eds. ) 2003. Analysis of Survey Data. Chichester. Wiley. 398 p.

Korn, E.L. and B.I. Graubard. 1999. Analysis of Health Surveys. New York. Wiley. 408 p.

Lehtonen, R. and E.J. Pahkinen. 2004. Practical Methods for Design and Analysis of Complex Surveys.Second edition. Chichester. Wiley.

Lohr, S.L. 1999. Sampling: Design and Analysis. Duxbury Press. 512 p.

Skinner, C.K., D.Holt and T.M.F. Smith. 1989. Analysis of Complex Surveys. Chichester. Wiley. 328 p.

Thompson, M.E. 1997. Theory of Sample Surveys. London. Chapman and Hall. 312 p.

Statistics Canada. 2003. "Policy on the Review of Information Products." Statistics Canada Policy Manual. Section 2.5. Last updated March 4, 2009.

Statistics Canada. 2004. Style Guide.  Last updated October 6, 2004.

Statistics Canada. 2008. Guidelines on Writing Analytical Articles. Last updated September 16, 2008.

Numbers, Facts and Trends Shaping Your World

Read our research on:

Full Topic List

Regions & Countries

  • Publications
  • Our Methods
  • Short Reads
  • Tools & Resources

Read Our Research On:

What the data says about abortion in the U.S.

Pew Research Center has conducted many surveys about abortion over the years, providing a lens into Americans’ views on whether the procedure should be legal, among a host of other questions.

In a  Center survey  conducted nearly a year after the Supreme Court’s June 2022 decision that  ended the constitutional right to abortion , 62% of U.S. adults said the practice should be legal in all or most cases, while 36% said it should be illegal in all or most cases. Another survey conducted a few months before the decision showed that relatively few Americans take an absolutist view on the issue .

Find answers to common questions about abortion in America, based on data from the Centers for Disease Control and Prevention (CDC) and the Guttmacher Institute, which have tracked these patterns for several decades:

How many abortions are there in the U.S. each year?

How has the number of abortions in the u.s. changed over time, what is the abortion rate among women in the u.s. how has it changed over time, what are the most common types of abortion, how many abortion providers are there in the u.s., and how has that number changed, what percentage of abortions are for women who live in a different state from the abortion provider, what are the demographics of women who have had abortions, when during pregnancy do most abortions occur, how often are there medical complications from abortion.

This compilation of data on abortion in the United States draws mainly from two sources: the Centers for Disease Control and Prevention (CDC) and the Guttmacher Institute, both of which have regularly compiled national abortion data for approximately half a century, and which collect their data in different ways.

The CDC data that is highlighted in this post comes from the agency’s “abortion surveillance” reports, which have been published annually since 1974 (and which have included data from 1969). Its figures from 1973 through 1996 include data from all 50 states, the District of Columbia and New York City – 52 “reporting areas” in all. Since 1997, the CDC’s totals have lacked data from some states (most notably California) for the years that those states did not report data to the agency. The four reporting areas that did not submit data to the CDC in 2021 – California, Maryland, New Hampshire and New Jersey – accounted for approximately 25% of all legal induced abortions in the U.S. in 2020, according to Guttmacher’s data. Most states, though,  do  have data in the reports, and the figures for the vast majority of them came from each state’s central health agency, while for some states, the figures came from hospitals and other medical facilities.

Discussion of CDC abortion data involving women’s state of residence, marital status, race, ethnicity, age, abortion history and the number of previous live births excludes the low share of abortions where that information was not supplied. Read the methodology for the CDC’s latest abortion surveillance report , which includes data from 2021, for more details. Previous reports can be found at  stacks.cdc.gov  by entering “abortion surveillance” into the search box.

For the numbers of deaths caused by induced abortions in 1963 and 1965, this analysis looks at reports by the then-U.S. Department of Health, Education and Welfare, a precursor to the Department of Health and Human Services. In computing those figures, we excluded abortions listed in the report under the categories “spontaneous or unspecified” or as “other.” (“Spontaneous abortion” is another way of referring to miscarriages.)

Guttmacher data in this post comes from national surveys of abortion providers that Guttmacher has conducted 19 times since 1973. Guttmacher compiles its figures after contacting every known provider of abortions – clinics, hospitals and physicians’ offices – in the country. It uses questionnaires and health department data, and it provides estimates for abortion providers that don’t respond to its inquiries. (In 2020, the last year for which it has released data on the number of abortions in the U.S., it used estimates for 12% of abortions.) For most of the 2000s, Guttmacher has conducted these national surveys every three years, each time getting abortion data for the prior two years. For each interim year, Guttmacher has calculated estimates based on trends from its own figures and from other data.

The latest full summary of Guttmacher data came in the institute’s report titled “Abortion Incidence and Service Availability in the United States, 2020.” It includes figures for 2020 and 2019 and estimates for 2018. The report includes a methods section.

In addition, this post uses data from StatPearls, an online health care resource, on complications from abortion.

An exact answer is hard to come by. The CDC and the Guttmacher Institute have each tried to measure this for around half a century, but they use different methods and publish different figures.

The last year for which the CDC reported a yearly national total for abortions is 2021. It found there were 625,978 abortions in the District of Columbia and the 46 states with available data that year, up from 597,355 in those states and D.C. in 2020. The corresponding figure for 2019 was 607,720.

The last year for which Guttmacher reported a yearly national total was 2020. It said there were 930,160 abortions that year in all 50 states and the District of Columbia, compared with 916,460 in 2019.

  • How the CDC gets its data: It compiles figures that are voluntarily reported by states’ central health agencies, including separate figures for New York City and the District of Columbia. Its latest totals do not include figures from California, Maryland, New Hampshire or New Jersey, which did not report data to the CDC. ( Read the methodology from the latest CDC report .)
  • How Guttmacher gets its data: It compiles its figures after contacting every known abortion provider – clinics, hospitals and physicians’ offices – in the country. It uses questionnaires and health department data, then provides estimates for abortion providers that don’t respond. Guttmacher’s figures are higher than the CDC’s in part because they include data (and in some instances, estimates) from all 50 states. ( Read the institute’s latest full report and methodology .)

While the Guttmacher Institute supports abortion rights, its empirical data on abortions in the U.S. has been widely cited by  groups  and  publications  across the political spectrum, including by a  number of those  that  disagree with its positions .

These estimates from Guttmacher and the CDC are results of multiyear efforts to collect data on abortion across the U.S. Last year, Guttmacher also began publishing less precise estimates every few months , based on a much smaller sample of providers.

The figures reported by these organizations include only legal induced abortions conducted by clinics, hospitals or physicians’ offices, or those that make use of abortion pills dispensed from certified facilities such as clinics or physicians’ offices. They do not account for the use of abortion pills that were obtained  outside of clinical settings .

(Back to top)

A line chart showing the changing number of legal abortions in the U.S. since the 1970s.

The annual number of U.S. abortions rose for years after Roe v. Wade legalized the procedure in 1973, reaching its highest levels around the late 1980s and early 1990s, according to both the CDC and Guttmacher. Since then, abortions have generally decreased at what a CDC analysis called  “a slow yet steady pace.”

Guttmacher says the number of abortions occurring in the U.S. in 2020 was 40% lower than it was in 1991. According to the CDC, the number was 36% lower in 2021 than in 1991, looking just at the District of Columbia and the 46 states that reported both of those years.

(The corresponding line graph shows the long-term trend in the number of legal abortions reported by both organizations. To allow for consistent comparisons over time, the CDC figures in the chart have been adjusted to ensure that the same states are counted from one year to the next. Using that approach, the CDC figure for 2021 is 622,108 legal abortions.)

There have been occasional breaks in this long-term pattern of decline – during the middle of the first decade of the 2000s, and then again in the late 2010s. The CDC reported modest 1% and 2% increases in abortions in 2018 and 2019, and then, after a 2% decrease in 2020, a 5% increase in 2021. Guttmacher reported an 8% increase over the three-year period from 2017 to 2020.

As noted above, these figures do not include abortions that use pills obtained outside of clinical settings.

Guttmacher says that in 2020 there were 14.4 abortions in the U.S. per 1,000 women ages 15 to 44. Its data shows that the rate of abortions among women has generally been declining in the U.S. since 1981, when it reported there were 29.3 abortions per 1,000 women in that age range.

The CDC says that in 2021, there were 11.6 abortions in the U.S. per 1,000 women ages 15 to 44. (That figure excludes data from California, the District of Columbia, Maryland, New Hampshire and New Jersey.) Like Guttmacher’s data, the CDC’s figures also suggest a general decline in the abortion rate over time. In 1980, when the CDC reported on all 50 states and D.C., it said there were 25 abortions per 1,000 women ages 15 to 44.

That said, both Guttmacher and the CDC say there were slight increases in the rate of abortions during the late 2010s and early 2020s. Guttmacher says the abortion rate per 1,000 women ages 15 to 44 rose from 13.5 in 2017 to 14.4 in 2020. The CDC says it rose from 11.2 per 1,000 in 2017 to 11.4 in 2019, before falling back to 11.1 in 2020 and then rising again to 11.6 in 2021. (The CDC’s figures for those years exclude data from California, D.C., Maryland, New Hampshire and New Jersey.)

The CDC broadly divides abortions into two categories: surgical abortions and medication abortions, which involve pills. Since the Food and Drug Administration first approved abortion pills in 2000, their use has increased over time as a share of abortions nationally, according to both the CDC and Guttmacher.

The majority of abortions in the U.S. now involve pills, according to both the CDC and Guttmacher. The CDC says 56% of U.S. abortions in 2021 involved pills, up from 53% in 2020 and 44% in 2019. Its figures for 2021 include the District of Columbia and 44 states that provided this data; its figures for 2020 include D.C. and 44 states (though not all of the same states as in 2021), and its figures for 2019 include D.C. and 45 states.

Guttmacher, which measures this every three years, says 53% of U.S. abortions involved pills in 2020, up from 39% in 2017.

Two pills commonly used together for medication abortions are mifepristone, which, taken first, blocks hormones that support a pregnancy, and misoprostol, which then causes the uterus to empty. According to the FDA, medication abortions are safe  until 10 weeks into pregnancy.

Surgical abortions conducted  during the first trimester  of pregnancy typically use a suction process, while the relatively few surgical abortions that occur  during the second trimester  of a pregnancy typically use a process called dilation and evacuation, according to the UCLA School of Medicine.

In 2020, there were 1,603 facilities in the U.S. that provided abortions,  according to Guttmacher . This included 807 clinics, 530 hospitals and 266 physicians’ offices.

A horizontal stacked bar chart showing the total number of abortion providers down since 1982.

While clinics make up half of the facilities that provide abortions, they are the sites where the vast majority (96%) of abortions are administered, either through procedures or the distribution of pills, according to Guttmacher’s 2020 data. (This includes 54% of abortions that are administered at specialized abortion clinics and 43% at nonspecialized clinics.) Hospitals made up 33% of the facilities that provided abortions in 2020 but accounted for only 3% of abortions that year, while just 1% of abortions were conducted by physicians’ offices.

Looking just at clinics – that is, the total number of specialized abortion clinics and nonspecialized clinics in the U.S. – Guttmacher found the total virtually unchanged between 2017 (808 clinics) and 2020 (807 clinics). However, there were regional differences. In the Midwest, the number of clinics that provide abortions increased by 11% during those years, and in the West by 6%. The number of clinics  decreased  during those years by 9% in the Northeast and 3% in the South.

The total number of abortion providers has declined dramatically since the 1980s. In 1982, according to Guttmacher, there were 2,908 facilities providing abortions in the U.S., including 789 clinics, 1,405 hospitals and 714 physicians’ offices.

The CDC does not track the number of abortion providers.

In the District of Columbia and the 46 states that provided abortion and residency information to the CDC in 2021, 10.9% of all abortions were performed on women known to live outside the state where the abortion occurred – slightly higher than the percentage in 2020 (9.7%). That year, D.C. and 46 states (though not the same ones as in 2021) reported abortion and residency data. (The total number of abortions used in these calculations included figures for women with both known and unknown residential status.)

The share of reported abortions performed on women outside their state of residence was much higher before the 1973 Roe decision that stopped states from banning abortion. In 1972, 41% of all abortions in D.C. and the 20 states that provided this information to the CDC that year were performed on women outside their state of residence. In 1973, the corresponding figure was 21% in the District of Columbia and the 41 states that provided this information, and in 1974 it was 11% in D.C. and the 43 states that provided data.

In the District of Columbia and the 46 states that reported age data to  the CDC in 2021, the majority of women who had abortions (57%) were in their 20s, while about three-in-ten (31%) were in their 30s. Teens ages 13 to 19 accounted for 8% of those who had abortions, while women ages 40 to 44 accounted for about 4%.

The vast majority of women who had abortions in 2021 were unmarried (87%), while married women accounted for 13%, according to  the CDC , which had data on this from 37 states.

A pie chart showing that, in 2021, majority of abortions were for women who had never had one before.

In the District of Columbia, New York City (but not the rest of New York) and the 31 states that reported racial and ethnic data on abortion to  the CDC , 42% of all women who had abortions in 2021 were non-Hispanic Black, while 30% were non-Hispanic White, 22% were Hispanic and 6% were of other races.

Looking at abortion rates among those ages 15 to 44, there were 28.6 abortions per 1,000 non-Hispanic Black women in 2021; 12.3 abortions per 1,000 Hispanic women; 6.4 abortions per 1,000 non-Hispanic White women; and 9.2 abortions per 1,000 women of other races, the  CDC reported  from those same 31 states, D.C. and New York City.

For 57% of U.S. women who had induced abortions in 2021, it was the first time they had ever had one,  according to the CDC.  For nearly a quarter (24%), it was their second abortion. For 11% of women who had an abortion that year, it was their third, and for 8% it was their fourth or more. These CDC figures include data from 41 states and New York City, but not the rest of New York.

A bar chart showing that most U.S. abortions in 2021 were for women who had previously given birth.

Nearly four-in-ten women who had abortions in 2021 (39%) had no previous live births at the time they had an abortion,  according to the CDC . Almost a quarter (24%) of women who had abortions in 2021 had one previous live birth, 20% had two previous live births, 10% had three, and 7% had four or more previous live births. These CDC figures include data from 41 states and New York City, but not the rest of New York.

The vast majority of abortions occur during the first trimester of a pregnancy. In 2021, 93% of abortions occurred during the first trimester – that is, at or before 13 weeks of gestation,  according to the CDC . An additional 6% occurred between 14 and 20 weeks of pregnancy, and about 1% were performed at 21 weeks or more of gestation. These CDC figures include data from 40 states and New York City, but not the rest of New York.

About 2% of all abortions in the U.S. involve some type of complication for the woman , according to an article in StatPearls, an online health care resource. “Most complications are considered minor such as pain, bleeding, infection and post-anesthesia complications,” according to the article.

The CDC calculates  case-fatality rates for women from induced abortions – that is, how many women die from abortion-related complications, for every 100,000 legal abortions that occur in the U.S .  The rate was lowest during the most recent period examined by the agency (2013 to 2020), when there were 0.45 deaths to women per 100,000 legal induced abortions. The case-fatality rate reported by the CDC was highest during the first period examined by the agency (1973 to 1977), when it was 2.09 deaths to women per 100,000 legal induced abortions. During the five-year periods in between, the figure ranged from 0.52 (from 1993 to 1997) to 0.78 (from 1978 to 1982).

The CDC calculates death rates by five-year and seven-year periods because of year-to-year fluctuation in the numbers and due to the relatively low number of women who die from legal induced abortions.

In 2020, the last year for which the CDC has information , six women in the U.S. died due to complications from induced abortions. Four women died in this way in 2019, two in 2018, and three in 2017. (These deaths all followed legal abortions.) Since 1990, the annual number of deaths among women due to legal induced abortion has ranged from two to 12.

The annual number of reported deaths from induced abortions (legal and illegal) tended to be higher in the 1980s, when it ranged from nine to 16, and from 1972 to 1979, when it ranged from 13 to 63. One driver of the decline was the drop in deaths from illegal abortions. There were 39 deaths from illegal abortions in 1972, the last full year before Roe v. Wade. The total fell to 19 in 1973 and to single digits or zero every year after that. (The number of deaths from legal abortions has also declined since then, though with some slight variation over time.)

The number of deaths from induced abortions was considerably higher in the 1960s than afterward. For instance, there were 119 deaths from induced abortions in  1963  and 99 in  1965 , according to reports by the then-U.S. Department of Health, Education and Welfare, a precursor to the Department of Health and Human Services. The CDC is a division of Health and Human Services.

Note: This is an update of a post originally published May 27, 2022, and first updated June 24, 2022.

Portrait photo of staff

Support for legal abortion is widespread in many countries, especially in Europe

Nearly a year after roe’s demise, americans’ views of abortion access increasingly vary by where they live, by more than two-to-one, americans say medication abortion should be legal in their state, most latinos say democrats care about them and work hard for their vote, far fewer say so of gop, positive views of supreme court decline sharply following abortion ruling, most popular.

1615 L St. NW, Suite 800 Washington, DC 20036 USA (+1) 202-419-4300 | Main (+1) 202-857-8562 | Fax (+1) 202-419-4372 |  Media Inquiries

Research Topics

  • Age & Generations
  • Coronavirus (COVID-19)
  • Economy & Work
  • Family & Relationships
  • Gender & LGBTQ
  • Immigration & Migration
  • International Affairs
  • Internet & Technology
  • Methodological Research
  • News Habits & Media
  • Non-U.S. Governments
  • Other Topics
  • Politics & Policy
  • Race & Ethnicity
  • Email Newsletters

ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

Copyright 2024 Pew Research Center

Terms & Conditions

Privacy Policy

Cookie Settings

Reprints, Permissions & Use Policy

IMAGES

  1. 5 Steps of the Data Analysis Process

    what is not included in presentation and analysis of data

  2. Data Presentation

    what is not included in presentation and analysis of data

  3. Unleashing Insights: Mastering the Art of Research and Data Analysis

    what is not included in presentation and analysis of data

  4. PRESENTATION, ANALYSIS AND INTERPRETATION OF DATA

    what is not included in presentation and analysis of data

  5. The Data Analysis Process

    what is not included in presentation and analysis of data

  6. Data Analysis Process For Visualization And Presentation

    what is not included in presentation and analysis of data

VIDEO

  1. INSTRUCTIONS NOT INCLUDED Review Spot

  2. 1st chapter |sample

  3. Most Popular Computer Operating Systems 1985

  4. How to present research tools, procedures and data analysis techniques

  5. Data Analysis Presentation Template

  6. Class 11 Economics Statistics Chapter 4

COMMENTS

  1. What is Data Analysis? An Expert Guide With Examples

    Data analysis is a comprehensive method of inspecting, cleansing, transforming, and modeling data to discover useful information, draw conclusions, and support decision-making. It is a multifaceted process involving various techniques and methodologies to interpret data from various sources in different formats, both structured and unstructured.

  2. Understanding Data Presentations (Guide + Examples)

    What Should a Data Presentation Include? Data presentations go beyond the mere usage of graphical elements. ... In the histogram data analysis presentation example, imagine an instructor analyzing a class's grades to identify the most common score range. A histogram could effectively display the distribution.

  3. What is data analysis? Methods, techniques, types & how-to

    A method of data analysis that is the umbrella term for engineering metrics and insights for additional value, direction, and context. By using exploratory statistical evaluation, data mining aims to identify dependencies, relations, patterns, and trends to generate advanced knowledge.

  4. Present Your Data Like a Pro

    TheJoelTruth. While a good presentation has data, data alone doesn't guarantee a good presentation. It's all about how that data is presented. The quickest way to confuse your audience is by ...

  5. Data Presentation

    Key Objectives of Data Presentation. Here are some key objectives to think about when presenting financial analysis: Visual communication. Audience and context. Charts, graphs, and images. Focus on important points. Design principles. Storytelling. Persuasiveness.

  6. Data Presentation: A Comprehensive Guide

    Definition: Data presentation is the art of visualizing complex data for better understanding. Importance: Data presentations enhance clarity, engage the audience, aid decision-making, and leave a lasting impact. Types: Textual, Tabular, and Graphical presentations offer various ways to present data.

  7. What Is Data Analysis? (With Examples)

    What Is Data Analysis? (With Examples) Data analysis is the practice of working with data to glean useful information, which can then be used to make informed decisions. "It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts," Sherlock Holme's proclaims ...

  8. Data Analytics: Definition, Uses, Examples, and More

    Data analytics is a multidisciplinary field that employs a wide range of analysis techniques, including math, statistics, and computer science, to draw insights from data sets. Data analytics is a broad term that includes everything from simply analyzing data to theorizing ways of collecting data and creating the frameworks needed to store it.

  9. Ultimate Guide to Using Data Visualization in Your Presentation

    It's not enough to just copy and paste your data into a presentation slide. Luckily, PowerPoint has a lot of smart data visualization tools! You just need to put in your data, and PowerPoint will work it up for you. 1. Collect your data. First things first, and that is to have all your information ready.

  10. How to Create a Successful Data Presentation

    Presentation length. This is my formula to determine how many slides to include in my main presentation assuming I spend about five minutes per slide. (Presentation length in minutes-10 minutes for questions ) / 5 minutes per slide. For an hour presentation that comes out to ( 60-10 ) / 5 = 10 slides.

  11. How To Create A Successful Data Presentation

    Here's my five-step routine to make and deliver your data presentation right where it is intended —. 1. Understand Your Data & Make It Seen. Data slides aren't really about data; they're about the meaning of that data. As data professionals, everyone approaches data differently.

  12. The Library: Research Skills: Analysing and Presenting Data

    Overview. Data analysis is an ongoing process that should occur throughout your research project. Suitable data-analysis methods must be selected when you write your research proposal. The nature of your data (i.e. quantitative or qualitative) will be influenced by your research design and purpose. The data will also influence the analysis ...

  13. Data Collection, Presentation and Analysis

    Abstract. This chapter covers the topics of data collection, data presentation and data analysis. It gives attention to data collection for studies based on experiments, on data derived from existing published or unpublished data sets, on observation, on simulation and digital twins, on surveys, on interviews and on focus group discussions.

  14. Data analysis

    data analysis, the process of systematically collecting, cleaning, transforming, describing, modeling, and interpreting data, generally employing statistical techniques. Data analysis is an important part of both scientific research and business, where demand has grown in recent years for data-driven decision making.Data analysis techniques are used to gain useful insights from datasets, which ...

  15. Presenting the Results of Qualitative Analysis

    There are a variety of other concerns and decision points that qualitative researchers must keep in mind, including the extent to which to include quantification in their presentation of results, ethics, considerations of audience and voice, and how to bring the richness of qualitative data to life. Quantification, as you have learned, refers ...

  16. What Is Data Presentation? (Definition, Types And How-To)

    What Is Data Presentation? Data presentation is a process of comparing two or more data sets with visual aids, such as graphs. Using a graph, you can represent how the information relates to other data. This process follows data analysis and helps organise information by visualising and putting it into a more readable format.

  17. A Checklist for Delivering Effective Presentations with Data

    1. Did the team select an interesting story to tell with the data as it related to the topic and audience? A story should have a clear beginning, middle, and end. Questions are useful to guide the audience with answers as are takeaways that drive the narrative from introduction to conclusion. 2.

  18. Data Presentation

    Data Analysis and Data Presentation have a practical implementation in every possible field. It can range from academic studies, commercial, industrial and marketing activities to professional practices. In its raw form, data can be extremely complicated to decipher and in order to extract meaningful insights from the data, data analysis is an important step towards breaking down data into ...

  19. How to Present Your Data Analysis Effectively

    6 Practice your delivery. The final step to present your data analysis is to practice your delivery. Whether you are delivering your data analysis verbally, in writing, or in another form, you ...

  20. Data Analysis 101: How to Make Your Presentations Practical and

    What data analysis entails. Data analysis is an analytical process which involves recording and tabulating (recording and entering, entering and tabulating) the quantities of a product, such as numbers of units produced, costs of materials and expenses. While data analyst can take different forms, for example in databases, in other structures ...

  21. Chapter 20. Presentations

    The argument is the point of the research, and if you do not have one, 99 percent of the time, you are not finished with your analysis. Calarco ( 2020 ) suggests you imagine a pyramid, with all of your data forming the basis and all of your findings forming the middle section; the top/point of the pyramid is your argument, "what the patterns ...

  22. PDF CHAPTER 4 Data analysis and presentation

    4.1 INTRODUCTION. This chapter presents themes and categories that emerged from the data, including the defining attributes, antecedents and consequences of the concept, and the different cases that illuminate the concept critical thinking. The data are presented from the most general (themes) to the most specific (data units/chunks).

  23. Data analysis and presentation

    Data analysis is the process of developing answers to questions through the examination and interpretation of data. The basic steps in the analytic process consist of identifying issues, determining the availability of suitable data, deciding on which methods are appropriate for answering the questions of interest, applying the methods and ...

  24. What the data says about abortion in the U.S.

    The CDC data that is highlighted in this post comes from the agency's "abortion surveillance" reports, which have been published annually since 1974 (and which have included data from 1969). Its figures from 1973 through 1996 include data from all 50 states, the District of Columbia and New York City - 52 "reporting areas" in all.