
What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Other interesting articles
  • Frequently asked questions about critical thinking

Why is critical thinking important?

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.

Critical thinking examples

Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

Example: Good critical thinking in an academic context

You're reviewing a study that reports positive results for a new pharmaceutical treatment. However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context

You're researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

Example: Good critical thinking in a nonacademic context

You're shopping for a home alarm system and read an online review article that strongly recommends one particular system. However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context

You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

How to think critically

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

Other interesting articles

If you want to know more about ChatGPT, AI tools, citation, and plagiarism, make sure to check out some of our other articles with explanations and examples.

  • ChatGPT vs human editor
  • ChatGPT citations
  • Is ChatGPT trustworthy?
  • Using ChatGPT for your studies
  • What is ChatGPT?
  • Chicago style
  • Paraphrasing

Plagiarism

  • Types of plagiarism
  • Self-plagiarism
  • Avoiding plagiarism
  • Academic integrity
  • Consequences of plagiarism
  • Common knowledge

Frequently asked questions about critical thinking

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to recollect information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved April 15, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/



What influences students’ abilities to critically evaluate scientific investigations?

Ashley B. Heim*, Cole Walsh, David Esparza, Michelle K. Smith, N. G. Holmes

* Corresponding author. E-mail: [email protected]

Affiliations: Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America; Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America

  • Published: August 30, 2022
  • https://doi.org/10.1371/journal.pone.0273337

Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of if the individual evaluation questions are included. This research offers new insight on the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.

Citation: Heim AB, Walsh C, Esparza D, Smith MK, Holmes NG (2022) What influences students’ abilities to critically evaluate scientific investigations? PLoS ONE 17(8): e0273337. https://doi.org/10.1371/journal.pone.0273337

Editor: Dragan Pamucar, University of Belgrade Faculty of Organisational Sciences: Univerzitet u Beogradu Fakultet organizacionih nauka, SERBIA

Received: December 3, 2021; Accepted: August 6, 2022; Published: August 30, 2022

Copyright: © 2022 Heim et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All raw data files are available from the Cornell Institute for Social and Economic Research (CISER) data and reproduction archive (https://archive.ciser.cornell.edu/studies/2881).

Funding: This work was supported by the National Science Foundation (nsf.gov) under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

Introduction

Critical thinking and its importance.

Critical thinking, defined here as “the ways in which one uses data and evidence to make decisions about what to trust and what to do” [1], is a foundational learning goal for almost any undergraduate course and can be integrated at many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible [2, 3]. Furthermore, critical thinking is consistently ranked by employers as one of the most necessary outcomes of post-secondary education for career advancement [4]. In the workplace, those with critical thinking skills are more competitive because employers assume they can make evidence-based decisions based on multiple perspectives, keep an open mind, and acknowledge personal limitations [5, 6]. Despite the importance of critical thinking skills, there are mixed recommendations on how to elicit and assess critical thinking during and as a result of instruction. In response, here we evaluate the degree to which different critical thinking questions elicit students’ critical thinking skills.

Assessing critical thinking in STEM

Across STEM (i.e., science, technology, engineering, and mathematics) disciplines, several standardized assessments probe critical thinking skills. These assessments focus on aspects of critical thinking and ask students to evaluate experimental methods [7–11], form hypotheses and make predictions [12, 13], evaluate data [2, 12–14], or draw conclusions based on a scenario or figure [2, 12–14]. Many of these assessments are open-response, so they can be difficult to score, and several are not freely available.

In addition, there is an ongoing debate regarding whether critical thinking is a domain-general or context-specific skill. That is, can someone transfer their critical thinking skills from one domain or context to another (domain-general), or do their critical thinking skills only apply in their domain or context of expertise (context-specific)? Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to the lack of a consensus definition of and assessment tools for critical thinking [15, 16]. Some argue that critical thinking is domain-general—or what Ennis refers to as the “general approach”—because it is an overlapping skill that people use in various aspects of their lives [17]. In contrast, others argue that critical thinking must be elicited in a context-specific domain, as prior knowledge is needed to make informed decisions in one’s discipline [18, 19]. Current assessments include domain-general components [2, 7, 8, 14, 20, 21], asking students to evaluate, for instance, experiments on the effectiveness of dietary supplements in athletes [20], and context-specific components, such as those that measure students’ abilities to think critically in domains such as neuroscience [9] and biology [10].

Others maintain the view that critical thinking is a context-specific skill for the purpose of undergraduate education, but argue that it should be content accessible [22–24], as “thought processes are intertwined with what is being thought about” [23]. From this viewpoint, the assessment would need to be embedded in a relatively accessible context to assess critical thinking independent of students’ content knowledge. Thus, to effectively elicit critical thinking among students, instructors should use assessments that present students with accessible domain-specific information needed to think deeply about the questions being asked [24, 25].

Within the context of STEM, current critical thinking assessments primarily ask students to evaluate a single experimental scenario (e.g., [10, 20]), though compare-and-contrast questions about more than one scenario can be a powerful way to elicit critical thinking [26, 27]. Generally included in the “Analysis” level of Bloom’s taxonomy [28–30], compare-and-contrast questions encourage students to recognize, distinguish between, and relate features between scenarios and discern relevant patterns or trends, rather than compile lists of important features [26]. For example, a compare-and-contrast assessment may ask students to compare the hypotheses and research methods used in two different experimental scenarios, instead of having them evaluate the research methods of a single experiment. Alternatively, students may inherently recall and use experimental scenarios based on their prior experiences and knowledge as they evaluate an individual scenario. In addition, evaluating a single experimental scenario individually may act as metacognitive scaffolding [31, 32]—a process which “guides students by asking questions about the task or suggesting relevant domain-independent strategies” [32]—to support students in their compare-and-contrast thinking.

Purpose and research questions

The primary objective of this study was to better understand what features of assessment questions elicit student critical thinking, using two existing instruments in STEM: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and the Physics Lab Inventory of Critical Thinking (PLIC). We focused on biology and physics since critical thinking assessments were already available for these disciplines. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Providing undergraduates with ample opportunities to practice critical thinking skills in the classroom is necessary for evidence-based critical thinking in their future careers and everyday life. While most critical thinking instruments in biology and physics contexts have undergone some form of validation to ensure they are accurately measuring the intended construct, to our knowledge none have explored how different question types influence students’ critical thinking. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to measure cognitive student outcomes and incorporate more effective critical thinking opportunities in the classroom.

Materials and methods

Ethics statement

The procedures for this study were approved by the Institutional Review Board of Cornell University (Eco-BLIC: #1904008779; PLIC: #1608006532). Informed consent was obtained from all participating students via online consent forms at the beginning of the study, and students did not receive compensation for participating in this study unless their instructor offered credit for completing the assessment.

Participants and assessment distribution

We administered the Eco-BLIC to undergraduate students across 26 courses at 11 institutions (six doctoral-granting, three Master’s-granting, and two Baccalaureate-granting) in Fall 2020 and Spring 2021 and received 1612 usable responses. Additionally, we administered the PLIC to undergraduate students across 21 courses at 11 institutions (six doctoral-granting, one Master’s-granting, three four-year colleges, and one 2-year college) in Fall 2020 and Spring 2021 and received 1839 usable responses. We recruited participants via convenience sampling by emailing instructors of primarily introductory ecology-focused courses or introductory physics courses who expressed potential interest in implementing our instrument in their course(s). Both instruments were administered online via Qualtrics, and students were allowed to complete the assessments outside of class. The demographic distribution of the response data, all of which was self-reported by students, is presented in Table 1; the values in this table represent all responses we received.

[Table 1 (student demographics): https://doi.org/10.1371/journal.pone.0273337.t001]

Instrument description

Question types.

Though the content and concepts featured in the Eco-BLIC and PLIC are distinct, both instruments share a similar structure and set of question types. The Eco-BLIC—which was developed using a structure similar to that of the PLIC [1]—includes two predator-prey scenarios based on relationships between (a) smallmouth bass and mayflies and (b) great-horned owls and house mice. Within each scenario, students are presented with a field-based study and a laboratory-based study focused on a common research question about feeding behaviors of smallmouth bass or house mice, respectively. The prompts for these two Eco-BLIC scenarios are available in S1 and S2 Appendices. The PLIC focuses on two research groups conducting different experiments to test the relationship between oscillation periods of masses hanging on springs [1]; the prompts for this scenario can be found in S3 Appendix. The descriptive prompts in both the Eco-BLIC and PLIC also include a figure presenting data collected by each research group, from which students are expected to draw conclusions. The research scenarios (e.g., field-based group and lab-based group on the Eco-BLIC) are written so that each group has both strengths and weaknesses in their experimental designs.
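For readers unfamiliar with the PLIC context, the relationship the two hypothetical research groups are probing is the standard textbook result for a mass oscillating on a light spring (general physics background, not a detail taken from the instrument itself), where T is the oscillation period, m the hanging mass, and k the spring constant:

```latex
% Idealized oscillation period T of a mass m on a spring with constant k
T = 2\pi\sqrt{\frac{m}{k}}
```

Real set-ups deviate from this ideal (e.g., the spring's own mass is not negligible), which is one reason two experimental designs probing the same relationship can differ in their strengths and weaknesses.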

After reading the prompt for the first experimental group (Group 1) in each instrument, students are asked to identify possible claims from Group 1’s data (data evaluation questions). Students next evaluate the strengths and weaknesses of various study features for Group 1 (individual evaluation questions); examples of these individual evaluation questions are in Table 2. They then suggest next steps the group should pursue (next steps items). Students are then asked to read the prompt describing the second experimental group’s study (Group 2) and again answer questions about the possible claims, strengths and weaknesses, and next steps of Group 2’s study (data evaluation questions, individual evaluation questions, and next steps items). Once students have independently evaluated Groups 1 and 2, they answer a series of questions to compare the study approaches of Group 1 versus Group 2 (group comparison items). In this study, we focus our analysis on the individual evaluation questions and group comparison items.

[Table 2 (example individual evaluation questions): https://doi.org/10.1371/journal.pone.0273337.t002]

Instrument versions.

To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.

Think-aloud interviews.

We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen-captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [1], though these data are not discussed here.

Data analyses.

Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1. For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1).

Fig 1. The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.

https://doi.org/10.1371/journal.pone.0273337.g001
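To make the expected mapping concrete, here is a minimal sketch of the consistency check idealized in Fig 1. The cutoff (treating ratings of 3–4 as a strength) and the function itself are our own illustrative assumptions, not part of the instruments:

```python
# Hypothetical encoding of the Fig 1 logic: individual ratings run from
# 1 (weakness) to 4 (strength); if both question types elicited the same
# evaluation, a pair of individual ratings would imply the comparison answer.
def expected_comparison(rating_g1: int, rating_g2: int) -> str:
    """Map a pair of individual ratings to the group comparison answer we
    would expect if students evaluated consistently across question types."""
    g1_strength = rating_g1 >= 3  # assumed cutoff: 3-4 counts as a strength
    g2_strength = rating_g2 >= 3
    if g1_strength and g2_strength:
        return "both groups highly effective"  # purple, top right of Fig 1
    if g1_strength:
        return "Group 1 more effective"        # green, lower right of Fig 1
    if g2_strength:
        return "Group 2 more effective"        # by symmetry, upper left
    return "neither group effective"           # by symmetry, lower left

print(expected_comparison(4, 2))  # -> Group 1 more effective
```

The study's finding, in these terms, is that students' actual comparison answers often diverge from what this mapping would predict from their individual ratings.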

We ran descriptive statistics to summarize student responses and examine distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between versions within the relevant questions from the same instrument. In all of these tests, we used a Bonferroni correction to reduce the chance of false positives and account for multiple comparisons. We generated figures—primarily multi-pie-chart graphs and heat maps—to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All analyses were conducted, and figures generated, in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.
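As a concrete illustration of this version comparison, here is a minimal Python sketch (the paper used R; the counts, answer labels, and the choice of one version as the reference distribution are our own assumptions, not the study's data):

```python
# Test whether the answer distribution on one group comparison item differs
# between the two instrument versions, with a Bonferroni-adjusted threshold.
import numpy as np
from scipy.stats import chisquare

# Hypothetical answer counts per option:
# [Group 1 more effective, Group 2 more effective, both effective, neither]
version_with_individual = np.array([120, 80, 150, 50])
version_comparison_only = np.array([95, 70, 130, 45])

# Treat the with-individual version as the reference distribution,
# rescaled so expected counts sum to the other version's total.
expected = (version_with_individual / version_with_individual.sum()
            * version_comparison_only.sum())
stat, p = chisquare(f_obs=version_comparison_only, f_exp=expected)

n_items = 53            # e.g., the number of PLIC items tested (per the text)
alpha = 0.05 / n_items  # Bonferroni-corrected significance threshold
print(f"chi2 = {stat:.2f}, p = {p:.3f}, significant: {p < alpha}")
```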

Results

We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC in two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2) and, subsequently, answered a series of questions to compare and contrast the study approaches of both research groups side-by-side (group comparison items, Table 2). Through analyzing the individual evaluation questions, we found that students generally ranked experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths (Fig 2), evidenced by mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).

Fig 2. Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.

https://doi.org/10.1371/journal.pone.0273337.g002

Individual evaluation versus compare-and-contrast evaluation

Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2), shown by the large circles in the upper right quadrant across the three experimental scenarios (Fig 3). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features (Fig 3). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.

Fig 3. The x- and y-axes represent students’ rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students’ responses who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

https://doi.org/10.1371/journal.pone.0273337.g003

These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”

Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.

Individual evaluation questions to support compare-and-contrast thinking

Given that students were more discerning when they directly compared two groups for both biology and physics experimental scenarios, we next sought to determine whether the individual evaluation questions for Group 1 or Group 2 were necessary to elicit, or helpful to support, student critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version only saw the group comparison items. We found that students assigned to both versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions as heat maps in Fig 4.

Fig 4. The x-axis denotes students’ responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups were minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

https://doi.org/10.1371/journal.pone.0273337.g004

We ran chi-square goodness-of-fit tests comparing student responses across the two instrument versions and found no significant differences on the Eco-BLIC bass-mayfly scenario (Fig 4A; based on an adjusted p-value of 0.006) or owl-mouse questions (Fig 4B; based on an adjusted p-value of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on both versions of the PLIC (Fig 4C; based on an adjusted p-value of 0.0005). The items that students responded to differently (p < 0.0005) across both versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). We calculated Cramér's C (Vc; [33]), a measure commonly applied to chi-square goodness-of-fit models, to understand the magnitude of significant results. We found that the effect sizes for these three items were small (Vc = 0.11, Vc = 0.10, and Vc = 0.06, respectively).
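For reference, this effect size can be computed as sketched below, using the common goodness-of-fit form V = sqrt(chi2 / (n(k − 1))) (Cramér [33]); the numbers here are made up rather than the study's actual counts:

```python
# Effect size for a chi-square goodness-of-fit test, in the spirit of the
# Cramer's C/V values reported above (illustrative numbers only).
import math

def cramers_v_gof(chi2: float, n: int, k: int) -> float:
    """Cramer's V for a chi-square goodness-of-fit test over k categories
    with n total observations: V = sqrt(chi2 / (n * (k - 1)))."""
    return math.sqrt(chi2 / (n * (k - 1)))

# e.g., a hypothetical item: chi2 = 22.0 from n = 900 responses, k = 4 options
print(round(cramers_v_gof(22.0, 900, 4), 2))  # ~0.09 -> a small effect
```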

The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.

We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.

Discussion

To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Students are more discerning when making comparisons

We found that students were more discerning when comparing between the two groups in the Eco-BLIC and PLIC than when evaluating each group individually. While students tended to independently evaluate study features of each group as strengths (Fig 2), there was greater variation in their responses to which group was more effective when directly comparing between the two groups (Fig 3). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting between two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [34–37]. When presented with a single example, students may deem certain study features as unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [38]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side, rather than in isolation [39, 40]. This result is somewhat surprising, however, as students could have used their knowledge of experimental designs as a contrasting case when evaluating each group. Future work, therefore, should evaluate whether experts use their vast knowledge base of experimental studies as discerning contrasts when evaluating each group individually. This work would help determine whether our results suggest that students do not have a sufficient experiment base to use as contrasts or that students simply do not use their experiment base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than just evaluate individual cases.

Individual evaluation questions do not influence answers to compare and contrast questions

We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [38], further supporting our findings about the effectiveness of the group comparison questions.

Practical implications

Individual evaluation questions were not effective for engaging students in critical thinking, nor for preparing them for subsequent questions that elicit their critical thinking. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by encouraging comparisons between cases. Additionally, the study raises the question of whether instructors should incorporate more experimental case studies throughout their courses and assessments so that students have a richer experiment base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors consider providing them with multiple experimental studies (i.e., cases) and asking them to compare and contrast between these studies.

Future directions and limitations

When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students’ critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [38]. For example, though sample size differs between experimental scenarios in our instruments, it is a significant feature that has implications for other aspects of the research, such as statistical analyses and behaviors of the animals. Therefore, one limitation of our study could be that we exclusively focused on experimental method evaluation questions (i.e., what to trust), and we are unsure if the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.

As our question schema in the Eco-BLIC and PLIC were designed for introductory biology and physics content, it is unknown how effective this question schema would be for upper-division biology and physics undergraduates, whom we would expect to have more content knowledge and prior experiences for making comparisons in their respective disciplines [18, 41]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses in the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to inherently use as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to better align. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors offer students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?

While a single consensus definition of critical thinking does not currently exist [15], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [22, 23]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) by designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge across STEM, would we observe similar patterns in critical thinking in additional STEM disciplines?

Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.

Conclusions

Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking in introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently when individual evaluation questions are provided or removed. These novel findings hold true across both introductory biology and physics, based on student responses on the Eco-BLIC and PLIC, respectively—though there is much more to explore regarding critical thinking processes of students across other STEM disciplines and in more advanced stages of their education. Undergraduate students in STEM need to be able to critically think for career advancement, and the Eco-BLIC and PLIC are two means of measuring students’ critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to teach and measure cognitive student outcomes. Specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates’ critical thinking.

Supporting information

S1 Appendix. Eco-BLIC bass-mayfly scenario prompt.

https://doi.org/10.1371/journal.pone.0273337.s001

S2 Appendix. Eco-BLIC owl-mouse scenario prompt.

https://doi.org/10.1371/journal.pone.0273337.s002

S3 Appendix. PLIC scenario prompt.

https://doi.org/10.1371/journal.pone.0273337.s003

Acknowledgments

We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.

References

  • 2. Stein B, Haynes A, Redding M, Ennis T, Cecil M. Assessing critical thinking in STEM and beyond. In: Innovations in e-learning, instruction technology, assessment, and engineering education. Dordrecht, Netherlands: Springer; 2007. pp. 79–82.
  • 19. Carmichael M, Reid A, Karpicke JD. Assessing the impact of educational video on student engagement, critical thinking and learning. Sage Publishing; 2018. Retrieved from: https://au.sagepub.com/en-gb/oce/press/what-impact-does-videohave-on-student-engagement
  • 26. Krishna Rao MR. Infusing critical thinking skills into content of AI course. In: Proceedings of the 10th annual SIGCSE conference on Innovation and technology in computer science education; 2005 Jun 27. pp. 173–177.
  • 28. Bloom BS. Taxonomy of educational objectives. Vol. 1: Cognitive domain. New York, NY: McKay; 1956.
  • 33. Cramér H. Mathematical methods of statistics. Princeton, NJ: Princeton University Press; 1946.
  • 38. Schwartz DL, Tsang JM, Blair KP. The ABCs of how we learn: 26 scientifically proven approaches, how they work, and when to use them. New York, NY: WW Norton & Company; 2016.
  • 41. Szenes E, Tilakaratna N, Maton K. The knowledge practices of critical thinking. In: The Palgrave handbook of critical thinking in higher education. New York, NY: Palgrave Macmillan; 2015. pp. 573–579.


The Art and Science of Critical Thinking in Research: A Guide to Academic Excellence

Dr. Sowndarya Somasundaram


Table of contents

  • Why Is Critical Thinking Important in Research?
  • How to Develop Critical Thinking Skills in Research
  • The Art and Science of Critical Thinking in Research
  • Critical Questions in Research

Critical thinking is a fundamental skill in research and academia that involves analyzing, evaluating, and interpreting information in a systematic and logical manner. It is the process of objectively evaluating evidence, arguments, and ideas to arrive at well-reasoned conclusions or make informed decisions.

The art and science of critical thinking in research is a multifaceted and dynamic process that requires intellectual rigor, creativity, and an open mind.

In research, critical thinking is essential for developing research questions, designing research studies, collecting and analyzing data, and interpreting research findings. It allows researchers to evaluate the quality and validity of research studies, identify gaps in the literature, and make evidence-based decisions.

Critical thinking in research also involves being open to alternative viewpoints and being willing to revise one’s own conclusions based on new evidence. It requires intellectual humility and a willingness to challenge one’s own assumptions and biases.

Why Is Critical Thinking Important in Research?

Critical thinking is important in research for the following reasons:

Rigor and accuracy

It helps researchers to approach their work with rigor and accuracy, ensuring that the research methods and findings are reliable and valid.

Evaluation of evidence

Critical thinking helps researchers to evaluate the evidence they encounter and determine its relevance and reliability to the research question or hypothesis.

Identification of biases and assumptions

Critical thinking helps researchers to identify their own biases and assumptions and those of others, which can influence the research process and findings.

Problem-solving

It helps researchers to identify and solve problems that may arise during the research process, such as inconsistencies in data or unexpected results.

Development of new ideas

Critical thinking can help researchers develop new ideas and theories based on their analysis of the evidence.

Communication

Critical thinking helps researchers to communicate their findings and ideas in a clear and logical manner, making it easier for others to understand and build on their work.

Therefore, critical thinking is essential for conducting rigorous and impactful research that can advance our understanding of the world around us.

It helps researchers to approach their work with a critical and objective perspective, evaluating evidence and developing insights that can contribute to the advancement of knowledge in their field.

How to Develop Critical Thinking Skills in Research

Developing critical thinking skills in research requires a specific set of strategies. Here are some ways to develop critical thinking skills in research:

Evaluate the credibility of sources

In research, it is important to evaluate the credibility of sources to determine if the information is reliable and valid. To develop your critical thinking skills, practice evaluating the sources you encounter and assessing their credibility.

Assess the quality of evidence

Critical thinking in research involves assessing the quality of evidence and determining if it supports the research question or hypothesis. Practice evaluating the quality of evidence and understanding how it impacts the research findings.

Consider alternative explanations

To develop critical thinking skills in research, practice considering alternative explanations for the findings. Evaluate the evidence and consider if there are other explanations that could account for the results.

Challenge assumptions

Critical thinking in research involves challenging assumptions and exploring alternative perspectives. Practice questioning assumptions and considering different viewpoints to develop your critical thinking skills.

Seek out feedback

Seek out feedback from colleagues, advisors, or peers on your research methods and findings. This can help you identify areas where you need to improve your critical thinking skills and provide valuable insights for your research.

Practice analyzing data

Critical thinking in research involves analyzing and interpreting data. Practice analyzing different types of data to develop your critical thinking skills.

Attend conferences and seminars

Attend conferences and seminars in your field to learn about the latest research and to engage in critical discussions with other researchers. This can help you develop your critical thinking skills and keep up to date with the latest research in your field.

By consistently practicing these strategies, you can develop your critical thinking skills in research and become a more effective and insightful researcher.

The Art and Science of Critical Thinking in Research

Critical thinking in research is both an art and a science, and it is vital for academic excellence. Here is a guide to applying it:

The first step in critical thinking is to define the research problem or question. This involves identifying the key concepts, understanding the context, and formulating a clear and concise research question or hypothesis. A clearly defined question focuses your thinking and helps you avoid unnecessary distractions.

A thorough review of relevant literature is essential in critical thinking. It helps you understand the existing knowledge and research in the field, identify research gaps, and evaluate the quality and reliability of the evidence. It also allows you to identify different perspectives and theories related to the research problem.

Critical thinking requires careful evaluation of evidence and sources. This includes assessing the credibility, reliability, and validity of research studies, data sources, and information. It also involves identifying potential biases, limitations, and assumptions in the evidence and sources. Use reputable, peer-reviewed sources and critically analyze the evidence and arguments presented in those sources.

Critical thinking involves analyzing and synthesizing information from various sources. This includes identifying patterns, trends, and relationships among different pieces of information. It also requires organizing and integrating information to develop a coherent and logical argument.

Challenge your assumptions and biases. Be aware of your own biases and preconceived notions, and critically examine them to avoid potential bias in your research.

Critical thinking involves evaluating the strength and validity of arguments and reasoning. This includes identifying logical fallacies, evaluating the coherence and consistency of arguments, and assessing the evidence and support for arguments. It also involves considering alternative viewpoints and perspectives.

Apply critical thinking tools

Use critical thinking tools such as SWOT analysis (Strengths, Weaknesses, Opportunities, Threats), mind maps, concept maps, and flowcharts to organize and analyze information in a structured and systematic manner.

Critical thinking is also essential in research design and methodology. This includes making informed decisions about research approaches, sampling methods, data collection, and data analysis techniques, as well as anticipating potential limitations and biases in the design.

Avoid tunnel vision by considering multiple perspectives and viewpoints on the issue at hand. This will help you gain a more comprehensive understanding of the topic and make informed decisions based on a broader range of information.

Critical Questions in Research

Some sample critical questions in research are listed below.

1. What is the research question, and is it clearly defined?

2. What are the assumptions underlying the research question?

3. What methodology is being used, and is it appropriate for the research question?

4. What are the limitations of the study, and how might they affect the results?

5. How representative is the sample being studied, and are there any biases in the selection process?

6. What are the potential sources of error or bias in the data collection process?

7. Are the statistical analyses used appropriate, and do they support the conclusions drawn from the data?

8. What are the implications of the research findings, and do they have practical significance?

9. Are there any ethical considerations that arise from the research, and have they been adequately addressed?

10. Are there any alternative explanations for the results, and have they been considered and ruled out?

Critical thinking requires effective communication skills to articulate and present research findings and arguments clearly and convincingly.

This includes writing clearly and concisely, using appropriate evidence and examples, and presenting information in a logical and organized manner. It also involves listening and responding critically to feedback and engaging in constructive discussions and debates.

Critical thinking involves self-reflection and self-awareness. Reflect on your own thinking and decision-making process throughout the research, regularly evaluating your biases, assumptions, and limitations. Be mindful of emotions and personal beliefs that may influence your critical thinking and decision-making.

Critical thinking involves being open to new ideas, perspectives, and approaches. It requires creativity in generating and evaluating alternative solutions or interpretations.

It also involves being willing to revise your conclusions or change your research direction based on new information. Avoid confirmation bias and strive for objectivity in your research.

Critical thinking benefits from feedback and peer review. Seeking feedback from mentors, colleagues, or peer reviewers can help identify potential flaws or weaknesses in your research or arguments. Engaging in peer review also provides an opportunity to critically evaluate the work of others and learn from their perspectives.

By following these best practices and techniques, you can cultivate critical thinking skills that will enhance the quality and rigor of your research, leading to more successful outcomes.

Critical thinking is an essential component of research that enables researchers to evaluate information, identify biases, and draw valid conclusions.

It involves defining research problems, conducting literature reviews, evaluating evidence and sources, analyzing and synthesizing information, evaluating arguments and reasoning, applying critical thinking in research design and methodology, communicating effectively, embracing creativity and open-mindedness, practicing self-reflection, seeking feedback, and engaging in peer review.

By cultivating and applying critical thinking skills in research, you can enhance the quality and rigor of your work and contribute to the advancement of knowledge in your field.

Remember to continuously practice and refine your critical thinking skills as they are valuable not only in research but also in various aspects of life. Happy researching!


Dr. Sowndarya Somasundaram


Thinking critically on critical thinking: why scientists’ skills need to spread


Rachel Grieve, Lecturer in Psychology, University of Tasmania

Disclosure statement

Rachel Grieve does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

University of Tasmania provides funding as a member of The Conversation AU.


MATHS AND SCIENCE EDUCATION: We’ve asked our authors about the state of maths and science education in Australia and its future direction. Today, Rachel Grieve discusses why we need to spread science-specific skills into the wider curriculum.

When we think of science and maths, stereotypical visions of lab coats, test-tubes, and formulae often spring to mind.

But more important than these stereotypes are the methods that underpin the work scientists do – namely generating and systematically testing hypotheses. A key part of this is critical thinking.

It’s a skill that often feels in short supply these days, but you don’t necessarily need to study science or maths in order to gain it. It’s time to take critical thinking out of the realm of maths and science and broaden it into students’ general education.

What is critical thinking?

Critical thinking is a reflective and analytical style of thinking, with its basis in logic, rationality, and synthesis. It means delving deeper and asking questions like: why is that so? Where is the evidence? How good is that evidence? Is this a good argument? Is it biased? Is it verifiable? What are the alternative explanations?

Critical thinking moves us beyond mere description and into the realms of scientific inference and reasoning. This is what enables discoveries to be made and innovations to be fostered.

For many scientists, critical thinking becomes (seemingly) intuitive, but like any skill set, critical thinking needs to be taught and cultivated. Unfortunately, educators are unable to deposit this information directly into their students’ heads. While the theory of critical thinking can be taught, critical thinking itself needs to be experienced first-hand.

So what does this mean for educators trying to incorporate critical thinking within their curricula? We can teach students the theoretical elements of critical thinking. Take, for example, working through [statistical problems](http://wdeneys.org/data/COGNIT_1695.pdf) like this one:

In a 1,000-person study, four people said their favourite series was Star Trek and 996 said Days of Our Lives. Jeremy is a randomly chosen participant in this study, is 26, and is doing graduate studies in physics. He stays at home most of the time and likes to play videogames. What is most likely?

a. Jeremy’s favourite series is Star Trek
b. Jeremy’s favourite series is Days of Our Lives

Some critical thought applied to this problem allows us to know that Jeremy is most likely to prefer Days of Our Lives.
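To make that reasoning explicit, the problem can be worked through with Bayes’ theorem. The base rates come from the problem itself; the two likelihoods below are illustrative assumptions of my own, chosen deliberately to favour the stereotype (suppose Jeremy’s profile is ten times more likely among Star Trek fans than among Days of Our Lives fans):

$$P(\text{Star Trek} \mid \text{profile}) = \frac{0.5 \times 0.004}{0.5 \times 0.004 + 0.05 \times 0.996} = \frac{0.002}{0.0518} \approx 0.04$$

Even on those generous assumptions, there is only about a 4% chance that Jeremy prefers Star Trek: the base rate of 4 in 1,000 dominates the stereotype-consistent description.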

Can you teach it?

It’s well established that statistical training is associated with improved decision-making. But the idea of “teaching” critical thinking is itself an oxymoron: critical thinking can really only be learned through practice. Thus, it is not surprising that engagement with the critical thinking process itself is what pays dividends for students.

As such, educators try to connect students with the subject matter outside the lecture theatre or classroom. For example, problem-based learning is now widely used in the health sciences, whereby students must figure out the key issues related to a case and direct their own learning to solve that problem. Problem-based learning has clear parallels with real-life practice for health professionals.

Critical thinking goes beyond what might be on the final exam, and lifelong learning becomes the key. This is a good thing, as practice helps to improve our ability to think critically over time.

Just for scientists?

For those engaging with science, learning the skills needed to be a critical consumer of information is invaluable. But should these skills remain in the domain of scientists? Clearly not: for those engaging with life, being a critical consumer of information is also invaluable, allowing informed judgement.

Being able to actively consider and evaluate information, identify biases, examine the logic of arguments, and tolerate ambiguity until the evidence is in would allow many people from all backgrounds to make better decisions. While these decisions can be trivial (does that miracle anti-wrinkle cream really do what it claims?), in many cases reasoning and decision-making can have a substantial impact, with some decisions having life-altering effects. A timely case-in-point is immunisation.

Pushing critical thinking from the realms of science and maths into the broader curriculum may lead to far-reaching outcomes. With increasing access to information on the internet, giving individuals the skills to critically think about that information may have widespread benefit, both personally and socially.

The value of science education might not always be in the facts, but in the thinking.

This is the sixth part of our series, Maths and Science Education.



Warren Berger

A Crash Course in Critical Thinking

What you need to know—and read—about one of the essential skills needed today.

Posted April 8, 2024 | Reviewed by Michelle Quirk

  • In research for "A More Beautiful Question," I did a deep dive into the current crisis in critical thinking.
  • Many people may think of themselves as critical thinkers, but they actually are not.
  • Here is a series of questions you can ask yourself to try to ensure that you are thinking critically.

Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion about who and what to believe.

These are some of the hallmarks of the current crisis in critical thinking—which just might be the issue of our times. Because if people aren’t willing or able to think critically as they choose potential leaders, they’re apt to choose bad ones. And if they can’t judge whether the information they’re receiving is sound, they may follow faulty advice while ignoring recommendations that are science-based and solid (and perhaps life-saving).

Moreover, as a society, if we can’t think critically about the many serious challenges we face, it becomes more difficult to agree on what those challenges are—much less solve them.

On a personal level, critical thinking can enable you to make better everyday decisions. It can help you make sense of an increasingly complex and confusing world.

In the new expanded edition of my book A More Beautiful Question (AMBQ), I took a deep dive into critical thinking. Here are a few key things I learned.

First off, before you can get better at critical thinking, you should understand what it is. It’s not just about being a skeptic. When thinking critically, we are thoughtfully reasoning, evaluating, and making decisions based on evidence and logic. And—perhaps most important—while doing this, a critical thinker always strives to be open-minded and fair-minded. That’s not easy: It demands that you constantly question your assumptions and biases and that you always remain open to considering opposing views.

In today’s polarized environment, many people think of themselves as critical thinkers simply because they ask skeptical questions—often directed at, say, certain government policies or ideas espoused by those on the “other side” of the political divide. The problem is, they may not be asking these questions with an open mind or a willingness to fairly consider opposing views.

When people do this, they’re engaging in “weak-sense critical thinking”—a term popularized by the late Richard Paul, a co-founder of The Foundation for Critical Thinking. “Weak-sense critical thinking” means applying the tools and practices of critical thinking—questioning, investigating, evaluating—but with the sole purpose of confirming one’s own bias or serving an agenda.

In AMBQ, I lay out a series of questions you can ask yourself to try to ensure that you’re thinking critically. Here are some of the questions to consider:

  • Why do I believe what I believe?
  • Are my views based on evidence?
  • Have I fairly and thoughtfully considered differing viewpoints?
  • Am I truly open to changing my mind?

Of course, becoming a better critical thinker is not as simple as just asking yourself a few questions. Critical thinking is a habit of mind that must be developed and strengthened over time. In effect, you must train yourself to think in a manner that is more effortful, aware, grounded, and balanced.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question , but beyond that, here are the top eight critical-thinking books I’d recommend.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question, but beyond that, here are the top eight critical-thinking books I’d recommend.

The Demon-Haunted World: Science as a Candle in the Dark, by Carl Sagan

This book simply must top the list, because the late scientist and author Carl Sagan continues to be such a bright shining light in the critical thinking universe. Chapter 12 includes the details on Sagan’s famous “baloney detection kit,” a collection of lessons and tips on how to deal with bogus arguments and logical fallacies.


Clear Thinking: Turning Ordinary Moments Into Extraordinary Results, by Shane Parrish

The creator of the Farnam Street website and host of the “Knowledge Project” podcast explains how to contend with biases and unconscious reactions so you can make better everyday decisions. It contains insights from many of the brilliant thinkers Shane has studied.

Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World, by David Robert Grimes

A brilliant, comprehensive 2021 book on critical thinking that, to my mind, hasn’t received nearly enough attention. The scientist Grimes dissects bad thinking, shows why it persists, and offers the tools to defeat it.

Think Again: The Power of Knowing What You Don’t Know, by Adam Grant

Intellectual humility—being willing to admit that you might be wrong—is what this book is primarily about. But Adam, the renowned Wharton psychology professor and bestselling author, takes the reader on a mind-opening journey with colorful stories and characters.

Think Like a Detective: A Kid’s Guide to Critical Thinking, by David Pakman

The popular YouTuber and podcast host Pakman—normally known for talking politics—has written a terrific primer on critical thinking for children. The illustrated book presents critical thinking as a “superpower” that enables kids to unlock mysteries and dig for truth. (I also recommend Pakman’s second kids’ book called Think Like a Scientist.)

Rationality: What It Is, Why It Seems Scarce, Why It Matters, by Steven Pinker

The Harvard psychology professor Pinker tackles conspiracy theories head-on but also explores concepts involving risk/reward, probability and randomness, and correlation/causation. And if that strikes you as daunting, be assured that Pinker makes it lively and accessible.

How Minds Change: The Surprising Science of Belief, Opinion and Persuasion, by David McRaney

David is a science writer who hosts the popular podcast “You Are Not So Smart” (and his ideas are featured in A More Beautiful Question). His well-written book looks at ways you can actually get through to people who see the world very differently than you (hint: bludgeoning them with facts definitely won’t work).

A Healthy Democracy’s Best Hope: Building the Critical Thinking Habit, by M. Neil Browne and Chelsea Kulhanek

Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the world around us. His newest book, co-authored with Chelsea Kulhanek, breaks down critical thinking into “11 explosive questions”—including the “priors question” (which challenges us to question assumptions), the “evidence question” (focusing on how to evaluate and weigh evidence), and the “humility question” (which reminds us that a critical thinker must be humble enough to consider the possibility of being wrong).


Warren Berger is a longtime journalist and author of A More Beautiful Question.



Critical Thinking, Intelligence, and Unsubstantiated Beliefs: An Integrative Review

Associated Data

This research did not involve collection of original data, and hence there are no new data to make available.

A review of the research shows that critical thinking is a more inclusive construct than intelligence, going beyond what general cognitive ability can account for. For instance, critical thinking can more completely account for many everyday outcomes, such as how thinkers reject false conspiracy theories, paranormal and pseudoscientific claims, psychological misconceptions, and other unsubstantiated claims. Deficiencies in the components of critical thinking (in specific reasoning skills, dispositions, and relevant knowledge) contribute to unsubstantiated belief endorsement in ways that go beyond what standardized intelligence tests test. Specifically, people who endorse unsubstantiated claims less tend to show better critical thinking skills, possess more relevant knowledge, and are more disposed to think critically. They tend to be more scientifically skeptical and possess a more rational–analytic cognitive style, while those who accept unsubstantiated claims more tend to be more cynical and adopt a more intuitive–experiential cognitive style. These findings suggest that for a fuller understanding of unsubstantiated beliefs, researchers and instructors should also assess specific reasoning skills, relevant knowledge, and dispositions which go beyond what intelligence tests test.

1. Introduction

Why do some people believe implausible claims, such as the QAnon conspiracy theory, that a cabal of liberals is kidnapping and trafficking many thousands of children each year, despite the lack of any credible supporting evidence? Are believers less intelligent than non-believers? Do they lack knowledge of such matters? Are they more gullible or less skeptical than non-believers? Or, more generally, are they failing to think critically?

Understanding the factors contributing to acceptance of unsubstantiated claims is important, not only to the development of theories of intelligence and critical thinking but also because many unsubstantiated beliefs are false, and some are even dangerous. Endorsing them can have a negative impact on an individual and society at large. For example, false beliefs about the COVID-19 pandemic, such as believing that 5G cell towers induced the spread of the COVID-19 virus, led some British citizens to set fire to 5G towers (Jolley and Paterson 2020). Other believers in COVID-19 conspiracy theories endangered their own and their children’s lives when they refused to socially distance and be vaccinated with highly effective vaccines, despite the admonitions of scientific experts (Bierwiaczonek et al. 2020). Further endangering the population at large, those who believe the false conspiracy theory that human-caused global warming is a hoax likely fail to respond adaptively to this serious global threat (van der Linden 2015). Parents who uncritically accept pseudoscientific claims, such as the false belief that facilitated communication is an effective treatment for childhood autism, may forego more effective treatments (Lilienfeld 2007). Moreover, people in various parts of the world still persecute other people who they believe are witches possessing supernatural powers. Likewise, many people still believe in demonic possession, which has been associated with mental disorders (Nie and Olson 2016). Compounding the problems created by these various unsubstantiated beliefs, numerous studies now show that when someone accepts one of these types of unfounded claims, they tend to accept others as well; see Bensley et al. (2022) for a review.

Studying the factors that contribute to unfounded beliefs is important not only because of their real-world consequences but also because this can facilitate a better understanding of unfounded beliefs and how they are related to critical thinking and intelligence. This article focuses on important ways in which critical thinking (CT) and intelligence differ, especially in terms of how a comprehensive model of CT differs from the view of intelligence as general cognitive ability. I argue that this model of CT more fully accounts for how people can accurately decide whether a claim is unsubstantiated than can views of intelligence that emphasize general cognitive ability. Thinking critically about unsubstantiated claims involves the deployment of specific reasoning skills, dispositions related to CT, and specific knowledge, which go beyond the contribution of general cognitive ability.

Accordingly, this article begins with an examination of the constructs of critical thinking and intelligence. Then, it discusses theories proposing that understanding thinking in the real world requires going beyond general cognitive ability. Specifically, the focus is on factors related to critical thinking, such as specific reasoning skills, dispositions, metacognition, and relevant knowledge. I review research showing that this alternative multidimensional view of CT can better account for individual differences in the tendency to endorse multiple types of unsubstantiated claims than can general cognitive ability alone.

2. Defining Critical Thinking and Intelligence

Critical thinking is an almost universally valued educational objective in the US and in many other countries which seek to improve it. In contrast, intelligence, although much valued, has often been viewed as a more stable characteristic, less amenable to improvement through specific short-term interventions such as traditional instruction or, more recently, practice on computer-implemented training programs. According to Wechsler’s influential definition, intelligence is a person’s “aggregate or global capacity to act purposefully, to think rationally, and to deal effectively with his environment” (Wechsler 1944, p. 3).

Consistent with this definition, intelligence has long been associated with general cognitive or intellectual ability and the potential to learn and reason well. Intelligence (IQ) tests measure general cognitive abilities, such as knowledge of words, memory skills, analogical reasoning, speed of processing, and the ability to solve verbal and spatial problems. General intelligence or “g” is a composite of these abilities statistically derived from various cognitive subtests on IQ tests which are positively intercorrelated. There is considerable overlap between g and the concept of fluid intelligence (Gf) in the prominent Cattell–Horn–Carroll model ( McGrew 2009 ), which refers to “the ability to solve novel problems, the solution of which does not depend on previously acquired skills and knowledge,” and crystalized intelligence (Gc), which refers to experience, existing skills, and general knowledge ( Conway and Kovacs 2018, pp. 50–51 ). Although g or general intelligence is based on a higher order factor, inclusive of fluid and crystallized intelligence, it is technically not the same as general cognitive ability, a commonly used, related term. However, in this article, I use “general cognitive ability” and “cognitive ability” because they are the imprecise terms frequently used in the research reviewed.

Although IQ scores have been found to predict performance in basic real-world domains, such as academic performance and job success ( Gottfredson 2004 ), an enduring question for intelligence researchers has been whether g and intelligence tests predict the ability to adapt well in other real-world situations, which concerns the second part of Wechsler’s definition. So, in addition to the search for the underlying structure of intelligence, researchers have been perennially concerned with how general abilities associated with intelligence can be applied to help a person adapt to real-world situations. The issue is largely a question of how cognitive ability and intelligence can help people solve real-world problems and cope adaptively and succeed in dealing with various environmental demands ( Sternberg 2019 ).

Based on broad conceptual definitions of intelligence and critical thinking, both intelligence and CT should aid adaptive functioning in the real world, presumably because they both involve rational approaches. Their common association with rationality gives each term a positive connotation. However, complicating the definition of each of these is the fact that rationality also continues to have a variety of meanings. In this article, in agreement with Stanovich et al. ( 2018 ), rationality is defined in the normative sense, used in cognitive science, as the distance between a person’s response and some normative standard of optimal behavior. As such, degree of rationality falls on a continuous scale, not a categorical one.

Despite disagreements surrounding the conceptual definitions of intelligence, critical thinking, and rationality, a commonality in these terms is they are value-laden and normative. In the case of intelligence, people are judged based on norms from standardized intelligence tests, especially in academic settings. Although scores on CT tests seldom are, nor could be, used to judge individuals in this way, the normative and value-laden basis of CT is apparent in people’s informal judgements. They often judge others who have made poor decisions to be irrational or to have failed to think critically.

This value-laden aspect of CT is also apparent in formal definitions of CT. Halpern and Dunn ( 2021 ) defined critical thinking as “the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It is used to describe thinking that is purposeful, reasoned, and goal-directed.” The positive conception of CT as helping a person adapt well to one’s environment is clearly implied in “desirable outcome”.

Robert Ennis ( 1987 ) has offered a simpler, yet useful definition of critical thinking that also has normative implications. According to Ennis, “critical thinking is reasonable, reflective thinking focused on deciding what to believe or do” ( Ennis 1987, p. 102 ). This definition implies that CT helps people know what to believe (a goal of epistemic rationality) and how to act (a goal of instrumental rationality). This is conveyed by associating “critical thinking” with the positive terms, “reasonable” and “reflective”. Dictionaries commonly define “reasonable” as “rational”, “logical”, “intelligent”, and “good”, all terms with positive connotations.

For critical thinkers, being reasonable involves using logical rules, standards of evidence, and other criteria that must be met for a product of thinking to be considered good. Critical thinkers use these to evaluate how strongly reasons or evidence supports one claim versus another, drawing conclusions which are supported by the highest quality evidence ( Bensley 2018 ). If no high-quality evidence is available for consideration, it would be unreasonable to draw a strong conclusion. Unfortunately, people’s beliefs are too often based on acceptance of unsubstantiated claims. This is a failure of CT, but is it also a failure of intelligence?

3. Does Critical Thinking “Go Beyond” What Is Meant by Intelligence?

Despite the conceptual overlap in intelligence and CT at a general level, one way that CT can be distinguished from the common view of intelligence as general cognitive ability is in terms of what each can account for. Although intelligence tests, especially measures of general cognitive ability, have reliably predicted academic and job performance, they may not be sufficient to predict other everyday outcomes for which CT measures have made successful predictions and added to the variance accounted for. For instance, replicating a study by Butler (2012), Butler et al. (2017) obtained a negative correlation (r = −0.33) between scores on the Halpern Critical Thinking Appraisal (HCTA) and a measure of 134 negative real-world outcomes not expected to befall critical thinkers, such as engaging in unprotected sex or posting a message on social media which the person regretted. They found that higher HCTA scores not only predicted better life decisions, but also predicted performance beyond a measure of general cognitive ability. These results suggest that CT can account for real-world outcomes and goes beyond general cognitive ability to account for additional variance.
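As a quick effect-size gloss (my arithmetic, not a figure reported by the authors), a correlation of that magnitude corresponds to roughly 11% of shared variance:

$$r^2 = (-0.33)^2 \approx 0.11$$

That is a modest but non-trivial association for a single predictor of heterogeneous real-world outcomes.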

Some theorists maintain that standardized intelligence tests do not capture the variety of abilities that people need to adapt well in the real world. For example, Gardner (1999) has proposed that additional forms of intelligence are needed, such as spatial, musical, and interpersonal intelligences, in addition to the linguistic and logical–mathematical intelligences more typically associated with general cognitive ability and academic success. In other theorizing, Sternberg (1988) has proposed three additional types of intelligence: analytical, practical, and creative intelligence, to more fully capture the variety of intelligent abilities on which people differ. Critical thinking is considered part of analytical skills, which involve evaluating the quality and applicability of ideas, products, and options (Sternberg 2022). Regarding adaptive intelligence, Sternberg (2019) has emphasized how adaptive aspects of intelligence are needed to solve real-world problems at both the individual and species levels. According to Sternberg, core components of intelligence have evolved in humans, but intelligence takes different forms in different cultures, with each culture valuing its own skills for adaptation. Thus, the construct of intelligence must go beyond core cognitive ability to encompass the specific abilities needed for adaptive behavior in specific cultures and settings.

Two other theories propose that other components be added to intelligent and rational thinking. Ackerman ( 2022 ) has emphasized the importance of acquiring domain-specific knowledge for engaging in intelligent functioning in the wide variety of tasks found in everyday life. Ackerman has argued that declarative, procedural, and tacit knowledge, as well as non-ability variables, are needed to better predict job performance and performance of other everyday activities. Taking another approach, Halpern and Dunn ( 2021 ) have proposed that critical thinking is essentially the adaptive application of intelligence for solving real-world problems. Elsewhere, Butler and Halpern ( 2019 ) have argued that dispositions such as open-mindedness are another aspect of CT and that domain-specific knowledge and specific CT skills are needed to solve real-world problems.

Examples are readily available for how CT goes beyond what IQ tests test to include specific rules for reasoning and relevant knowledge needed to execute real-world tasks. Take the example of scientific reasoning, which can be viewed as a specialized form of CT. Drawing a well-reasoned inductive conclusion about a theory or analyzing the quality of a research study both require that a thinker possess relevant specialized knowledge related to the question and specific reasoning skills for reasoning about scientific methodology. In contrast, IQ tests are deliberately designed to be nonspecialized in assessing Gc, broadly sampling vocabulary and general knowledge in order to be fair and unbiased ( Stanovich 2009 ). Specialized knowledge and reasoning skills are also needed in non-academic domains. Jurors must possess specialized knowledge to understand expert, forensic testimony and specific reasoning skills to interpret the law and make well-reasoned judgments about a defendant’s guilt or innocence.

Besides lacking specific reasoning skills and domain-relevant knowledge, people may fail to think critically because they are not disposed to use their reasoning skills to examine such claims and want to preserve their favored beliefs. Critical thinking dispositions are attitudes or traits that make it more likely that a person will think critically. Theorists have proposed numerous CT dispositions (e.g., Bensley 2018 ; Butler and Halpern 2019 ; Dwyer 2017 ; Ennis 1987 ). Some commonly identified CT dispositions especially relevant to this discussion are open-mindedness, skepticism, intellectual engagement, and the tendency to take a reflective, rational–analytic approach. Critical thinking dispositions are clearly value-laden and prescriptive. A good thinker should be open-minded, skeptical, reflective, intellectually engaged, and value a rational–analytic approach to inquiry. Conversely, corresponding negative dispositions, such as “close-mindedness” and “gullibility”, could obstruct CT.

Without the appropriate disposition, individuals will not use their reasoning skills to think critically about questions. For example, the brilliant mystery writer Sir Arthur Conan Doyle, who was trained as a physician and created the hyper-rational detective Sherlock Holmes, was not disposed to think critically about some unsubstantiated claims. Conan Doyle was no doubt highly intelligent in cognitive ability terms, but he was not sufficiently skeptical (disposed to think critically) about spiritualism. He believed that he was talking to his dearly departed son through a medium, despite the warnings of his magician friend, Harry Houdini, who told him that mediums used trickery in their seances. Perhaps influenced by his Irish father’s belief in the “wee folk”, Conan Doyle also believed that fairies inhabited the English countryside, based on children’s photos, despite the advice of experts who said the photos could be faked. Nevertheless, when reporting on Koch’s new theory of tuberculosis, he was skeptical of it, despite his wife suffering from the disease. So, in professional capacities, Conan Doyle used his CT skills, but in certain other domains, where he was motivated to accept unsubstantiated claims, he failed to think critically, insufficiently disposed to skeptically challenge certain implausible claims.

This example makes two important points. Conan Doyle’s superior intelligence was not enough for him to reject implausible claims about the world. In general, motivated reasoning can lead people, even those considered highly intelligent, to accept claims with no good evidentiary support. The second important point is that we would not be able to adequately explain cases like this one, considering only the person’s intelligence or even their reasoning skills, without also considering the person’s disposition. General cognitive ability alone is not sufficient, and CT dispositions should also be considered.

Supporting this conclusion, Stanovich and West ( 1997 ) examined the influence of dispositions beyond the contribution of cognitive ability on a CT task. They gave college students an argument evaluation test in which participants first rated their agreement with several claims about real social and political issues made by a fictitious person. Then, they gave them evidence against each claim and finally asked them to rate the quality of a counterargument made by the same fictitious person. Participants’ ratings of the counterarguments were compared to the median ratings of expert judges on the quality of the rebuttals. Stanovich and West also administered a new measure of rational disposition called the Actively Open-minded Thinking (AOT) scale and the SAT as a proxy for cognitive ability. The AOT was a composite of items from several other scales that would be expected to measure CT disposition. They found that both SAT and AOT scores were significant predictors of higher argument analysis scores. Even after partialing out cognitive ability, actively open-minded thinking was significant. These results suggest that general cognitive ability alone was not sufficient to account for thinking critically about real-world issues and that CT disposition was needed to go beyond it.

Further examining the roles of CT dispositions and cognitive ability on reasoning, Stanovich and West ( 2008 ) studied myside bias, a bias in reasoning closely related to one-sided thinking and confirmation bias. A critical thinker would be expected to not show myside bias and instead fairly evaluate evidence on all sides of a question. Stanovich and West ( 2007 ) found that college students often showed myside bias when asked their opinions about real-world policy issues, such as those concerning the health risks of smoking and drinking alcohol. For example, compared to non-smokers, smokers judged the health risks of smoking to be lower. When they divided participants into higher versus lower cognitive ability groups based on SAT scores, the two groups showed little difference on myside bias. Moreover, on the hazards of drinking issue, participants who drank less had higher scores on the CT disposition measure.

Other research supports the need for both reasoning ability and CT disposition in predicting outcomes in the real world. Ren et al. ( 2020 ) found that CT disposition, as measured by a Chinese critical thinking disposition inventory, and a CT skill measure together contributed a significant amount of the variance in predicting academic performance beyond the contribution of cognitive ability alone, as measured by a test of fluid intelligence. Further supporting the claim that CT requires both cognitive ability and CT disposition, Ku and Ho ( 2010 ) found that a CT disposition measure significantly predicted scores on a CT test beyond the significant contribution of verbal intelligence in high school and college students from Hong Kong.

The contribution of dispositions to thinking is related to another way that CT goes beyond the application of general cognitive ability, i.e., by way of the motivation for reasoning. Assuming that all reasoning is motivated (Kunda 1990), then CT is motivated, too, which is implicit within the Halpern and Dunn (2021) and Ennis (1987) definitions. Critical thinking is motivated in the sense of being purposeful and directed towards the goal of arriving at an accurate conclusion. For instance, the CT disposition of “truth-seeking” guides a person towards the CT goal of arriving at an accurate conclusion.

Also, according to Kunda ( 1990 ), a second type of motivated reasoning can lead to faulty conclusions, often by directing a person towards the goal of maintaining favored beliefs and preconceptions, as in illusory correlation, belief perseverance, and confirmation bias. Corresponding to this second type, negative dispositions, such as close-mindedness and self-serving motives, can incline thinkers towards faulty conclusions. This is especially relevant in the present discussion because poorer reasoning, thinking errors, and the inappropriate use of heuristics are related to the endorsement of unsubstantiated claims, all of which are CT failures. The term “thinking errors” is a generic term referring to logical fallacies, informal reasoning fallacies, argumentation errors, and inappropriate uses of cognitive heuristics ( Bensley 2018 ). Heuristics are cognitive shortcuts, commonly used to simplify judgment tasks and reduce mental effort. Yet, when used inappropriately, heuristics often result in biased judgments.

Stanovich (2009) has argued that IQ tests do not test people’s use of heuristics, but heuristics have been found to be negatively correlated with CT performance (West et al. 2008). In the same study, West et al. found that college students’ cognitive ability, as measured by performance on the SAT, was not correlated with thinking biases associated with use of heuristics. Although Stanovich and West (2008) found that susceptibility to biases such as the conjunction fallacy, framing effect, base-rate neglect, affect bias, and myside bias was uncorrelated with cognitive ability (using SAT as a proxy), other types of thinking errors were correlated with SAT.
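As a reminder of why the conjunction fallacy counts as a normative error: for any two events A and B, probability theory requires

$$P(A \cap B) \le \min\{P(A),\, P(B)\}$$

so judging a conjunction (in the classic item, “Linda is a bank teller and is active in the feminist movement”) to be more probable than one of its conjuncts (“Linda is a bank teller”) violates this rule, however representative the conjunction may feel.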

Likewise, two types of knowledge are related to the two forms of motivated reasoning. For instance, inaccurate knowledge, such as misconceptions, can derail reasoning from moving towards a correct conclusion, as in when a person reasons from false premises. In contrast, reasoning from accurate knowledge is more likely to produce an accurate conclusion. Taking into account inaccurate knowledge and thinking errors is important to understanding the endorsement of unsubstantiated claims because these are also related to negative dispositions, such as close-mindedness and cynicism, none of which are measured by intelligence tests.

Critical thinking questions are often situated in real-world examples or in simulations of them which are designed to detect thinking errors and bias. As described in Halpern and Butler (2018), an item like one on the HCTA provides respondents with a mock newspaper story about research showing that first-graders who attended preschool were better able to learn how to read. Then the question asks if preschool should be made mandatory. A correct response to this item requires recognizing that correlation does not imply causation, that is, avoiding a common reasoning error people make in thinking about research implications in everyday life. Another CT skills test, “Analyzing Psychological Statements” (APS), assesses the ability to recognize thinking errors and apply argumentation skills and psychology to evaluate psychology-related examples and simulations of real-life situations (Bensley 2021). For instance, besides identifying thinking errors in brief samples of thinking, questions ask respondents to distinguish arguments from non-arguments, find assumptions in arguments, evaluate kinds of evidence, and draw a conclusion from a brief psychological argument. An important implication of the studies just reviewed is that efforts to understand CT can be further informed by assessing thinking errors and biases, which, as the next discussion shows, are related to individual differences in thinking dispositions and cognitive style.

4. Dual-Process Theory Measures and Unsubstantiated Beliefs

Dual-process theory (DPT) and measures associated with it have been widely used in the study of the endorsement of unsubstantiated beliefs, especially as they relate to cognitive style. According to a cognitive style version of DPT, people have two modes of processing: a fast intuitive–experiential (I-E) style and a slower, reflective, rational–analytic (R-A) style. The intuitive cognitive style is associated with reliance on hunches, feelings, personal experience, and cognitive heuristics which simplify processing, while the R-A cognitive style is a reflective, rational–analytic style associated with more elaborate and effortful processing (Bensley et al. 2022; Epstein 2008). As such, the rational–analytic cognitive style is consistent with CT dispositions, such as those promoting the effortful analysis of evidence, objective truth, and logical consistency. In fact, CT is sometimes referred to as “critical-analytic” thinking (Byrnes and Dunbar 2014) and has been associated with analytical intelligence (Sternberg 1988) and with rational thinking, as discussed before.

People use both modes of processing, but they show individual differences in which mode they tend to rely upon, although the intuitive–experiential mode is the default (Bensley et al. 2022; Morgan 2016; Pacini and Epstein 1999), and they accept unsubstantiated claims differentially based on their predominant cognitive style (Bensley et al. 2022; Epstein 2008). Specifically, individuals who rely more on an I-E cognitive style tend to endorse unsubstantiated claims more strongly, while individuals who rely more on a R-A cognitive style tend to endorse those claims less. Note, however, that other theorists view the two processes and cognitive styles somewhat differently (e.g., Kahneman 2011; Stanovich et al. 2018).

Researchers have often assessed the contribution of these two cognitive styles to endorsement of unsubstantiated claims, using variants of three measures: the Cognitive Reflection Test (CRT) of Frederick ( 2005 ), the Rational–Experiential Inventory of Epstein and his colleagues ( Pacini and Epstein 1999 ), and the related Need for Cognition scale of Cacioppo and Petty ( 1982 ). The CRT is a performance-based test which asks participants to solve problems that appear to require simple mathematical calculations, but which actually require more reflection. People typically do poorly on the CRT, which is thought to indicate reliance on an intuitive cognitive style, while better performance is thought to indicate reliance on the slower, more deliberate, and reflective cognitive style. The positive correlation of the CRT with numeracy scores suggests it also has a cognitive skill component ( Patel et al. 2019 ). The Rational–Experiential Inventory (REI) of Pacini and Epstein ( 1999 ) contains one scale designed to measure an intuitive–experiential cognitive style and a second scale intended to measure a rational–analytic (R-A) style. The R-A scale was adapted from the Need for Cognition (NFC) scale of Cacioppo and Petty ( 1982 ), another scale associated with rational–analytic thinking and expected to be negatively correlated with unsubstantiated beliefs. The NFC was found to be related to open-mindedness and intellectual engagement, two CT dispositions ( Cacioppo et al. 1996 ).
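The best-known CRT item illustrates the format: a bat and a ball cost $1.10 in total, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive answer of 10 cents is wrong, as one line of algebra shows:

$$b + (b + 1.00) = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = 0.05$$

The ball costs 5 cents (and the bat $1.05). Blurting out 10 cents is taken to reflect the fast intuitive style; pausing to check the arithmetic reflects the slower, reflective style the CRT is designed to detect.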

The cognitive styles associated with DPT also relate to CT dispositions. Thinking critically requires that individuals be disposed to use their reasoning skills to reject unsubstantiated claims ( Bensley 2018 ) and that they be inclined to take a rational–analytic approach rather than relying on their intuitions and feelings. For instance, Bensley et al. ( 2014 ) found that students who endorsed more psychological misconceptions adopted a more intuitive cognitive style, were less disposed to take a rational–scientific approach to psychology, and scored lower on a psychological critical thinking skills test. Further supporting this connection, West et al. ( 2008 ) found that participants who tended to use cognitive heuristics more, thought to be related to intuitive processing and bias, scored lower on a critical thinking measure. As the Bensley et al. ( 2014 ) results suggest, in addition to assessing reasoning skills and dispositions, comprehensive CT assessment research should assess knowledge and unsubstantiated beliefs because these are related to failures of critical thinking.

5. Assessing Critical Thinking and Unsubstantiated Beliefs

Assessing endorsement of unsubstantiated claims provides another way to assess CT outcomes related to everyday thinking, which goes beyond what intelligence tests test (Bensley and Lilienfeld 2020). From the perspective of the multi-dimensional model of CT, endorsement of unsubstantiated claims could result from deficiencies in a person’s CT reasoning skills, a lack of relevant knowledge, and the engagement of inappropriate dispositions. Suppose an individual endorses an unsubstantiated claim, such as believing the conspiracy theory that human-caused global warming is a hoax. The person may lack the specific reasoning skills needed to critically evaluate the conspiracy; Lantian et al. (2020) found that scores on a CT skills test were negatively correlated with conspiracy theory beliefs. The person also must possess relevant scientific knowledge, such as knowing that each year humans pump about 40 billion metric tons of carbon dioxide into the atmosphere and that carbon dioxide is a greenhouse gas which traps heat in the atmosphere. Or the person may not be scientifically skeptical, or may be too cynical or mistrustful of scientists or governmental officials.

Although endorsing unsubstantiated beliefs is clearly a failure of CT, problems arise in deciding which ones are unsubstantiated, especially when considering conspiracy theories. Typically, the claims which critical thinkers should reject as unsubstantiated are those which are not supported by objective evidence. But of the many conspiracies proposed, few are vigorously examined. Moreover, some conspiracy theories which authorities might initially deny turn out to be real, such as the MK-Ultra theory that the CIA was secretly conducting mind-control research on American citizens.

A way out of this quagmire is to define unsubstantiated beliefs on a continuum which depends on the quality of evidence. This has led to the definition of unsubstantiated claims as assertions which have not been supported by high-quality evidence ( Bensley 2023 ). Those which are supported have the kind of evidentiary support that critical thinkers are expected to value in drawing reasonable conclusions. Instead of insisting that a claim must be demonstrably false to be rejected, we adopt a more tentative acceptance or rejection of claims, based on how much good evidence supports them. Many claims are unsubstantiated because they have not yet been carefully examined and so totally lack support or they may be supported only by low quality evidence such as personal experience, anecdotes, or non-scientific authority. Other claims are more clearly unsubstantiated because they contradict the findings of high-quality research. A critical thinker should be highly skeptical of these.

Psychological misconceptions are one type of claim that can be more clearly unsubstantiated. Psychological misconceptions are commonsense psychological claims (folk theories) about the mind, brain, and behavior that are contradicted by the bulk of high-quality scientific research. Bensley et al. (2014) developed the Test of Psychological Knowledge and Misconceptions (TOPKAM), a 40-item, forced-choice measure in which each item pairs a statement of a psychological misconception with an evidence-based alternative. They found that higher scores on the APS, the argument analysis test applying psychological concepts to analyze real-world examples, were associated with more correct answers on the TOPKAM. Other studies have found positive correlations between CT skills tests and other measures of psychological misconceptions (McCutcheon et al. 1992; Kowalski and Taylor 2004). Bensley et al. (2014) also found that higher correct TOPKAM scores were positively correlated with scores on the Inventory of Thinking Dispositions in Psychology (ITDP) of Bensley (2021), a measure of the disposition to take a rational and scientific approach to psychology, but were negatively correlated with an intuitive cognitive style.

Bensley et al. (2021) conducted a multidimensional study, assessing beginning psychology students starting a CT course on their endorsement of psychological misconceptions, recognition of thinking errors, CT dispositions, and metacognition, before and after CT instruction. Two classes received explicit instruction involving considerable practice in argument analysis and scientific reasoning skills, with one class receiving CT instruction focused more on recognizing psychological misconceptions and the second focused more on recognizing various thinking errors. Bensley et al. assessed both classes before and after instruction on the TOPKAM and on the Test of Thinking Errors (TOTE), a test of the ability to recognize 17 different types of thinking errors in real-world examples, such as confirmation bias, inappropriate use of the availability and representativeness heuristics, reasoning from ignorance/possibility, the gambler's fallacy, and hasty generalization (Bensley et al. 2021). Correct TOPKAM and TOTE scores were positively correlated, and after CT instruction both were positively correlated with the APS, the CT test of argument analysis skills.

Bensley et al. found that after explicit instruction in CT skills, students improved significantly on both the TOPKAM and the TOTE, with those focusing on recognizing misconceptions improving the most. Also, the students who improved the most on the TOTE scored higher on the rational–analytic scale of the Rational–Experiential Inventory (REI) and on the ITDP. The students receiving explicit CT skill instruction in recognizing misconceptions also significantly improved the accuracy of their metacognitive monitoring in estimating their TOPKAM scores after instruction.

Given that the two classes differed neither in GPA nor in SAT scores, a proxy for general cognitive ability, before instruction, CT instruction accounts well for the improvement in recognition of thinking errors and misconceptions without recourse to intelligence. However, SAT scores were positively correlated with both TOTE and APS scores, suggesting that cognitive ability contributed to CT skill performance. These results replicated the earlier findings of Bensley and Spero (2014), who showed that explicit CT instruction improved performance on both CT skills tests and metacognitive monitoring accuracy while controlling for SAT scores, which were positively correlated with CT skills test performance.

Taken together, these findings suggest that cognitive ability contributes to performance on CT tasks but that CT instruction goes beyond it to further improve performance. As the results of Bensley et al. ( 2021 ) show, and as discussed next, thinking errors and bias from heuristics are CT failures that should also be assessed because they are related to endorsement of unsubstantiated beliefs and cognitive style.

6. Dual-Processing Theory and Research on Unsubstantiated Beliefs

Consistent with DPT, numerous other studies have obtained significant positive correlations between intuitive cognitive style and paranormal belief, often using the REI intuitive–experiential scale and the Revised Paranormal Belief Scale (RPBS) of Tobacyk ( 2004 ) (e.g., Genovese 2005 ; Irwin and Young 2002 ; Lindeman and Aarnio 2006 ; Pennycook et al. 2015 ; Rogers et al. 2018 ; Saher and Lindeman 2005 ). Studies have also found positive correlations between superstitious belief and intuitive cognitive style (e.g., Lindeman and Aarnio 2006 ; Maqsood et al. 2018 ). REI intuitive–experiential thinking style was also positively correlated with belief in complementary and alternative medicine ( Lindeman 2011 ), conspiracy theory belief ( Alper et al. 2020 ), and with endorsement of psychological misconceptions ( Bensley et al. 2014 ; Bensley et al. 2022 ).

Additional evidence for DPT comes from findings that REI rational–analytic (R-A) and Need for Cognition (NFC) scores are negatively correlated with scores on measures of unsubstantiated beliefs, although studies correlating them with measures of paranormal belief and conspiracy theory belief have shown mixed results. Supporting a relationship, REI rational–analytic and NFC scores significantly and negatively predicted paranormal belief (Lobato et al. 2014; Pennycook et al. 2012). Other studies have also obtained a negative correlation between NFC and paranormal belief (Lindeman and Aarnio 2006; Rogers et al. 2018; Stahl and van Prooijen 2018), but both Genovese (2005) and Pennycook et al. (2015) found that NFC was not significantly correlated with paranormal belief. Swami et al. (2014) found that although REI R-A scores were negatively correlated with conspiracy theory belief, NFC scores were not.
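The findings above are zero-order Pearson correlations between self-report scales. As a minimal sketch of how such an analysis might be run with open-source tools (the data and variable names below are simulated and hypothetical, not taken from any cited study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical scale scores for 200 participants:
# REI rational-analytic (R-A) means and Revised Paranormal Belief Scale totals.
rei_ra = rng.normal(3.5, 0.6, 200)              # 1-5 Likert-scale means
rpbs = 80 - 6 * rei_ra + rng.normal(0, 5, 200)  # built-in negative relation

# Pearson's r tests the linear association between the two scales.
r, p = stats.pearsonr(rei_ra, rpbs)
print(f"r = {r:.2f}, p = {p:.4f}")              # expect a negative r
```

A significant negative r of this kind is what underlies the reported finding that more rational–analytic responders endorse paranormal claims less.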

Researchers often refer to people who are doubtful of paranormal and other unfounded claims as "skeptics" and so have tested whether measures related to skepticism are associated with less endorsement of unsubstantiated claims. They typically view skepticism as a stance towards unsubstantiated claims taken by rational people who reject them (e.g., Lindeman and Aarnio 2006; Stahl and van Prooijen 2018), rather than as a disposition inclining a person to think critically about unsubstantiated beliefs (Bensley 2018).

Fasce and Pico (2019) conducted one of the few studies relating a measure of skeptical disposition, the Critical Thinking Disposition Scale (CTDS) of Sosu (2013), to endorsement of unsubstantiated claims. They found that scores on the CTDS were negatively correlated with scores on the RPBS but not significantly correlated with either a measure of pseudoscience belief or a measure of conspiracy theory belief. However, the Cognitive Reflection Test (CRT) was negatively correlated with both the RPBS and the pseudoscience measure. Because Fasce and Pico (2019) did not examine correlations with the Reflective Skepticism subscale of the CTDS, its contribution apart from the full-scale CTDS remains unknown.

To more directly test skepticism as a disposition, we recently assessed college students on how well three new measures predicted endorsement of psychological misconceptions, paranormal claims, and conspiracy theories (Bensley et al. 2022). The dispositional measures included a measure of general skeptical attitude; the Scientific Skepticism Scale (SSS), which focused more on withholding acceptance of claims until high-quality scientific evidence supports them; and the Cynicism Scale (CS), which focused on doubting the sincerity of the motives of scientists and people in general. We found that although the general skepticism scale did not predict any of the unsubstantiated belief measures, SSS scores were a significant negative predictor of both paranormal belief and conspiracy theory belief. REI R-A scores were a less consistent negative predictor, REI intuitive–experiential (I-E) scores were a more consistent positive predictor, and, surprisingly, CS scores were the most consistent positive predictor of the unsubstantiated beliefs.

Researchers commonly assume that people who accept implausible, unsubstantiated claims are gullible or not sufficiently skeptical. For instance, van Prooijen (2019) has argued that conspiracy theory believers are more gullible (less skeptical) than non-believers and tend to accept unsubstantiated claims more than less gullible people, and he reviewed several studies supporting this claim. However, he did not report any studies in which a gullible disposition was directly measured.

Recently, we directly tested the gullibility hypothesis in relation to scientific skepticism (Bensley et al. 2023), using the Gullibility Scale of Teunisse et al. (2019), on which people skeptical of the paranormal had been shown to obtain lower scores. We found that Gullibility Scale and Cynicism Scale scores were positively correlated, and both were significant positive predictors of unsubstantiated beliefs in general, consistent with an intuitive–experiential cognitive style. In contrast, scores on the Cognitive Reflection Test, the Scientific Skepticism Scale, and the REI rational–analytic scale were all positively intercorrelated and were significant negative predictors of unsubstantiated beliefs in general, consistent with a rational–analytic/reflective cognitive style. Scientific skepticism scores negatively predicted general endorsement of unsubstantiated claims beyond the REI R-A scale, but neither the CTDS nor its Reflective Skepticism subscale was a significant predictor. These results replicated findings from the Bensley et al. (2022) study and supported an elaborated dual-process model of unsubstantiated belief. The SSS was not only a substantial negative predictor; it was also negatively correlated with the Gullibility Scale, as expected.

These results suggest that both CT-related dispositions and CT skills are related to endorsement of unsubstantiated beliefs. However, a measure of general cognitive ability or intelligence must be examined along with measures of CT and unsubstantiated beliefs to determine whether CT goes beyond intelligence in predicting unsubstantiated beliefs. In one of the few studies that also included a measure of cognitive ability, Stahl and van Prooijen (2018) found that dispositional characteristics helped account for acceptance of conspiracies and paranormal belief beyond cognitive ability. Using the Importance of Rationality Scale (IRS), a rational–analytic scale designed to measure skepticism towards unsubstantiated beliefs, they found that the IRS was negatively correlated with paranormal belief and belief in conspiracy theories. In separate hierarchical regressions, cognitive ability was the strongest negative predictor of both paranormal and conspiracy belief; IRS scores in combination with cognitive ability negatively predicted endorsement of paranormal belief but did not significantly predict conspiracy theory belief. These results provided partial support that a measure of rational–analytic cognitive style related to skeptical disposition adds to the variance accounted for beyond cognitive ability in negatively predicting unsubstantiated belief.
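The hierarchical (incremental) regression logic these studies rely on, testing whether a disposition explains variance beyond cognitive ability, can be illustrated with a short sketch. This is a minimal illustration on simulated data with hypothetical variable names, not a reanalysis of Stahl and van Prooijen (2018):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300

# Simulated predictors: cognitive ability and a skeptical-disposition score (e.g., IRS).
ability = rng.normal(0, 1, n)
irs = 0.3 * ability + rng.normal(0, 1, n)       # dispositions correlate with ability

# Simulated outcome: paranormal belief, negatively related to both predictors.
belief = -0.5 * ability - 0.3 * irs + rng.normal(0, 1, n)

# Step 1: cognitive ability alone.
m1 = sm.OLS(belief, sm.add_constant(ability)).fit()

# Step 2: cognitive ability plus the dispositional measure.
X2 = sm.add_constant(np.column_stack([ability, irs]))
m2 = sm.OLS(belief, X2).fit()

# Incremental variance explained by the disposition beyond ability,
# tested with an F-test on the nested models.
f_stat, p_value, df_diff = m2.compare_f_test(m1)
print(f"R2 step 1 = {m1.rsquared:.3f}, R2 step 2 = {m2.rsquared:.3f}")
print(f"Delta R2 = {m2.rsquared - m1.rsquared:.3f} (F = {f_stat:.2f}, p = {p_value:.4g})")
```

A significant increase in R² at step 2 is the pattern interpreted throughout this section as a CT-related measure accounting for variance beyond cognitive ability.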

In another study that included a measure of cognitive ability, Cavojova et al. (2019) examined how CT-related dispositions and the Scientific Reasoning Scale (SRS) of Drummond and Fischhoff (2017) were related to a measure of paranormal, pseudoscientific, and conspiracy theory beliefs. The SRS likely measures CT skill in that it assesses the ability to evaluate scientific research and evidence. As expected, the unsubstantiated belief measure was negatively correlated with the SRS and with a cognitive ability measure similar to Raven's Progressive Matrices. Unsubstantiated beliefs were positively correlated with dogmatism (the opposite of open-mindedness) but not with REI rational–analytic cognitive style. The SRS was a significant negative predictor of both unsubstantiated belief and susceptibility to bias beyond the contribution of cognitive ability, but neither dogmatism nor analytic thinking was a significant predictor. Nevertheless, this study provides some support that a measure related to CT reasoning skill accounts for variance in unsubstantiated belief beyond cognitive ability.

The failure of this study to find a correlation between rational–analytic cognitive style and unsubstantiated beliefs, when other studies have found significant correlations with it and related measures, has implications for the multidimensional assessment of unsubstantiated beliefs. One implication is that the REI rational–analytic scale may not be a strong predictor of unsubstantiated beliefs. In fact, we have recently found that the Scientific Skepticism Scale is a stronger negative predictor (Bensley et al. 2022; Bensley et al. 2023), which suggests that other measures related to rational–analytic thinking styles should also be examined. This could help triangulate the contribution of self-report cognitive style measures to endorsement of unsubstantiated claims, recognizing that the use of self-report measures has a checkered history in psychological research. A second implication is that, once again, measures of CT skill and cognitive ability were negative predictors of unsubstantiated belief, and so they, too, should be included in future assessments of unsubstantiated beliefs.

7. Discussion

This review provided different lines of evidence supporting the claim that CT goes beyond cognitive ability in accounting for certain real-world outcomes. Participants who thought more critically reported fewer of the negative events in everyday functioning that would not be expected to befall critical thinkers. People who endorsed unsubstantiated claims less showed better CT skills, more accurate domain-specific knowledge, and less susceptibility to thinking errors and bias, and were more disposed to think critically. More specifically, they tended to be more scientifically skeptical and to adopt a more rational–analytic cognitive style. In contrast, those who endorsed unsubstantiated claims more tended to be more cynical and to adopt an intuitive–experiential cognitive style. These characteristics go beyond what standardized intelligence tests test. In some studies, the CT measures accounted for additional variance beyond that contributed by general cognitive ability.

That is not to say that measures of general cognitive ability are not useful. As noted by Gottfredson (2004), "g" is a highly successful predictor of academic and job performance. More is known about g and Gf than about many other psychological constructs. On average, g is closely related to Gf, which is highly correlated with working memory (r = 0.70) and can be correlated with it as highly as r = 0.77 (r² ≈ 0.60) based on a correlated two-factor model (Gignac 2014). Because modern working memory theory is itself a powerful theory (Chai et al. 2018), this lends construct validity to the fluid intelligence construct. Although cognitive scientists have clearly made progress in understanding the executive processes underlying intelligence, they have not yet identified the specific cognitive components of intelligence (Sternberg 2022). Moreover, theorists have acknowledged that intelligence must also include components beyond g, including domain-specific knowledge (Ackerman 2022; Conway and Kovacs 2018), which are not yet clearly understood.
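As a quick check of the arithmetic linking the reported correlation to shared variance:

$$ r = 0.77 \;\Rightarrow\; r^{2} = 0.77^{2} \approx 0.593 \approx 0.60, $$

i.e., on this estimate, fluid intelligence and working memory capacity share roughly 60% of their variance.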

This review also pointed to limitations in the research that should be addressed. So far, few studies of unsubstantiated beliefs have included measures of intelligence, and those that have often relied on proxies for intelligence test scores, such as SAT scores. Future studies, besides using more and better measures of intelligence, could benefit from the inclusion of more specifically focused measures, such as measures of Gf and Gc. Also, more research should be conducted to develop additional high-quality measures of CT, including ones that assess specific reasoning skills and knowledge relevant to thinking about a subject, which could help resolve perennial questions about the domain-general versus domain-specific nature of intelligence and CT. Overall, the results of this review encourage taking a multidimensional approach to investigating the complex constructs of intelligence, CT, and unsubstantiated belief. Supporting these recommendations were the results of studies in which the improvement accrued from explicit CT skill instruction could be more fully understood because CT skills, relevant knowledge, CT dispositions, metacognitive monitoring accuracy, and a proxy for intelligence were all measured.

8. Conclusions

Critical thinking, broadly conceived, offers ways to understand real-world outcomes of thinking beyond what general cognitive ability can provide and intelligence tests test. A multi-dimensional view of CT which includes specific reasoning and metacognitive skills, CT dispositions, and relevant knowledge can add to our understanding of why some people endorse unsubstantiated claims more than others do. Although general cognitive ability and domain-general knowledge often contribute to performance on CT tasks, thinking critically about real-world questions also involves applying rules, criteria, and knowledge specific to the question under consideration, as well as the appropriate dispositions and cognitive styles for deploying them.

Despite the advantages of this multidimensional approach to CT in helping us more fully understand everyday thinking and irrationality, it presents challenges for researchers and instructors. It implies the need to assess and instruct multidimensionally, addressing not only reasoning skills but also thinking errors and biases, dispositions, the knowledge relevant to a task, and the accuracy of metacognitive judgments. As noted by Dwyer (2023), adopting a more complex conceptualization of CT beyond just skills is needed, but it presents challenges for those seeking to improve students' CT. Nevertheless, the research reviewed suggests that taking this multidimensional approach can enhance our understanding of the endorsement of unsubstantiated claims beyond what standardized intelligence tests contribute. More research is needed to resolve remaining controversies and to develop evidence-based applications of the findings.

Funding Statement

This research received no external funding.

Institutional Review Board Statement

This research involved no new testing of participants and hence did not require Institutional Review Board approval.

Informed Consent Statement

This research involved no new testing of participants and hence did not require an Informed Consent Statement.

Data Availability Statement

Conflicts of Interest

The author declares no conflict of interest.

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

  • Ackerman Phillip L. Intelligence … Moving beyond the lowest common denominator. American Psychologist. 2022; 78 :283–97. doi: 10.1037/amp0001057. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Alper Sinan, Bayrak Fatih, Yilmaz Onurcan. Psychological correlates of COVID-19 conspiracy beliefs and preventive measures: Evidence from Turkey. Current Psychology. 2020; 40 :5708–17. doi: 10.1007/s12144-020-00903-0. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Bensley D. Alan. Critical Thinking in Psychology and Everyday Life: A Guide to Effective Thinking. Worth Publishers; New York: 2018. [ Google Scholar ]
  • Bensley D. Alan. The Critical Thinking in Psychology Assessment Battery (CTPAB) and Test Guide. 2021. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan. “I can’t believe you believe that”: Identifying unsubstantiated claims. Skeptical Inquirer. 2023; 47 :53–56. [ Google Scholar ]
  • Bensley D. Alan, Spero Rachel A. Improving critical thinking skills and metacognitive monitoring through direct infusion. Thinking Skills and Creativity. 2014; 12 :55–68. doi: 10.1016/j.tsc.2014.02.001. [ CrossRef ] [ Google Scholar ]
  • Bensley D. Alan, Lilienfeld Scott O. Assessment of Unsubstantiated Beliefs. Scholarship of Teaching and Learning in Psychology. 2020; 6 :198–211. doi: 10.1037/stl0000218. [ CrossRef ] [ Google Scholar ]
  • Bensley D. Alan, Masciocchi Christopher M., Rowan Krystal A. A comprehensive assessment of explicit critical thinking instruction on recognition of thinking errors and psychological misconceptions. Scholarship of Teaching and Learning in Psychology. 2021; 7 :107. doi: 10.1037/stl0000188. [ CrossRef ] [ Google Scholar ]
  • Bensley D. Alan, Watkins Cody, Lilienfeld Scott O., Masciocchi Christopher, Murtagh Michael, Rowan Krystal. Skepticism, cynicism, and cognitive style predictors of the generality of unsubstantiated belief. Applied Cognitive Psychology. 2022; 36 :83–99. doi: 10.1002/acp.3900. [ CrossRef ] [ Google Scholar ]
  • Bensley D. Alan, Rodrigo Maria, Bravo Maria, Jocoy Kathleen. Dual-Process Theory and Cognitive Style Predictors of the General Endorsement of Unsubstantiated Claims. 2023. Unpublished manuscript. Frostburg State University, Frostburg, MD, USA.
  • Bensley D. Alan, Lilienfeld Scott O., Powell Lauren. A new measure of psychological misconceptions: Relations with academic background, critical thinking, and acceptance of paranormal and pseudoscientific claims. Learning and Individual Differences. 2014; 36 :9–18. doi: 10.1016/j.lindif.2014.07.009. [ CrossRef ] [ Google Scholar ]
  • Bierwiaczonek Kinga, Kunst Jonas R., Pich Olivia. Belief in COVID-19 conspiracy theories reduces social distancing over time. Applied Psychology Health and Well-Being. 2020; 12 :1270–85. doi: 10.1111/aphw.12223. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Butler Heather A. Halpern critical thinking assessment predicts real-world outcomes of critical thinking. Applied Cognitive Psychology. 2012; 26 :721–29. doi: 10.1002/acp.2851. [ CrossRef ] [ Google Scholar ]
  • Butler Heather A., Halpern Diane F. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Intelligence. Cambridge University Press; Cambridge: 2019. pp. 183–96. [ Google Scholar ]
  • Butler Heather A., Pentoney Christopher, Bong Maebelle P. Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Thinking Skills and Creativity. 2017; 25 :38–46. doi: 10.1016/j.tsc.2017.06.005. [ CrossRef ] [ Google Scholar ]
  • Byrnes James P., Dunbar Kevin N. The nature and development of critical-analytic thinking. Educational Psychology Review. 2014; 26 :477–93. doi: 10.1007/s10648-014-9284-0. [ CrossRef ] [ Google Scholar ]
  • Cacioppo John T., Petty Richard E. The need for cognition. Journal of Personality and Social Psychology. 1982; 42 :116–31. doi: 10.1037/0022-3514.42.1.116. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Cacioppo John T., Petty Richard E., Feinstein Jeffrey A., Jarvis W. Blair G. Dispositional differences in cognitive motivation: The life and times of individuals varying in need for cognition. Psychological Bulletin. 1996; 119 :197. doi: 10.1037/0033-2909.119.2.197. [ CrossRef ] [ Google Scholar ]
  • Cavojova Vladimira, Srol Jakub, Jurkovic Marek. Why should we try to think like scientists? Scientific reasoning and susceptibility to epistemically suspect beliefs and cognitive biases. Applied Cognitive Psychology. 2019; 34 :85–95. doi: 10.1002/acp.3595. [ CrossRef ] [ Google Scholar ]
  • Chai Wen Jia, Hamid Abd, Ismafairus Aini, Abdullah Jafri Malin. Working memory from the psychological and neuroscience perspective. Frontiers in Psychology. 2018; 9 :401. doi: 10.3389/fpsyg.2018.00401. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Conway Andrew R., Kovacs Kristof. The nature of the general factor of intelligence. In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 49–63. [ Google Scholar ]
  • Drummond Caitlin, Fischhoff Baruch. Development and validation of the Scientific Reasoning Scale. Journal of Behavioral Decision Making. 2017; 30 :26–38. doi: 10.1002/bdm.1906. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P. Critical Thinking: Conceptual Perspectives and Practical Guidelines. Cambridge University Press; Cambridge: 2017. [ Google Scholar ]
  • Dwyer Christopher P. An evaluative review of barriers to critical thinking in educational and real-world settings. Journal of Intelligence. 2023; 11 :105. doi: 10.3390/jintelligence11060105. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ennis Robert H. A taxonomy of critical thinking dispositions and abilities. In: Baron Joan, Sternberg Robert., editors. Teaching Thinking Skills: Theory and Practice. W. H. Freeman; New York: 1987. [ Google Scholar ]
  • Epstein Seymour. Intuition from the perspective of cognitive-experiential self-theory. In: Plessner Henning, Betsch Tilmann., editors. Intuition in Judgment and Decision Making. Erlbaum; Washington, DC: 2008. pp. 23–37. [ Google Scholar ]
  • Fasce Angelo, Pico Alfonso. Science as a vaccine: The relation between scientific literacy and unwarranted beliefs. Science & Education. 2019; 28 :109–25. doi: 10.1007/s11191-018-00022-0. [ CrossRef ] [ Google Scholar ]
  • Frederick Shane. Cognitive reflection and decision making. Journal of Economic Perspectives. 2005; 19 :25–42. doi: 10.1257/089533005775196732. [ CrossRef ] [ Google Scholar ]
  • Gardner Howard. Intelligence Reframed: Multiple Intelligence for the 21st Century. Basic Books; New York: 1999. [ Google Scholar ]
  • Genovese Jeremy E. C. Paranormal beliefs, schizotypy, and thinking styles among teachers and future teachers. Personality and Individual Differences. 2005; 39 :93–102. doi: 10.1016/j.paid.2004.12.008. [ CrossRef ] [ Google Scholar ]
  • Gignac Gilles E. Fluid intelligence shares closer to 60% of its variance with working memory capacity and is a better indicator of general intelligence. Intelligence. 2014; 47 :122–33. doi: 10.1016/j.intell.2014.09.004. [ CrossRef ] [ Google Scholar ]
  • Gottfredson Linda S. Life, death, and intelligence. Journal of Cognitive Education and Psychology. 2004; 4 :23–46. doi: 10.1891/194589504787382839. [ CrossRef ] [ Google Scholar ]
  • Halpern Diane F., Dunn Dana. Critical thinking: A model of intelligence for solving real-world problems. Journal of Intelligence. 2021; 9 :22. doi: 10.3390/jintelligence9020022. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Halpern Diane F., Butler Heather A. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Human Intelligence. Cambridge University Press; Cambridge: 2018. pp. 183–196. [ Google Scholar ]
  • Irwin Harvey J., Young J. M. Intuitive versus reflective processes in the formation of paranormal beliefs. European Journal of Parapsychology. 2002; 17 :45–55. [ Google Scholar ]
  • Jolley Daniel, Paterson Jenny L. Pylons ablaze: Examining the role of 5G COVID-19 conspiracy beliefs and support for violence. British Journal of Social Psychology. 2020; 59 :628–40. doi: 10.1111/bjso.12394. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Kahneman Daniel. Thinking Fast and Slow. Farrar, Strauss and Giroux; New York: 2011. [ Google Scholar ]
  • Kowalski Patricia, Taylor Annette J. Ability and critical thinking as predictors of change in students’ psychological misconceptions. Journal of Instructional Psychology. 2004; 31 :297–303. [ Google Scholar ]
  • Ku Kelly Y. L., Ho Irene T. Dispositional Factors predicting Chinese students’ critical thinking performance. Personality and Individual Differences. 2010; 48 :54–58. doi: 10.1016/j.paid.2009.08.015. [ CrossRef ] [ Google Scholar ]
  • Kunda Ziva. The case for motivated reasoning. Psychological Bulletin. 1990; 108 :480–98. doi: 10.1037/0033-2909.108.3.480. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lantian Anthony, Bagneux Virginie, Delouvee Sylvain, Gauvrit Nicolas. Maybe a free thinker but not a critical one: High conspiracy belief is associated with low critical thinking ability. Applied Cognitive Psychology. 2020; 35 :674–84. doi: 10.1002/acp.3790. [ CrossRef ] [ Google Scholar ]
  • Lilienfeld Scott O. Psychological treatments that cause harm. Perspectives on Psychological Science. 2007; 2 :53–70. doi: 10.1111/j.1745-6916.2007.00029.x. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lindeman Marjaana. Biases in intuitive reasoning and belief in complementary and alternative medicine. Psychology and Health. 2011; 26 :371–82. doi: 10.1080/08870440903440707. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Lindeman Marjaana, Aarnio Kia. Paranormal beliefs: Their dimensionality and correlates. European Journal of Personality. 2006; 20 :585–602. [ Google Scholar ]
  • Lobato Emilio J., Mendoza Jorge, Sims Valerie, Chin Matthew. Explaining the relationship between conspiracy theories, paranormal beliefs, and pseudoscience acceptance among a university population. Applied Cognitive Psychology. 2014; 28 :617–25. doi: 10.1002/acp.3042. [ CrossRef ] [ Google Scholar ]
  • Maqsood Alisha, Jamil Farhat, Khalid Ruhi. Thinking styles and belief in superstitions: Moderating role of gender in young adults. Pakistan Journal of Psychological Research. 2018; 33 :335–348. [ Google Scholar ]
  • McCutcheon Lynn E., Apperson Jennifer M., Hanson Esther, Wynn Vincent. Relationships among critical thinking skills, academic achievement, and misconceptions about psychology. Psychological Reports. 1992; 71 :635–39. doi: 10.2466/pr0.1992.71.2.635. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • McGrew Kevin S. CHC theory and the human cognitive abilities project: Standing on the shoulders of the giants of psychometric intelligence research. Intelligence. 2009; 37 :1–10. doi: 10.1016/j.intell.2008.08.004. [ CrossRef ] [ Google Scholar ]
  • Morgan Jonathan. Religion and dual-process cognition: A continuum of styles or distinct types? Religion, Brain, & Behavior. 2016; 6 :112–29. doi: 10.1080/2153599X.2014.966315. [ CrossRef ] [ Google Scholar ]
  • Nie Fanhao, Olson Daniel V. A. Demonic influence: The negative mental health effects of belief in demons. Journal for the Scientific Study of Religion. 2016; 55 :498–515. doi: 10.1111/jssr.12287. [ CrossRef ] [ Google Scholar ]
  • Pacini Rosemary, Epstein Seymour. The relation of rational and experiential information processing styles to personality, basic beliefs, and the ratio-bias phenomenon. Journal of Personality and Social Psychology. 1999; 76 :972–87. doi: 10.1037/0022-3514.76.6.972. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Patel Niraj, Baker S. Glenn, Scherer Laura D. Evaluating the cognitive reflection test as a measure of intuition/reflection, numeracy, and insight problem solving, and the implications for understanding real-world judgments and beliefs. Journal of Experimental Psychology: General. 2019; 148 :2129–53. doi: 10.1037/xge0000592. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Pennycook Gordon, Cheyne James Allen, Barr Nathaniel, Koehler Derek J., Fugelsang Jonathan A. On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making. 2015; 10 :549–63. doi: 10.1017/S1930297500006999. [ CrossRef ] [ Google Scholar ]
  • Pennycook Gordon, Cheyne James Allen, Seti Paul, Koehler Derek J., Fugelsang Jonathan A. Analytic cognitive style predicts religious and paranormal belief. Cognition. 2012; 123 :335–46. doi: 10.1016/j.cognition.2012.03.003. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ren Xuezhu, Tong Yan, Peng Peng, Wang Tengfei. Critical thinking predicts academic performance beyond cognitive ability: Evidence from adults and children. Intelligence. 2020; 82 :101487. doi: 10.1016/j.intell.2020.101487. [ CrossRef ] [ Google Scholar ]
  • Rogers Paul, Fisk John E., Lowrie Emma. Paranormal belief, thinking style preference and susceptibility to confirmatory conjunction errors. Consciousness and Cognition. 2018; 65 :182–95. doi: 10.1016/j.concog.2018.07.013. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Saher Marieke, Lindeman Marjaana. Alternative medicine: A psychological perspective. Personality and Individual Differences. 2005; 39 :1169–78. doi: 10.1016/j.paid.2005.04.008. [ CrossRef ] [ Google Scholar ]
  • Sosu Edward M. The development and psychometric validation of a Critical Thinking Disposition Scale. Thinking Skills and Creativity. 2013; 9 :107–19. doi: 10.1016/j.tsc.2012.09.002. [ CrossRef ] [ Google Scholar ]
  • Stahl Tomas, van Prooijen Jan-Willem. Epistemic irrationality: Skepticism toward unfounded beliefs requires sufficient cognitive ability and motivation to be rational. Personality and Individual Differences. 2018; 122 :155–63. doi: 10.1016/j.paid.2017.10.026. [ CrossRef ] [ Google Scholar ]
  • Stanovich Keith E. What Intelligence Tests Miss: The Psychology of Rational Thought. Yale University Press; New Haven: 2009. [ Google Scholar ]
  • Stanovich Keith E., West Richard F. Reasoning independently of prior belief and individual differences in actively open-minded thinking. Journal of Educational Psychology. 1997; 89 :342–57. doi: 10.1037/0022-0663.89.2.342. [ CrossRef ] [ Google Scholar ]
  • Stanovich Keith E., West Richard F. Natural myside bias is independent of cognitive ability. Thinking & Reasoning. 2007; 13 :225–47. [ Google Scholar ]
  • Stanovich Keith E., West Richard F. On the failure of cognitive ability to predict myside and one-sided thinking bias. Thinking and Reasoning. 2008; 14 :129–67. doi: 10.1080/13546780701679764. [ CrossRef ] [ Google Scholar ]
  • Stanovich Keith E., West Richard F., Toplak Maggie E. The Rationality Quotient: Toward a Test of Rational Thinking. The MIT Press; Cambridge, MA: 2018. [ Google Scholar ]
  • Sternberg Robert J. The Triarchic Mind: A New Theory of Intelligence. Penguin Press; London: 1988. [ Google Scholar ]
  • Sternberg Robert J. A theory of adaptive intelligence and its relation to general intelligence. Journal of Intelligence. 2019; 7 :23. doi: 10.3390/jintelligence7040023. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Sternberg Robert J. The search for the elusive basic processes underlying human intelligence: Historical and contemporary perspectives. Journal of Intelligence. 2022; 10 :28. doi: 10.3390/jintelligence10020028. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Swami Viren, Voracek Martin, Stieger Stefan, Tran Ulrich S., Furnham Adrian. Analytic thinking reduces belief in conspiracy theories. Cognition. 2014; 133 :572–85. doi: 10.1016/j.cognition.2014.08.006. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Teunisse Alessandra K., Case Trevor I., Fitness Julie, Sweller Naomi. I should have known better: Development of a self-report measure of gullibility. Personality and Social Psychology Bulletin. 2019; 46 :408–23. doi: 10.1177/0146167219858641. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tobacyk Jerome J. A revised paranormal belief scale. The International Journal of Transpersonal Studies. 2004; 23 :94–98. doi: 10.24972/ijts.2004.23.1.94. [ CrossRef ] [ Google Scholar ]
  • van der Linden Sander. The conspiracy-effect: Exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance. Personality and Individual Differences. 2015; 87 :173–75. doi: 10.1016/j.paid.2015.07.045. [ CrossRef ] [ Google Scholar ]
  • van Prooijen Jan-Willem. Belief in conspiracy theories: Gullibility or rational skepticism? In: Forgas Joseph P., Baumeister Roy F., editors. The Social Psychology of Gullibility: Fake News, Conspiracy Theories, and Irrational Beliefs. Routledge; London: 2019. pp. 319–32. [ Google Scholar ]
  • Wechsler David. The Measurement of Intelligence. 3rd ed. Williams & Wilkins; Baltimore: 1944. [ Google Scholar ]
  • West Richard F., Toplak Maggie E., Stanovich Keith E. Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology. 2008; 100 :930–41. doi: 10.1037/a0012842. [ CrossRef ] [ Google Scholar ]

A critical issue: assessing the critical thinking skills and dispositions of undergraduate health science students

  • Open access
  • Published: 15 August 2023
  • Volume 2, article number 21 (2023)

  • Anthony Dissen, ORCID: orcid.org/0000-0003-0828-387X

Critical thinking skills and dispositions are significantly important factors that aid in one’s ability to understand and solve complex problems. Within the field of higher education, critical thinking skills and dispositions are valued and encouraged but are not always fully developed at the completion of the undergraduate degree experience. Those students who are studying to enter the healthcare field are particularly in need of strong critical thinking skills and dispositions in order to provide patients and communities with effective, evidence-based care in the midst of an ever-increasingly complex environment. What program of study a student selects, and the unique curriculum design of that program, may impact the development of these skills and dispositions during undergraduate study. This quantitative study sought to explore and understand the critical thinking skills and dispositions of undergraduate students enrolled in a BS in Health Science (BSHS) degree program, and how these skills in particular compared to the national student population. During the Spring 2022 academic semester, 140 senior-level BSHS students were given the opportunity to complete the California Critical Thinking Skills Test and the California Critical Thinking Disposition Inventory. Results show less development in critical thinking skills when compared to the national student population, with Numeracy skills being the most poorly developed, and Truth-Seeking being the most inconsistent disposition possessed by the student participants. The implications of these findings, particularly for faculty who teach students planning to enter the healthcare field, are offered, including recommendations for curricular design and modification.


1 Introduction

Critical thinking skills and dispositions allow students to gather, interpret, and reflect upon how new information and data can be applied to address personal and professional needs and situations [ 1 ]. While there is no one singular definition, critical thinking is often described as an active, attentive, and purposeful method by which one analyzes facts and information to form a judgment or accomplish a specific goal [ 2 ]. This is an important set of skills and attitudes for students in the health sciences to possess, as critical thinking allows one to be comfortable with the possibilities of new perspectives and ideas, which is crucial for healthcare practice. Additionally, critical thinking is necessary for the development of current and future clinical reasoning skills [ 3 ]. This is partly due to the need for students to learn to be appropriately skeptical when reviewing treatment techniques, best practice guidelines, and new research that may impact their means of practice and care delivery [ 4 ]. To be able to work effectively and rationally in the healthcare and medical fields, critical thinking skills and dispositions must be properly developed and supported in educational settings [ 5 ].

The Carnegie Foundation for Teaching and Learning [ 6 ] has proposed four major categories of recommendations for the reform of medical and health science education: teaching and learning that promote integration; promoting habits of inquiry and improvement; standardizing learning outcomes while individualizing the learning process; and supporting the progressive development of professional identity. These recommendations parallel the subsequent evolution of education and teaching theories over the past century [ 7 ], namely the dynamic nature of the learning and teaching processes and the importance of the teaching environment. Both undergraduate and graduate-level health science educational programs are recognizing that these reforms are needed to meet the current and future demands placed upon healthcare professionals, and that the environment in which learning takes place is as important as the content being shared. Much of the emphasis behind these proposed reforms centers on the need for future healthcare professionals not only to know the didactic and intellectual aspects of their work, but also to be able to solve complex problems and to think critically about their work and their identities as healthcare workers.

As such, critical thinking is a fundamental aspect of quality clinical decision-making across a variety of healthcare professions. The ability to think rationally and clearly, especially when encountering problems and uncertainty at work, is necessary to be effective in the kinds of environments and situations that are common in the healthcare and medical fields [ 5 ]. Undergraduate health-focused students who have critical thinking education embedded in their curriculum have shown improvements in their problem-solving skills [ 8 ], which may be particularly important for promoting patient safety. Health education programs that teach critical thinking have been found to help reduce diagnostic errors, improve overall patient safety, and reduce those cognitive biases that can lead to poorer patient outcomes and professional practice [ 9 ]. This need for critical thinking is present not just in professional practice but during pre-professional educational experiences as well, where the capacity for problem-solving and wider reasoning is necessary to perform well academically [ 10 ]. This is especially important considering the significant pressures placed upon students at the undergraduate level to perform well academically to secure spots in clinical and graduate programs after completing their baccalaureate degrees.

The consequences of not possessing critical thinking skills in healthcare and medicine can be significant. Healthcare professionals who do not possess a capacity for critical thinking and problem-solving skills have a measurable impact on the health of their patients and communities, specifically poor rates of compliance with health recommendations and treatments, as well as direct harm to the health and wellbeing of those being served [ 11 ]. Given the importance of having a healthcare workforce that can practice critical thinking as part of their professional work, it is necessary to better understand how critical thinking skills and attitudes can be instilled within healthcare professionals, both during their pre-professional education and throughout their professional careers.

By understanding the current level of critical thinking skills and attitudes of health science students before they enter their professional fields of practice, it becomes possible to identify areas of strength and weakness and to make changes as needed within health science education programs to better prepare students for a professional field that demands strong critical thinking skills, attitudes, and applications. In addition, by understanding how skills, attitudes, and overall academic performance relate to one another, health science education programs can be more purposeful in how they advise students, develop curricula, and track student progress throughout their academic journey.

This study sought to answer the following research questions:

RQ1: What are the critical thinking skills of undergraduate health science students at a four-year, public, comprehensive state university?

RQ2: How do the overall critical thinking skills of undergraduate health science students at a four-year, public, comprehensive state university compare to the national population of undergraduate students?

RQ3: What are the dispositions towards the importance of critical thinking of undergraduate health science students at a four-year, public, comprehensive state university?

RQ4: To what degree is overall academic performance as measured by grade point average (GPA) a reasonable indicator of critical thinking development?

2 Study methods

The theoretical framework for this study was heavily influenced by the work of Dr. Peter Facione, whose seminal work in critical thinking assessment is utilized by educators, employers, and policymakers who recognize the need for students and alumni of institutions of higher education to be able to demonstrate these skills and dispositions [ 12 ]. An additional component of Facione's assessment work is the recognition that critical thinking requires not only skillsets but also the dispositions and attitudes, what Facione and colleagues call the critical spirit, that provide the internal drive and motivation to apply critical thinking skills in the personal, professional, and social spheres of life [ 13 ].

The framework developed by Facione has been studied and utilized by other researchers, particularly for assessing the critical thinking skills and dispositions of healthcare professionals and health science students. Nair et al. [ 14 ] used this framework in the development of their Critical Thinking Self-Assessment Scale, which was built specifically to be used by nurses as part of their own critical thinking self-assessment. Facione's critical thinking assessment work has also been used to evaluate the effectiveness of educational interventions by comparing critical thinking pre- and post-exposure to treatment. A 2020 study by Wu et al. [ 15 ] used the disposition assessment tool developed from Facione's work to evaluate the effectiveness of mind mapping exercises in increasing students' inclination toward critical thinking. Additionally, the assessment tools built from this framework have been used to evaluate potential predictors of the critical thinking abilities of undergraduate students, such as taking courses online or transferring courses from another college or university [ 16 ].

2.1 Population and sample selection

The participants for this study were BS in Health Science (BSHS) students enrolled at a four-year, public, comprehensive state university located on the east coast of the United States. All participants were 18 years of age or older, were enrolled in one of five sections of the senior-level BSHS research course that was offered in the Spring 2022 academic semester, and agreed to participate in this study. Two additional sections of the course were offered that did not participate in the data collection efforts of this study. Participants in this study were those students in attendance during the class period that was utilized to administer the critical thinking assessment tools.

Students had declared one of the following concentrations within the BSHS degree: General Concentration, Pre-Occupational Therapy, Pre-Physical Therapy, or Pre-Communication Disorders. Participants were given the opportunity to complete each assessment tool in a voluntary capacity and were not required to complete either or both assessments under any conditions. As the study participants were recruited from the senior-level research course of the BS in Health Science degree, all participants were nearing the culmination of their undergraduate careers at the time of data collection. No exclusionary criteria were used in selecting study participants beyond their enrollment in the BS in Health Science degree and in the senior-level research course. All data collection took place after obtaining all necessary approvals from the Stockton University IRB Committee, including CITI training by the researcher. IRB approval was obtained after submitting all required documentation, proof of CITI training, study procedures, and informed consent documents (Stockton University IRB Approval Number #2021.175). A total of 194 students were enrolled in the senior-level research course during the Spring 2022 semester. Five sections of the course agreed to participate in data collection, together representing 140 enrolled students, or 72% of the total student population enrolled in the senior-level research course that semester.

2.2 Instrumentation

The researcher used the California Critical Thinking Skills Test (CCTST) and the California Critical Thinking Disposition Inventory (CCTDI) for data collection purposes, and administered each assessment to students enrolled in the senior-level research course for the BS in Health Science degree. These tests are owned and administered by Insight Assessment and were developed in part from the work in Critical Thinking Assessment (CTA) theory as described by Facione and colleagues [ 12 , 13 ]. The CCTST is a 34-item, multiple-choice, non-discipline-specific test that evaluates critical thinking along 8 different subscales: Analysis, Interpretation, Inference, Evaluation, Explanation, Inductive Reasoning, Deductive Reasoning, and Numeracy. It is estimated that the CCTST takes an average of 45 min to complete. Each multiple-choice question item is related to generic situations not unique to any particular domain of work. Scores are computed using a proprietary formula and are classified as not manifested, weak, moderate, strong, or superior development [ 17 ].

The CCTDI consists of 75 generic, non-discipline-specific statements rated on a 6-point Likert-type response scale. The CCTDI evaluates disposition towards critical thinking along 7 different subscales: Truth Seeking, Analyticity, Open-Mindedness, Systematicity, Confidence in Reasoning, Inquisitiveness, and Maturity of Judgement. It is estimated that the CCTDI takes an average of 15–20 min to complete. Scores can range from 5 to 60 for each subscale and indicate a level of disposition ranging from weak disposition development through positive disposition development to strong disposition development [ 18 ]. Each tool is delivered via an online web-based portal owned and operated by Insight Assessment. Both tools were selected due to their previously established validity and reliability in assessing the critical thinking skills and attitudes of study participants [ 13 ]. The CCTST has documented strength in both the content validity of each of the skill domains and construct validity. Validity has been demonstrated by correlational studies exploring critical thinking skills alongside additional measurements such as GPA and GRE scores, as well as by criterion (predictive) validity studies [ 13 , 19 , 20 ]. The CCTST has also shown strong internal reliability, with documented Cronbach's alpha coefficients ranging from 0.60 to 0.78 on individual scales and 0.90 or above for the overall measure [ 17 ]. See Appendix A for the breakdown of the score ranges that pertain to each level of development for the CCTST and the CCTDI.

The CCTDI has also been researched and assessed for its validity and reliability, with the inventory items being found to be valid with an internal consistency reliability score of 0.887 [ 21 ]. This has also been shown with a cross-cultural application of the CCTDI, with high content validity across cultural versions of the inventory with alpha coefficients ranging from 0.81 to 0.97 [ 22 ]. A recent meta-analysis by Orhan explored the reliability of the CCTDI using 98 alpha values across 87 unique studies of the CCTDI. Orhan found the CCTDI to be reliable across samples with an alpha value of 0.83 [ 23 ]. These studies have shown strong consistent validity and reliability for the CCTDI as an instrument for the assessment of the critical thinking dispositions of students.
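The reliability figures reported above are Cronbach's alpha coefficients, which summarize how strongly a scale's items covary. A minimal sketch of the computation on simulated Likert responses (not CCTDI data, which are proprietary):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 6-point Likert responses: 100 respondents x 10 items,
# driven by a shared latent trait so the items covary.
rng = np.random.default_rng(2)
latent = rng.normal(0, 1, (100, 1))
items = np.clip(np.round(3.5 + latent + rng.normal(0, 0.8, (100, 10))), 1, 6)

print(f"alpha = {cronbach_alpha(items):.2f}")    # higher when items covary
```

Values of roughly 0.8 or above, like those reported for the CCTDI, indicate that the items covary strongly enough to be treated as a single scale.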

To assure ongoing validity and reliability for both the CCTST and the CCTDI in this study, both tools were delivered exactly as instructed by Insight Assessment. No variations were made to either instrument, no questions or sections were added, omitted, or changed, and study participants met all requirements for participation as described by the Insight Assessment user’s manual [ 17 , 18 ].

2.3 Data collection and management

All data collection took place during the first 2 weeks of March 2022. Study participants and faculty were informed that participation in the data collection phase of this study was purely voluntary and that there would be no penalty for not participating in the study. All participant information has been kept confidential, and participants were provided with an informed consent form prior to the data collection beginning. Participants were also informed that the information collected for this study would not be shared with members of the public in any identifiable way and that all study findings would be presented as aggregated data. All data collection took place during the traditional meeting time of each confirmed section of the senior-level research course, with two sections meeting via Zoom conference, and three sections meeting face-to-face in a university computer lab.

Distribution and completion of each of the assessments took place via the online portal offered through Insight Assessment. Each class meeting allowed for adequate time for both assessments to be administered in a single class meeting. Data collection took place as an in-class activity for that day’s class meeting, and there was no course penalty for not taking part in the data collection. Should a student have declined to participate in the in-class activity during the day of data collection, they would have been provided with an assigned reading on critical thinking in the healthcare field that would be utilized for in-class discussions after the data collection activity had concluded. No student declined to participate in the study. In an effort to reduce student anxiety, students were assured that all results were purely for the purpose of assessment and that class rankings or comparisons would not be shared. Additionally, there was no additional course credit given for participation, nor were there any extra credit or similar potentially coercive incentives provided for data collection participation. To ensure each student participant had the opportunity to access the online platform, all in-person meetings took place in a university computer lab. For the 2 sections that met with the researcher via Zoom conferencing, all students had access to a laptop or computer with internet access. For these sections, the faculty member teaching the course was present in the Zoom room. It should be noted that since these meetings took place over Zoom, the continuity of the environment in which students were completing the assessment could not be guaranteed when compared to those students completing the assessment in a university computer lab.

2.4 Data analysis

All data were collected during the first 2 weeks of March 2022, before the beginning of the Spring break period of the term. After data collection was completed, reports were generated by Insight Assessment providing results for overall critical thinking skills, critical thinking skills on each subscale, overall critical thinking dispositions, and critical thinking dispositions on each subscale. The overall critical thinking skill score population means for both the national undergraduate student population and the national health science undergraduate student population were obtained from Insight Assessment to allow comparison between these two national populations and the study sample.

Descriptive statistics were generated for overall and subscale scores on the CCTST and the CCTDI, and frequency statistics were generated for ethnicity, gender, and declared concentration within the BSHS degree. Independent-samples t-tests by gender were conducted for both the CCTST and the CCTDI. Overall student scores on the CCTST were compared to the national population of undergraduate students via a one-sample t-test. For the demographic variable of degree concentration, which includes the General, Pre-Occupational Therapy, Pre-Physical Therapy, and Pre-Communication Disorders concentrations, the researcher conducted a one-way analysis of variance. All statistical analysis was conducted using IBM SPSS Version 25.
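For readers without access to SPSS, the same battery of tests can be run with open-source tools. The sketch below uses simulated scores standing in for the study data; the group sizes mirror the study's reported sample, but the values are hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical overall CCTST scores by declared concentration (n = 125 total).
general = rng.normal(70, 7, 40)
pre_ot = rng.normal(70, 7, 30)
pre_pt = rng.normal(70, 7, 35)
pre_cd = rng.normal(70, 7, 20)

# Independent-samples t-test (e.g., comparing two gender groups).
t, p = stats.ttest_ind(rng.normal(70, 7, 99), rng.normal(70, 7, 26))
print(f"independent t-test: t = {t:.2f}, p = {p:.3f}")

# One-way ANOVA across the four concentrations.
f, p = stats.f_oneway(general, pre_ot, pre_pt, pre_cd)
print(f"one-way ANOVA: F = {f:.2f}, p = {p:.3f}")
```

With simulated groups drawn from the same distribution, as here, a non-significant p-value is expected, mirroring the null result the study reports across concentrations.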

3.1 Research question 1

RQ1 sought to understand the critical thinking skills of undergraduate health science students at a four-year, public, comprehensive state university by utilizing the CCTST offered through Insight Assessment. Of the 140 total students who were invited to participate in this study, 130 completed the CCTST; the remaining 10 students did not attend class on the day of data collection. Using the criteria set forward by Insight Assessment, the results from 5 participants were removed from the final report of the data because they completed the CCTST in under 15 min, which is not considered an adequate amount of time to thoughtfully respond to each question. As a result, a total of 125 students completed the CCTST in full, representing approximately 89% of the potential sample. The mean age was 22 years, with 79% indicating a female gender identity. See Table 1 for ethnicity findings and Table 2 for the frequency of each concentration identified within the BS in Health Science degree.
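The completion-time screening rule described above can be expressed as a simple filter. The sketch below is purely illustrative: the column names and values are invented, since the actual screening was applied through the Insight Assessment platform's own reports.

```python
# Hypothetical sketch of the completion-time screening rule:
# drop any CCTST record finished in under 15 minutes.
import pandas as pd

records = pd.DataFrame({
    "participant_id": [1, 2, 3, 4],
    "completion_minutes": [32.0, 14.5, 21.0, 9.8],  # invented values
    "cctst_overall": [74, 71, 68, 65],
})

MIN_MINUTES = 15  # threshold named in the study's screening criteria
valid = records[records["completion_minutes"] >= MIN_MINUTES]
excluded = len(records) - len(valid)
print(f"retained {len(valid)} records; excluded {excluded} for speeding")
```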

The CCTST is designed to measure the critical thinking and numeracy skills used in the process of reflective reasoning to make an informed judgment about what to do, or what to believe, in a particular situation or setting. The CCTST provides an overall critical thinking score, as well as scores across 8 sub-domains: Analysis, Inference, Evaluation, Induction, Deduction, Interpretation, Explanation, and Numeracy. A brief description of each domain is provided in Appendix B.

The CCTST scores are calculated by Insight Assessment via a proprietary formula for both the overall score and the score of each sub-domain. Study sample scores for overall critical thinking ability, as well as for each sub-domain, are shown via descriptive statistics in Table 3. One-way analysis of variance showed no statistically significant difference (p = 0.708) in the Overall Critical Thinking Skills scores of participants among the different degree concentration options (Pre-Occupational Therapy, Pre-Physical Therapy, Pre-Communication Disorders, and General) within the BS in Health Science program (see Tables 4 and 5).

3.2 Research question 2

RQ2 sought to understand how the overall critical thinking skills of undergraduate health science students at a four-year, public, comprehensive state university compared to the national population of undergraduate students. Aggregate data provided by Insight Assessment show that the population mean score for overall critical thinking skills of four-year college/university undergraduate students is currently 75.3, compared to the overall critical thinking skills score of 69.96 for the study's sample. The results of the one-sample t-test showed that the overall critical thinking skills score of the study sample is significantly lower than that of the national four-year college/university undergraduate student population (see Table 6).
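For context, the one-sample t statistic behind this comparison has the standard form below, shown with the values reported above (sample mean 69.96, national mean 75.3, n = 125); the sample standard deviation s is not reported in this excerpt, so the expression is left symbolic rather than evaluated.

$$ t \;=\; \frac{\bar{x} - \mu_0}{s/\sqrt{n}} \;=\; \frac{69.96 - 75.3}{s/\sqrt{125}} $$

Because the numerator is negative, a significant result here indicates that the sample mean lies below, not merely apart from, the national mean.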

As Insight Assessment does not collect aggregate data for the sub-domain measurements, comparison between the national four-year college/university undergraduate student population and the study sample for each sub-domain was not possible.

3.3 Research question 3

RQ3 sought to understand the dispositions towards the importance of critical thinking of undergraduate health science students at a four-year, public, comprehensive state university. Of the 140 total students who were invited to participate in this study, 130 completed the CCTDI, as 10 students did not attend class on the day of data collection; this represents approximately 93% of the potential sample. The mean age was 22 years, with 80% indicating a female gender identity. See Table 7 for ethnicity findings and Table 8 for the frequency of each concentration identified within the BS in Health Science degree. It is important to note that the sample size for RQ3 (n = 130) is larger than for RQ1 and RQ2 (n = 125), as all students who completed the CCTDI did so at or above the minimum amount of time deemed necessary to ensure the validity and accuracy of the results.

The CCTDI is designed to assess the critical thinking mindset and attitudes of individuals toward critical thinking. The CCTDI provides scores across 7 subdomains: Truth-Seeking, Open-Mindedness, Inquisitiveness, Analyticity, Systematicity, Confidence in Reasoning, and Maturity of Judgment. A brief description of each domain is provided in Appendix C.

Study sample scores for overall critical thinking dispositions, as well as for each sub-domain, are shown via descriptive statistics in Table 9, with the sub-domain of Inquisitiveness showing the highest mean score of 46.5 and the sub-domain of Truth-Seeking showing the lowest mean score of 35.4. It is important to note that no national population means are available for comparative purposes, as Insight Assessment does not collect this kind of national mean data for the CCTDI. The reason for this is that there is no correct or incorrect answer to each of the 6-point Likert questions asked in the CCTDI, and thus no ideal mean score against which study results could be measured.

3.4 Research question 4

RQ4 sought to understand to what degree overall academic performance, as measured by grade point average (GPA), is a reasonable indicator of critical thinking development. To correct for multiple comparisons, a Bonferroni correction was applied: an adjusted significance level was computed by dividing the 0.05 alpha level by the number of correlations for each instrument. For the CCTST, the adjusted significance level (0.05/9) was 0.005; for the CCTDI, the adjusted significance level (0.05/8) was 0.006. Tables 10 and 11 show the correlation matrices between critical thinking skills and GPA and between critical thinking dispositions and GPA, respectively, indicating which relationships were statistically significant. In both tables, relationships that are significant at the 0.05 level are marked with a single asterisk (*), and those that are significant at the adjusted levels are marked with a double asterisk (**). Pearson correlation shows a statistically significant positive correlation between GPA and overall critical thinking skills (0.235, p = 0.008), as well as across all critical thinking subscales (Table 10), with the subscale of numeracy showing the highest correlation with GPA (0.300, p = 0.001). Pearson correlation shows a statistically significant positive correlation between GPA and the critical thinking disposition subscale of systematicity only (0.175, p = 0.047), with no other subscale showing a statistically significant correlation (Table 11).
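The sketch below illustrates the Bonferroni adjustment and a single Pearson test of the kind reported above; the GPA and score arrays are randomly generated placeholders, not study data. Note that 0.05/9 is approximately 0.0056 and 0.05/8 is 0.00625, which the paper reports rounded as 0.005 and 0.006.

```python
# Sketch of the Bonferroni adjustment and a Pearson correlation test.
# Hypothetical data; the study's analysis was run in IBM SPSS v25.
import numpy as np
from scipy import stats

ALPHA = 0.05
adj_alpha_cctst = ALPHA / 9  # 9 GPA correlations -> ~0.0056
adj_alpha_cctdi = ALPHA / 8  # 8 GPA correlations -> 0.00625

rng = np.random.default_rng(seed=1)
gpa = rng.uniform(2.0, 4.0, size=125)
# Invented numeracy scores weakly tied to GPA, for illustration only.
numeracy = 60 + 5 * gpa + rng.normal(0, 6, size=125)

r, p = stats.pearsonr(gpa, numeracy)
flag = "**" if p < adj_alpha_cctst else ("*" if p < ALPHA else "")
print(f"GPA x numeracy: r = {r:.3f}, p = {p:.4f} {flag}")
```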

4 Discussion

The aim of this study was to understand the critical thinking skills and dispositions of undergraduate students enrolled in a BS in Health Science degree program at a four-year university. The findings are in agreement with the published research on critical thinking skill development in undergraduate students as a whole: some estimates suggest that 45% of undergraduate students do not show meaningful improvement in their critical thinking skills upon graduation, and even this figure may be an underestimate [24]. As this study was not longitudinal in nature, it is not known to what degree critical thinking skills or dispositions did or did not improve over the course of a student's higher education experience. Rather, this study provides a snapshot of the skills and dispositions found at the culmination of their program of study. Therefore, the findings do not necessarily suggest a failure of this particular Health Science undergraduate program to develop critical thinking skills and dispositions. Instead, they provide insight into the degree to which critical thinking skills and attitudes have been developed upon the conclusion of academic study, with opportunities to evaluate ways to further enhance critical thinking skill and disposition development by understanding the current baseline.

Earlier research conducted by Keeley et al. [25] points to a common resistance among students to engaging in critical thinking, which the authors suggest may stem from a generalized resistance to engaging in forms of learning and studying that differ from those they have previously used, in an effort to "avoid change, work, and pain." The authors also suggest that students who do not regularly engage in self-reflection (i.e., asking why am I resistant to engaging in critical thinking?) are less likely to be aware of their hesitation in the first place.

Another potential reason for this deficiency in critical thinking skill development may be pedagogical in nature. Higher education pedagogy is often content-based and seeks to imbue learners with deep knowledge about a series of subjects, whereas a more critical thinking-oriented pedagogy is rooted in teaching learners how to think complexly and across a number of different areas [26]. As a result of a heavily content-based pedagogy, undergraduate students may not be receiving the kind of complex and problem-based learning environment needed to develop a more robust critical thinking skillset. Research by Mathews & Lowe also suggests both pedagogical and environmental reasons why students may be resistant to engaging in more critical thinking and critically reflective mindsets [27]. In particular, these researchers highlight the need to develop the critical thinking disposition (the critical spirit described by Facione) in order for students to overcome resistance to both developing and utilizing critical thinking skills in their educational and professional endeavors. Without a strong disposition toward critical thinking, more overt resistance to its utilization may remain.

Participants in this study showed the strongest development in Inference and Induction skills. Inference, the ability to draw logical conclusions based on presented data, is an important subdomain of critical thinking skills. Healthcare practice and research both require inferential reasoning to appropriately draw conclusions and make recommendations in situations and environments that are not always pristine or ideal [28]. This allows for the greater development of a "what if?" mindset that can be of significant importance in health-related environments. The similar level of development found within induction is interesting to note, as induction can be seen as a sub-category of inferential reasoning. Clinical reasoning requires the development and application of inductive reasoning in order to make larger generalizations and conclusions based on the individual clinical scenarios or patterns being witnessed and observed [29]. While development in the areas of Inference and Induction was only at a moderate level, as opposed to the strong or superior development described by the CCTST, these areas remain critical for future work in the healthcare field.

More concerning are the areas of the critical thinking skillset that were weakly developed in this study. Weak development in the area of evaluation is worthy of special attention, as the healthcare field is riddled with dubious claims, misinformation campaigns, and conspiracy theories. Recent research by Lantian, Bagneux, Delouvée, & Gauvrit provides insight into the link between evaluative and critical thinking abilities and subscribing to conspiratorial beliefs and theories [30]. Courses that emphasize evaluation skills have been shown to reduce adherence to pseudoscientific beliefs while also building a more skeptical frame of mind when encountering new information or claims [31].

The skillset with the lowest level of development was numeracy, with weak to no development in this area shown on the CCTST. This is, in some ways, not surprising, as adults in the United States have been found to perform well below average in numeracy skills when compared to adults in other developed nations, according to the Organization for Economic Cooperation and Development [32]. In this study, however, it must be noted that not only was numeracy the most poorly developed critical thinking skillset, but mean numeracy scores also fell within the weak-to-not-developed range. This is a finding of great importance, as numeracy is a required skill within the field of healthcare. Regarding critical thinking as a whole, the study sample's mean score of 69.96 for overall critical thinking skills was statistically significantly lower than the national four-year college/university undergraduate mean of 75.3. Using the criteria provided in the CCTST, the study sample mean shows weak to moderate development, whereas the national student population shows moderate development at the higher end of that range. Thus, study participants not only show lower development in their critical thinking skills than the national population, but fall a full development category lower.

Results indicate that the majority of dispositions assessed by the CCTDI showed positive development among the study participants. Open-Mindedness, Inquisitiveness, Analyticity, and Confidence in Reasoning were all found to fall within the positive range of personal development. Higher scores on the CCTDI have been found to be associated with greater problem-solving skills, showing that these affective qualities are important in the overall critical thinking attributes of students [33]. Open-mindedness and inquisitiveness are especially important dispositions to possess, as they are paramount to supporting students' desire to learn and to enhance personal knowledge, which has further been associated with better student performance in higher education [34].

Particularly among students pursuing health-related careers, open-mindedness has again been found to be associated with academic success and graded work in courses [35]. While the domains of analyticity and confidence in reasoning are associated with problem-solving overall, they are not as predictive of student success and readiness as open-mindedness and inquisitiveness [36], although higher dispositions overall are an important aspect of building problem-based learning skills.

Perhaps most concerning among the findings for this research question is that Truth-Seeking showed the lowest disposition development, with results showing inconsistent to ambivalent demonstration. Truth-seeking is a necessary disposition for seeking out the best possible evidence and information to understand a situation or issue. As such, truth-seeking behavior has been described as the main dispositional predictor of an individual possessing robust overall critical thinking behavior [37]. In particular, truth-seeking allows one to question one's previously held beliefs or ideas about a topic, which is critical in the healthcare field, where new information and science are always coming forward. This new information may displace or change previously held theories or practices, and a truth-seeking disposition is required to critically evaluate and accept new information that is found to be factually based.

Part of the reason why dispositions and attitudes towards Truth-Seeking may be so hard to foster is the subjective and often abstract nature of what constitutes truth, further compounded by the copious amounts of information that students must process when attempting to determine factualness. As described by Arth et al. [38], "…information is available to people in unrecordable amounts and insurmountable ways." The sheer amount of information confronting students is only increasing, and without proper information literacy preparation, particularly digital information literacy, students may be both unprepared and unmotivated to seek out the information that would point towards the truth. This point is reinforced by Gibbs [39], who emphasizes the additional consideration of trust in self. Without a level of trust in one's own ability to seek out true information and simultaneously recognize false or misleading information, students may not possess the confidence necessary to develop a stronger attitude towards truth-seeking as a behavior.

Regarding the correlations between GPA and critical thinking skills and dispositions, academic performance and GPA have previously been shown to be associated with greater critical thinking skill development [40]. And while GPA is not the only indicator of skill development, overall academic performance and success may be one way of measuring the potential for critical thinking skillset enhancement. The finding that numeracy was the subscale most positively correlated with GPA is an important one, given the overall poor development of numeracy skills in this study sample. However, numeracy may be influenced by more than overall GPA and academic development. Within the research seeking to understand why mathematics and numeracy skills are often poorly developed in American students, negative stereotypes, stigma, and a poor sense of self have been identified as significant influences. Negative stereotyping can operate on two fronts: instructor biases about what kind of student tends to be better at mathematics, and students' self-beliefs about whether they are the kind of student who is good at math [41]. The impact of stereotype threat on mathematics and numeracy achievement has been identified as a potential key factor in the overall lack of mathematics development across student groups and demographics [42], with female-identifying students in particular being highly vulnerable to these stereotyping images and messages [43]. Considering the high percentage of female-identifying students within this study, the potential impact of stereotyping and stereotype threat, particularly its role in mathematics and numeracy skill development and utilization, cannot be ignored.

With critical thinking dispositions, the fact that systematicity was the only subscale found to be associated with GPA is in some ways not surprising, as systematicity is the tendency to approach problems in an ordered, disciplined, and systematic way. Those with higher GPAs may naturally be inclined to a more systematic way of approaching their work and studies, which may explain this correlation. However, no other disposition subscale was found to be correlated with GPA, which suggests that GPA and academic grade achievement may not be indicators of disposition and attitude towards critical thinking. This highlights the limitation of using GPA as a barometer for critical thinking development, as it cannot fully capture or predict how a student will conceptualize and utilize critical thinking in their personal or professional lives.

This finding does, however, highlight the phenomenon that students may possess critical thinking skills but not the disposition necessary to put those skills to use, which may in part be influenced by the dispositions of the educators teaching them. A recent study by Shin et al. [44] explored the role of a critical reflection competency program for nurse educators in improving the educators' dispositions. Participation in a 4-week critical reflection competency program was found to improve the critical thinking dispositions and teaching efficacy of nurse educators, which in turn allows greater opportunity for nursing educators to imbue these dispositions and attitudes within their students. How educators project their own attitudes towards the importance of critical thinking may significantly affect not only how they design curricula and teaching methods, but also how they create a general environment that fosters a curious mind and a stronger disposition towards employing critical thinking skills in work.

An additional influential factor in the development of critical thinking dispositions may be the opportunity for a student to explore and utilize creativity in the classroom. Qiang et al. [45] found that a student's critical thinking disposition was positively related to their self-concepts of creativity, and scientific creativity in particular. This was further emphasized by Khoshgoftar and Barkhordari-Sharifabad [46], who found a direct relationship between critical thinking dispositions and reflective and creative capacities. The significance of these findings is twofold: first, classroom learning opportunities that emphasize creativity and reflection may help to further bolster critical thinking dispositions within students; second, a student's capacity to be reflective and creative may not always be properly captured in GPA scoring. Educators, particularly those working with students in the health sciences, may benefit not only from improving their own dispositions towards critical thinking, but also from finding opportunities to properly assign, assess, and capture reflection and creative capacity in their students to further enhance disposition development.

4.1 Implications for practice

The findings of this study are of great importance, as future healthcare professionals need the critical thinking skills and dispositions necessary to perform their work accurately and safely, especially in a work environment of ever-increasing complexity. As this study was conducted with pre-professional health science students, its findings apply to health pre-professional education specifically in terms of developing these skills and dispositions before clinical education and/or encounters with patients or community members begin. This speaks to the general development of cognitive skills and attitudes, as opposed to clinical skills and attitudes, which would be developed during post-baccalaureate education and training.

An important area to note is the correlational relationships that exist among the different subscales of both the CCTST and the CCTDI, particularly those correlations that showed the strongest relationships. Overall critical thinking skill was most strongly correlated with analysis, inference, induction, and deduction skills, which provides insight into where to focus potential curricular and pedagogical changes that may increase overall critical thinking skills within students. Course assignments, projects, educational lessons, and readings that require students to utilize analytical, inferential, and both inductive and deductive skills may be of particular benefit in shaping an overall improvement and strengthening of critical thinking. Numeracy skills, which were the most poorly developed, were most positively correlated with explanatory skills. This is an important finding, as strengthening explanatory skills, which refer to a student's ability to defend and justify a belief or a response to a question, may have the simultaneous benefit of supporting a student's development in numeracy.

Regarding critical thinking dispositions, while truth-seeking was the most poorly developed attitude, it also showed the strongest correlation with overall critical thinking dispositions. Therefore, in an effort to improve truth-seeking dispositions within students, exposing students to opportunities that will overall strengthen and support their dispositions towards critical thinking may have the added benefit of supporting their desires to seek out the truth. Maturity of judgment also showed a higher correlation with truth-seeking, which again provides helpful insight. As maturity of judgment allows a student to understand and accept that multiple solutions or options may be possible when approaching a question or issue, and that complexity is an inherent aspect of many problems and issues, fostering this disposition within students may again help to support their development within truth-seeking.

To put these findings into practical use, the first and most immediate practice-based recommendation is to evaluate programmatic curricula and teaching approaches that have been shown to promote critical thinking skill development in higher education settings. Mahmoud & Mohamed [47] provide several evidence-based recommendations for the enhancement of critical thinking skills and abilities. While a few of these recommendations are described below, readers, particularly educators who work with health-oriented students, are encouraged to read the paper by Mahmoud & Mohamed in its entirety to fully appreciate the breadth of curricular and teaching approaches recommended.

Problem-Based Learning: A major component of pre-health profession education should be problem-based learning, a student-centered approach to the learning process that focuses on solving open-ended problems through collaborative engagement with other learners in a group setting.

Programmatic Orientation: Students often do not fully understand the philosophy and core concepts of the programs they select to study. Just as students are oriented to their college or university after admittance, so too should they be fully oriented to the program of study they choose as their major.

Clinical Scenarios: Context-dependent activities ask the learner to bring their life experiences, prior learning, and personal skills into the classroom. In this way, recall and application of knowledge have been shown to be enhanced, allowing students to encode information in such a way that it can be easily retrieved when they are in a specific scenario.

An additional recommendation is to encourage faculty members of pre-health educational programs to adjust their curriculum and teaching styles, such as by utilizing a flipped classroom model, to promote critical thinking dispositions. This may be particularly helpful in developing the disposition of truth-seeking, which was found to be poorly developed not only in this study but also in other studies that have sought to understand the dispositions of students in the healthcare field [48]. However, as previously shared, resistance to new methods of teaching can influence how effective a flipped classroom approach is in fostering critical thinking skills and dispositions. Oudbier, Spaai, Timmermans, & Boerboom highlight how student self-regulation, the motivation of the faculty member, and variation in assessment approaches can all play a significant role in whether a flipped classroom approach will be effective [48]. To increase the positive outcomes of such an approach, Arth et al. [38] provide valuable insights and recommendations made by professors on how to encourage critical thinking and truth-seeking dispositions within undergraduate students. Selected examples of their recommendations, particularly those linked with curricular design and teaching strategies, follow.

Research Information Skills: The ability to properly seek out and evaluate information should be incorporated throughout the curriculum in a variety of classes, rather than being localized in a research-specific course. Specifically, students need to learn the difference between researching information via the scientific method and simply looking up information.

Belief Bias & Skepticism: An important aspect of developing a critically oriented mindset is to understand one's own biases, and how these personal biases can influence the way in which information is sought out and interpreted. In this way, confirmation bias can be avoided, and a healthy level of skepticism can be maintained.

Discernment of Good vs. Bad Information: Avoiding belief bias and maintaining a skeptical mindset also links to the desire to find reliable information and to be able to discern good-quality from poor-quality information. Given the proliferation of questionable claims found through online sources, educators need to teach the skills necessary to determine the reliability of information obtained during the research process.

The Constant Pursuit of Truth: Although it may initially seem counterintuitive, one of the most important ways to encourage a truth-seeking disposition in undergraduate students is to design a curriculum that reinforces the idea that nothing can ever be known with complete certainty, particularly in the health sciences. This is due not only to the abundance of information of questionable validity and reliability, but also to the fact that information is constantly changing as new research is conducted and new evidence is gathered. As previously shared in the research by Arth et al. [38], students need to be encouraged to see the pursuit of truth as an ever-evolving behavior, given the plethora of new information being shared, particularly via digital platforms. This requires students to be comfortable with a lack of finality when it comes to the pursuit of truthful and factual information. Students exposed to educational environments that encourage comfort with the ever-present need to seek out truth, through purposefully designed learning experiences, modeling techniques, and reflection time with their faculty, have been shown to improve in both their critical thinking skills and their attitudes towards seeking out truth [49, 50].

It should be noted that this study serves as an internal review and assessment of a single academic program within the field of health science. While this may reduce generalizability to other educational programs, what this study does demonstrate is the need for higher education programs to engage in precisely this kind of assessment and evaluation of their students' critical thinking skills and attitudes. Without an internal assessment and audit of student critical thinking skills and attitudes, educators and curriculum developers will not have the information needed to determine whether their curricular program, and the pedagogical methods employed by faculty, are leading to robust development of critical thinking skills and attitudes. Such methods are not currently in place at meaningful levels within the program of study in which these students were enrolled, and it is the hope of this researcher that they will be expanded in an effort to strengthen critical thinking skills and dispositions over time.

4.2 Study strengths and weaknesses

As with any scholarly research, there are limitations to the research design and data collection methods that influence the results of the study. First, this study utilized a convenience sample. This researcher is a faculty member in this BS in Health Science program and therefore focused data collection solely on students to whom he had easy and convenient access. Since all students who responded were part of this single program, it is difficult to fully generalize the results of the CCTST and the CCTDI to the undergraduate health science population as a whole. While this allows for a more specific analysis of this particular cohort of students, it limits how the study findings can be extended to other institutions of higher education.

A second limitation of this study is the inability to compare the critical thinking skills and dispositions of BS in Health Science students to those of other undergraduate students at the same university enrolled in other programs of study. While a comparison to national population means was possible for overall critical thinking scores, it would have been illuminating to compare mean scores across each subdomain of critical thinking skill as well. Since these data are not collected or stored by Insight Assessment, drawing data from other students at the same university would have been the only way to make this kind of comparison; however, given the logistical and financial constraints that existed, it was not possible to collect data from an adequate number of non-health science students, leaving this subdomain comparison absent from this study. It should also be noted that two sections of students completed the assessments over Zoom rather than in an in-person computer lab setting. While there is no evidence to suggest a significant difference in student performance or adherence to assessment guidelines between those completing the assessments online and those in person, the difference in setting may nonetheless have influenced student outcomes.

An additional consideration is the comparison of the study sample to the national population of students providing Overall Critical Thinking Skill scores. Since the exact demographic and program-of-study breakdown of this national population is not known, there is a natural limit to how informative this comparison can be. Future research hoping to compare a study sample to another sample or population would benefit from more specific demographic and educational descriptors in order to draw stronger conclusions.

Lastly, data collection took place amongst a group of senior-level students who had spent the previous 2 years of their undergraduate-level education in the COVID-19 global pandemic environment. The impact of COVID-19, and especially the way in which it significantly impacted the field of higher education and of learning as a whole, is still being assessed and understood. For the purposes of this study, it would not have been possible to control for the ways in which COVID-19 may have temporarily or permanently impacted critical thinking skills and dispositions. As such, the results of this study must be viewed through this lens, as it is possible that the scores for skills and dispositions would have been different in a non-COVID-impacted learning environment.

5 Conclusion

Undergraduate health science students within this study population show low to moderate development of critical thinking skills, with numeracy skills being particularly poorly developed, and grade point average being moderately but significantly associated with critical thinking skill development across all subscales. While students show positive development across most critical thinking disposition subscales, they also show inconsistent and ambivalent dispositions towards truth-seeking, with grade point average not being a significant indicator of attitudes and dispositions. Health science education programs that hope to enhance and strengthen both critical thinking skill and disposition development may wish to implement evidence-based pedagogical practices to ensure students are prepared for professional practice in the field of health science, which requires strong critical thinking.

Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Arnott SR. Evidence beyond the rules: a critical thinking approach to teaching evidence law to undergraduate students. J Scholarsh Teach Learn. 2018. https://doi.org/10.14434/josotl.v18i4.22812 .

Hitchcock D. Stanford encyclopedia of philosophy—critical thinking. 2018. https://plato.stanford.edu/entries/critical-thinking . Accessed Jan 15 2023.

Allen DD, Toth-Cohen S. Use of case studies to promote critical thinking in occupational therapy students. J Occup Ther Ed. 2019. https://doi.org/10.26681/jote.2019.030309 .

Morris RJ, Gorham-Rowan MM, Robinson RJ, Scholz K. Assessing and teaching critical thinking in communication science and disorders. Teach Learn Commun Sci Disord. 2018. https://doi.org/10.30707/TLCSD2.1Morris .

Sharples JM, et al. Critical thinking in healthcare and education. Brit Med J. 2017. https://doi.org/10.1136/bmj.j2234 .

Irby D, Cooke M, O’Brien B. Calls for reform of medical education by the Carnegie Foundation for the Advancement of Teaching: 1910 and 2010. Acad Med. 2010. https://doi.org/10.1097/ACM.0b013e3181c88449 .

Mann KV. Theoretical perspective in medical education: past experience and future possibilities. Med Ed. 2011. https://doi.org/10.1111/j.1365-2923.2010.03757.x .

Kanbay Y, Okanlı A. The effect of critical thinking education on nursing students’ problem-solving skills. Contemp Nurse. 2017. https://doi.org/10.1080/10376178.2017.1339567 .

Chacon JA, Janssen H. Teaching critical thinking and problem-solving skills to healthcare professionals. Med Sci Ed. 2021. https://doi.org/10.1007/s40670-020-01128-3 .

Hanley P, Slavin RE, Elliot L. Thinking, doing, talking science. Evaluation report and executive summary: Education endowment foundation. 2015. https://educationendowmentfoundation.org.uk/projects-and-evaluation/projects/thinking-doing-talking-science/ . Accessed Jan 15 2023.

Cummings L. Critical thinking in medicine and health. Fallacies in medicine and health. 2020. https://doi.org/10.1007/978-3-030-28513-5_1 .

Facione NC, Facione PA. Externalizing the critical thinking in clinical judgment. Nurs Outlook. 1996. https://doi.org/10.1016/S0029-6554(06)80005-9 .

Facione NC, Facione PA, Sanchez C. Critical thinking disposition as a measure of competent clinical judgment: the development of the California critical thinking disposition inventory. J Nurs Ed. 1994. https://doi.org/10.3928/0148-4834-19941001-05 .

Nair GG, Stambler LL. A conceptual framework for developing a critical thinking self-assessment scale. J Nurs Ed. 2013. https://doi.org/10.3928/01484834-20120215-01 .

Wu HZ, Wu QT. Impact of mind mapping on the critical thinking ability of clinical nursing students and teaching application. J Int Med Res. 2020. https://doi.org/10.1177/0300060519893225 .

Terry N, Ervin B. Student performance on the California critical thinking skills test. Acad Ed Lead J. 2012;16:S25.

CCTST User Manual and Resource Guide. Insight assessment. Oakland: The California Academic Press; 2021.

CCTDI User Manual and Resource Guide. Insight assessment. Oakland: The California Academic Press; 2021.

Denial A. Association of critical thinking skills with clinical performance in fourth-year optometry students. Optom Ed. 2008;33:103–6.

Paans W, Sermeus W, Nieweg R, van der Schans C. Determinants of the accuracy of nursing diagnoses: Influence of ready knowledge, knowledge sources, disposition toward critical thinking and reasoning skills. J Prof Nurs. 2010. https://doi.org/10.1016/j.profnurs.2009.12.006 .

Redhana I, Sudria IBN. Validity and reliability of critical thinking disposition inventory. Proceedings of the 3rd International Conference on Innovative Research Across Disciplines. 2020. https://doi.org/10.2991/assehr.k.200115.046 .

İskifoğlu G. Cross-cultural equivalency of the California critical thinking disposition inventory. Ed Sci Theory Prac. 2013. https://doi.org/10.12738/estp.2014.1.1840 .

Orhan A. California critical thinking disposition inventory: reliability generalization meta-analysis. J Psychoeduc. 2022. https://doi.org/10.1177/07342829211048962 .

Lane D, Oswald FL. Do 45% of college students lack critical thinking skills? Revisiting a central conclusion of academically adrift. Ed Meas Iss Pract. 2016. https://doi.org/10.1111/emip.12120 .

Keeley SM, Shemberg KM, Cowell BS, Zinnbauer BJ. Coping with student resistance to critical thinking. Coll Teach. 1995. https://doi.org/10.1080/87567555.1995.9925537 .

Flores KL, Matkin GS, Burbach ME, Quinn CE, Harding H. Deficient critical thinking skills among college graduates: implications for leadership. Ed Phil Theory. 2012. https://doi.org/10.1111/j.1469-5812.2010.00672.x .

Mathews SR, Lowe K. Classroom environments that foster a disposition for critical thinking. Learn Envir Res. 2011. https://doi.org/10.1007/s10984-011-9082-2 .

Moser A, Puhan MA, Zwahlen M. The role of causal inference in health services research I: tasks in health services research. Int J Pub Health. 2020. https://doi.org/10.1007/s00038-020-01333-2 .

Shin HS. Reasoning processes in clinical reasoning: from the perspective of cognitive psychology. Korean J Med Ed. 2019. https://doi.org/10.3946/kjme.2019.140 .

Lantian A, Bagneux V, Delouvée S, Gauvrit N. Maybe a free thinker but not a critical one: high conspiracy belief is associated with low critical thinking ability. Appl Cog Psych. 2021. https://doi.org/10.1002/acp.3790 .

Wilson JA. Reducing pseudoscientific and paranormal beliefs in university students through a course in science and critical thinking. Sci Ed. 2018. https://doi.org/10.1007/s11191-018-9956-0 .

OECD—Skills matter: additional results from the survey of adult skills—United States. 2019. https://www.oecd.org/skills/piaac/publications/countryspecificmaterial/PIAAC_Country_Note_USA.pdf . Accessed 25 Jan 2023.

Tümkaya S, et al. An investigation of university student’s critical thinking disposition and perceived problem solving skills. Euras J Ed Res. 2009;36:57–74.

Comer RD, Schweiger TA, Shelton P. Impact of students’ strengths, critical thinking skills and disposition on academic success in the first year of a PharmD program. Am J Pharm Ed. 2019. https://doi.org/10.5688/ajpe6499 .

Ozcan H, Elkoca A. Critical thinking skills of nursing candidates. Int J Car Sci. 2019;12:1600–6.

Pu D, et al. Influence of critical thinking disposition on the learning efficiency of problem-based learning in undergraduate medical students. BMC Med Ed. 2019. https://doi.org/10.1186/s12909-018-1418-5 .

Rahmawati M, Kurniati D, Trapsilasiwi D, Osman S. Students’ truth-seeking behaviour in solving problems with no specified universal set given. Kreano. 2021. https://doi.org/10.15294/kreano.v12i2.32549 .

Arth A, Griffin D, Earnest W. Professors’ perspectives on truth-seeking and new literacy. J Med Lit Ed. 2019. https://doi.org/10.23860/JMLE-2019-11-3-6 .

Gibbs P. Why academics should have a duty of truth telling in an epoch of post-truth? High Ed. 2019. https://doi.org/10.1007/s10734-018-0354-y .

Ghazivakili Z, Norouzi NR, Panahi F, Karimi M, Gholsorkh H, Ahmadi Z. The role of critical thinking skills and learning styles of university students in their academic performance. J Advanc Med Ed Prof. 2014;2:95–102.

Reyna C. Lazy, dumb, or industrious: when stereotypes convey attribution information in the classroom. Educ Psych Rev. 2000. https://doi.org/10.1023/A:1009037101170 .

Appel M, Kronberger N. Stereotypes and the achievement gap: stereotype threat prior to test taking. Educ Psych Rev. 2012. https://doi.org/10.1007/s10648-012-9200-4 .

Chang F, Luo M, Walton G, Aguilar L, Bailenson J. Stereotype threat in virtual learning environments: effects of avatar gender and sexist behavior on women’s math learning outcomes. Cyberpsych Behav Soc Net. 2019. https://doi.org/10.1089/cyber.2019.0106 .

Shin S, Lee I, Kim J, Oh E, Hong E. Effectiveness of a critical reflection competency program for clinical nurse educators: a pilot study. BMC Nurs. 2023. https://doi.org/10.1186/s12912-023-01236-6 .

Qiang R, Han Q, Guo Y, Bai J, Karwowski M. Critical thinking disposition and scientific creativity: the mediating role of creative self-efficacy. J Creat Behav. 2020. https://doi.org/10.1002/jocb.347 .

Khoshgoftar Z, Barkhordari-Sharifabad M. Medical students’ reflective capacity and its role in their critical thinking disposition. BMC Med Ed. 2023. https://doi.org/10.1186/s12909-023-04163-x .

Mahmoud SA, Mohamed HA. Critical thinking disposition among nurses working in public hospitals at Port-Said Governorate. Int J Nurs Sci. 2017. https://doi.org/10.1016/j.ijnss.2017.02.006 .

Oudbier J, Spaai G, Timmermans K, Boerboom T. Enhancing the effectiveness of flipped classroom in health science education: a state-of-the-art review. BMC Med Ed. 2022. https://doi.org/10.1186/s12909-021-03052-5 .

Medina MS, Castleberry AN, Persky AM. Strategies for improving learner metacognition in health professional education. Am J Pharm Ed. 2017. https://doi.org/10.5688/ajpe81478 .

Abiogu GC, et al. Cognitive-behavioral reflective training for improving critical thinking disposition of nursing students. Medicine. 2020. https://doi.org/10.1097/MD.0000000000022429 .

Author information

Authors and affiliations

School of Health Sciences, Stockton University, 101 Vera King Farris Dr., Galloway, NJ, 08205, USA

Anthony Dissen

Contributions

The author completed all data collection, analysis, table formatting, literature review, and manuscript writing.

Corresponding author

Correspondence to Anthony Dissen .

Ethics declarations

Ethics approval and consent to participate

The research instruments and research methods for this study were approved by the Stockton University Institutional Review Board (IRB Approval Number #2021.175). IRB approval was obtained after submitting all required documentation, study procedures, informed consent documents, and proof of CITI training completed by the researcher, and all data collection took place only after these approvals were obtained. All research activities were carried out following the guidelines set forth by the Stockton University IRB.

Competing interests

The author received no funding as part of this study, nor does he have any competing interests related to the design or implementation of this study.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A: Score ranges for CCTST and CCTDI

1.1 CCTST score ranges

1.2 CCTDI score ranges

Appendix B: California critical thinking skills test domain descriptions

Overall critical thinking skills score: the overall ability and strength of a student to use reflective judgment and reasoning about how to make choices regarding a particular set of actions or how to develop an informed belief or opinion. This score predicts capacities for success in educational and professional settings that require reasoned decision-making and complex problem-solving.

Analysis score: a measurement of overall analytical skill, i.e., the ability to identify the reasons, themes, assumptions, and evidence that must be considered and utilized when making an argument or offering an explanation for phenomena.

Inference score: the skills and abilities that allow one to draw conclusions from the evidence, experiences, and observations presented. In addition, inference scores show how one uses personal values, beliefs, and reasoning skills to draw conclusions.

Evaluation score: the ability to assess the credibility of claims and assertions made by others, as well as the quality of the reasoning others use when making an argument or giving an explanation.

Induction skill score: one's ability to estimate the likely outcomes of certain decisions or choices. Inductive reasoning and decision-making are often assessed after reviewing case studies, reflecting upon prior life experiences, performing statistical analyses, participating in simulations, reviewing hypothetical situations, or studying patterns that emerge in a set of events.

Deduction critical thinking skills score: the ability to engage in logical decision-making based on a given set of rules, beliefs, conditions, values, principles, and/or policies.

Interpretation critical thinking skills score: development in the process of discovering and assigning meaning to information or events. Interpretive skills can be applied to verbal information, written text, and graphical and/or pictorial information.

Explanation critical thinking skills score: development in the process of justifying a decision that has been made or a belief that has been stated. Strong skills in this sub-domain rely upon the ability to provide evidence and to describe the methods used to reach the decision that was made.

Numeracy critical thinking skills score: the ability to make judgments and decisions based on quantitative information within a variety of different environments and contexts. This can include describing how quantitative information is gathered, adjusted, manipulated, represented, and explained.

Appendix C: California critical thinking disposition inventory domain descriptions

Truth-seeking score: the habit and desire to seek out the best possible understanding of any given situation or issue. Truth-seeking requires the goal of following the best available evidence to come to an informed conclusion, even if this leads one to question previously held beliefs or ideas.

Open-mindedness score: the tendency to give space to others to voice their views, opinions, and beliefs, even when one may not personally agree with what is being shared. Open-mindedness is a necessary disposition for regarding the opinions of others and for understanding the complexities that exist in a pluralistic and intersectional society.

Inquisitiveness score: intellectual curiosity motivated by a desire to know and understand. Inquisitiveness reflects an inherent desire to know, even when the information does not appear immediately useful or relevant.

Analyticity score: the tendency to be actively aware of the next stage of actions during an occurrence or event. Analyticity involves anticipating both positive and negative outcomes, as well as the various choices, plans, and proposals that can be considered at any given time.

Systematicity score: the tendency to strive to approach issues or problems in an ordered, disciplined, and systematic way. Systematicity provides one with the desire to approach questions and uncertain situations in a purposeful manner, even when one does not possess a strong background or skill in using a particular approach.

Confidence in reasoning score: the tendency and habit to solve problems and make decisions by trusting in reflective thinking and assessment. This relates not only to confidence in one's own reasoning process, but also to confidence in the reasoning utilized by groups and teams.

Maturity of judgment score: the habit and desire to make timely decisions when confronted with complex issues and situations. An attitude emphasizing maturity of judgment allows one to understand and accept that multiple solutions or options may be possible when approaching a question or issue, and to recognize that black-and-white thinking is not appropriate.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Dissen, A. A critical issue: assessing the critical thinking skills and dispositions of undergraduate health science students. Discov Educ 2, 21 (2023). https://doi.org/10.1007/s44217-023-00044-z

Received: 01 March 2023

Accepted: 02 August 2023

Published: 15 August 2023

DOI: https://doi.org/10.1007/s44217-023-00044-z


Keywords

  • Critical thinking
  • Educational assessment
  • Health science education
  • Undergraduate education
  • Find a journal
  • Publish with us
  • Track your research

IMAGES

  1. 💋 What is critical thinking examples. What Is Critical Thinking?. 2022

    critical thinking is needed to evaluate scientific findings pertaining to

  2. 6 Main Types of Critical Thinking Skills (With Examples)

    critical thinking is needed to evaluate scientific findings pertaining to

  3. Critical Thinking Definition, Skills, and Examples

    critical thinking is needed to evaluate scientific findings pertaining to

  4. Critical Thinking Skills: Definitions, Examples, and How to Improve

    critical thinking is needed to evaluate scientific findings pertaining to

  5. Critical Thinking

    critical thinking is needed to evaluate scientific findings pertaining to

  6. Guide to improve critical thinking skills

    critical thinking is needed to evaluate scientific findings pertaining to

VIDEO

  1. Critical thinking and deferring to experts

  2. INQA Conference 2023: Takashi Imoto, AIST

  3. What is Critical Thinking and how to develop this skill (5 tips) #criticalthinking

  4. Evaluating Cause and Effect Claims: A Critical Thinking Approach

  5. CORE-MD webinar Cardiovascular and diabetic

  6. The Hidden Secret to Developing Critical Thinking

COMMENTS

  1. What influences students' abilities to critically evaluate scientific investigations?

    Here we evaluate the efficacy of assessment questions to probe students' critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC).

  2. Understanding the Complex Relationship between Critical Thinking and

    The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking.

  3. What Is Critical Thinking?

    Critical thinking is the ability to effectively analyze information and form a judgment. To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources. Critical thinking skills help you to: Identify credible sources. Evaluate and respond to arguments.

  4. What influences students' abilities to critically evaluate scientific

    Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of ...

  3. Redefining Critical Thinking: Teaching Students to Think like ...

    Scientific thinking is the ability to generate, test, and evaluate claims, data, and theories (e.g., Bullock et al., 2009; Koerber et al., 2015). Simply stated, the basic tenets of scientific thinking provide students with the tools to distinguish good information from bad. Students have access to nearly limitless information, and the skills ...

  4. Teaching critical thinking

    Understanding and thinking critically about scientific evidence is a crucial skill in the modern world. We present a simple learning framework that employs cycles of decisions about making and acting on quantitative comparisons between datasets or data and models. With opportunities to improve the data or models, this structure is appropriate ...

  5. The Role of Evidence Evaluation in Critical Thinking: Fostering ...

    By epistemically vigilant we mean evaluating and monitoring the credibility and trustworthiness of information while being aware of the potential of being misinformed (Sperber et al., 2010). Epistemic vigilance is vital to critical thinking. We draw on Kuhn's (2018) definition of critical thinking as argumentation.

  6. Evidence-Based Thinking for Scientific Thinking

    Critical thinking is a cornerstone of scientific thinking that provides students with the ability to weigh up and evaluate information in order to make sound judgements. Today, more than ever, this is important, as students are inundated with information and need critical faculties to evaluate the vast amount of information they consume.

  7. Enhancing Scientific Thinking Through the Development of Critical ...

    Research publications, policy papers, and reports have argued that higher education must not only facilitate the learning of domain-specific knowledge and skills but also promote the learning of thinking skills for using that knowledge in action (e.g., Greiff et al., 2014; Shavelson, 2010a; Strijbos, Engels, & Struyven, 2015). The focus on critical thinking arises, in part, because of higher ...

  8. Measuring Critical Thinking in Science: Systematic Review

    The findings were analyzed using a document analysis technique to answer the study's research questions. This systematic review reveals that critical thinking can be assessed using quantitative or ...

  9. Guidelines for a Scientific Approach to Critical Thinking Assessment

    This article examines the benefits of taking a scientific approach to critical thinking assessment and proposes guidelines for planning, conducting, and using assessment research. Specifically, it discusses study design options and strategies for improving the quality of assessment data and the use of such data to improve critical thinking ...

  10. The Art and Science of Critical Thinking in Research: A Guide to ...

    The art and science of critical thinking in research is a multifaceted and dynamic process that requires intellectual rigor, creativity, and an open mind. In research, critical thinking is essential for developing research questions, designing research studies, collecting and analyzing data, and interpreting research findings.

  11. Thinking critically on critical thinking: why scientists' skills need ...

    Critical thinking moves us beyond mere description and into the realms of scientific inference and reasoning. This is what enables discoveries to be made and innovations to be fostered.

  12. Argumentation, Evidence Evaluation and Critical Thinking

    Using this frame, the chapter examines the contributions of argumentation in science education to the components of critical thinking and discusses the evaluation of evidence and the different factors influencing or even hampering it. The chapter concludes with consideration of the development of critical thinking in the science classroom.

  13. Assessment of Scientific Inquiry and Critical Thinking: Measuring ...

    Goal 2 of the APA Goals for the Undergraduate Major in Psychology, Scientific Inquiry and Critical Thinking, addresses the development of scientific reasoning and problem-solving, including effective ...

  14. A Crash Course in Critical Thinking

    Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the ...

  15. Critical Thinking, Intelligence, and Unsubstantiated Beliefs: An ...

    Critical thinking is considered part of analytical skills, which involve evaluating the quality and applicability of ideas, products, and options. Work on adaptive intelligence has emphasized how adaptive aspects of intelligence are needed to solve real-world problems at both the individual and species levels.

  16. Scientific Thinking and Critical Thinking in Science Education

    Scientific thinking and critical thinking are two intellectual processes considered key to the basic and comprehensive education of citizens. For this reason, their development is also among the main objectives of science education. However, in the literature on the two types of thinking in the context of science education, there are quite frequent allusions to one ...

  17. A critical issue: assessing the critical thinking skills and ...

    The findings of this study agree with published research on critical thinking skills development in undergraduate students as a whole: some estimates suggest that 45% of undergraduate students show no meaningful improvement in their critical thinking skills upon graduation, with even this number ...
