
Helping Students Hone Their Critical Thinking Skills

Used consistently, these strategies can help middle and high school teachers guide students to improve much-needed skills.


Critical thinking skills are important in every discipline, at and beyond school. From managing money to choosing which candidates to vote for in elections to making difficult career choices, students need to be prepared to take in, synthesize, and act on new information in a world that is constantly changing.

While critical thinking might seem like an abstract idea that is tough to directly instruct, there are many engaging ways to help students strengthen these skills through active learning.

Make Time for Metacognitive Reflection

Create space for students to both reflect on their ideas and discuss the power of doing so. Show students how they can push back on their own thinking to analyze and question their assumptions. Students might ask themselves, “Why is this the best answer? What information supports my answer? What might someone with a counterargument say?”

Through this reflection, students and teachers (who can model reflecting on their own thinking) gain deeper understandings of their ideas and do a better job articulating their beliefs. In a world that is go-go-go, it is important to help students understand that it is OK to take a breath and think about their ideas before putting them out into the world. And taking time for reflection helps us more thoughtfully consider others’ ideas, too.

Teach Reasoning Skills 

Reasoning skills are another key component of critical thinking, involving the abilities to think logically, evaluate evidence, identify assumptions, and analyze arguments. Students who learn how to use reasoning skills will be better equipped to make informed decisions, form and defend opinions, and solve problems. 

One way to teach reasoning is to use problem-solving activities that require students to apply their skills to practical contexts. For example, give students a real problem to solve, and ask them to use reasoning skills to develop a solution. They can then present their solution and defend their reasoning to the class and engage in discussion about whether and how their thinking changed when listening to peers’ perspectives. 

A great example I have seen involved students identifying an underutilized part of their school and creating a presentation about one way to redesign it. This project allowed students to feel a sense of connection to the problem and come up with creative solutions that could help others at school. For more examples, you might visit PBS's Design Squad, a resource that brings real-world problem-solving to life.

Ask Open-Ended Questions 

Critical thinking moves beyond the repetition of facts: it requires students to take positions and explain their beliefs through research, evidence, and assessments of credibility.

When we pose open-ended questions, we create space for classroom discourse inclusive of diverse, perhaps opposing, ideas—grounds for rich exchanges that support deep thinking and analysis. 

For example, “How would you approach the problem?” and “Where might you look to find resources to address this issue?” are two open-ended questions that position students to think less about the “right” answer and more about the variety of solutions that might already exist. 

Journaling, whether digitally or physically in a notebook, is another great way to have students answer these open-ended prompts—giving them time to think and organize their thoughts before contributing to a conversation, which can ensure that more voices are heard. 

Once students process in their journal, small group or whole class conversations help bring their ideas to life. Discovering similarities between answers helps reveal to students that they are not alone, which can encourage future participation in constructive civil discourse.

Teach Information Literacy 

Education has moved far past the idea of “Be careful of what is on Wikipedia, because it might not be true.” With AI innovations making their way into classrooms, teachers know that informed readers must question everything. 

Understanding what is and is not a reliable source and knowing how to vet information are important skills for students to build and utilize when making informed decisions. You might start by introducing the idea of bias: Articles, ads, memes, videos, and every other form of media can push an agenda that students may not see on the surface. Discuss credibility, subjectivity, and objectivity, and look at examples and nonexamples of trusted information to prepare students to be well-informed members of a democracy.

One of my favorite lessons is about the Pacific Northwest tree octopus. This project asks students to explore what appears to be a very real website that provides information on this supposedly endangered animal. It is a wonderful, albeit over-the-top, example of how something might look official even when untrue, revealing that we need critical thinking to break down "facts" and determine the validity of the information we consume.

A fun extension is to have students create their own website or newsletter about something going on in school that is untrue: perhaps a dress code change that requires everyone to wear their clothes inside out, or a lunch menu change that requires students to eat brussels sprouts every day.

Giving students the ability to create their own falsified information can help them better identify it in other contexts. Understanding that information can be “too good to be true” can help them identify future falsehoods. 

Provide Diverse Perspectives 

Consider how to keep the classroom from becoming an echo chamber. If students come from the same community, they may have similar perspectives. And those who have differing perspectives may not feel comfortable sharing them in the face of an opposing majority. 

To support varying viewpoints, bring diverse voices into the classroom as much as possible, especially when discussing current events. Use primary sources: videos from YouTube, essays and articles written by people who experienced current events firsthand, documentaries that dive deeply into topics that require some nuance, and any other resources that provide a varied look at topics. 

I like to use the Smithsonian "OurStory" page, which shares a wide variety of stories from people in the United States. The page on Japanese American internment camps is very powerful because of its first-person perspectives.

Practice Makes Perfect 

To make the above strategies and thinking routines a consistent part of your classroom, spread them out—and build upon them—over the course of the school year. You might challenge students with information and/or examples that require them to use their critical thinking skills; work these skills explicitly into lessons, projects, rubrics, and self-assessments; or have students practice identifying misinformation or unsupported arguments.

Critical thinking is not learned in isolation. It needs to be explored in English language arts, social studies, science, physical education, and math. Every discipline requires students to take a careful look at something and find the best solution. Often, these skills are taken for granted, viewed as a by-product of a good education, but true critical thinking doesn't just happen. It requires consistency and commitment.

In a moment when information and misinformation abound, and students must parse reams of information, it is imperative that we support and model critical thinking in the classroom to support the development of well-informed citizens.


Yes, We Can Define, Teach, and Assess Critical Thinking Skills


Jeff Heyck-Williams (He, His, Him), Director of the Two Rivers Learning Institute in Washington, DC


Today’s learners face an uncertain present and a rapidly changing future that demand far different skills and knowledge than were needed in the 20th century. We also know so much more about enabling deep, powerful learning than we ever did before. Our collective future depends on how well young people prepare for the challenges and opportunities of 21st-century life.


While the idea of teaching critical thinking has been bandied around in education circles since at least the time of John Dewey, it has taken greater prominence in the education debates with the advent of the term “21st century skills” and discussions of deeper learning. There is increasing agreement among education reformers that critical thinking is an essential ingredient for long-term success for all of our students.

However, there are still those in the education establishment and in the media who argue that critical thinking isn't really a thing, or that these skills aren't well defined, and that even if they could be defined, they can't be taught or assessed.

To those naysayers, I have to disagree. Critical thinking is a thing. We can define it; we can teach it; and we can assess it. In fact, as part of a multi-year Assessment for Learning Project, Two Rivers Public Charter School in Washington, D.C., has done just that.

Before I dive into what we have done, I want to acknowledge that some of the criticism has merit.

First, there are those who argue that critical thinking can only exist when students have a vast fund of knowledge, meaning that a student cannot think critically without something substantive to think about. I agree. Students do need a robust foundation of core content knowledge to think critically effectively, and schools still have a responsibility for building students' content knowledge.

However, I would argue that students don't need to wait until they have mastered some arbitrary amount of knowledge before thinking critically. They can start building critical thinking skills the moment they walk in the door. All students come to school with experience and knowledge about which they can immediately think critically. In fact, some of the thinking they learn to do helps augment and solidify the discipline-specific academic knowledge they are learning.

The second criticism is that critical thinking skills are always highly contextual. In this argument, the critics make the point that the types of thinking that students do in history are categorically different from the types of thinking students do in science or math. Thus, the idea of teaching broadly defined, content-neutral critical thinking skills is impossible. I agree that there are domain-specific thinking skills that students should learn in each discipline. However, I also believe that there are several generalizable skills that elementary school students can learn that have broad applicability to their academic and social lives. That is what we have done at Two Rivers.

Defining Critical Thinking Skills

We began this work by first defining what we mean by critical thinking. After a review of the literature and looking at the practice at other schools, we identified five constructs that encompass a set of broadly applicable skills: schema development and activation; effective reasoning; creativity and innovation; problem solving; and decision making.


We then created rubrics to provide a concrete vision of what each of these constructs looks like in practice. Working with the Stanford Center for Assessment, Learning and Equity (SCALE), we refined these rubrics to capture clear and discrete skills.

For example, we defined effective reasoning as the skill of creating an evidence-based claim: students need to construct a claim, identify relevant support, link their support to their claim, and identify possible questions or counter claims. Rubrics provide an explicit vision of the skill of effective reasoning for students and teachers. By breaking the rubrics down for different grade bands, we have been able not only to describe what reasoning is but also to delineate how the skills develop in students from preschool through 8th grade.


Before moving on, I want to freely acknowledge that in narrowly defining reasoning as the construction of evidence-based claims, we have disregarded some elements of reasoning that students can and should learn. For example, the difference between constructing claims through deductive versus inductive means is not highlighted in our definition. However, by privileging a definition with broad applicability across disciplines, we are able to gain traction in developing the roots of critical thinking: in this case, the ability to formulate well-supported claims or arguments.

Teaching Critical Thinking Skills

The definitions of critical thinking constructs were useful to us only insofar as they translated into practical skills that teachers could teach and students could learn and use. Consequently, we found that to teach a set of cognitive skills, we needed thinking routines that defined the regular application of these critical thinking and problem-solving skills across domains. Building on Harvard's Project Zero Visible Thinking work, we have named routines aligned with each of our constructs.

For example, with the construct of effective reasoning, we aligned the Claim-Support-Question thinking routine to our rubric. Teachers then were able to teach students that whenever they were making an argument, the norm in the class was to use the routine in constructing their claim and support. The flexibility of the routine has allowed us to apply it from preschool through 8th grade and across disciplines from science to economics and from math to literacy.


Kathryn Mancino, a 5th grade teacher at Two Rivers, has deliberately taught three of our thinking routines to students using anchor charts. Her charts name the components of each routine and have a place for students to record when they've used it and what they have figured out about it. By using this structure with a chart that can be added to throughout the year, students see the routines as broadly applicable across disciplines and are able to refine their application over time.

Assessing Critical Thinking Skills

By defining specific constructs of critical thinking and building thinking routines that support their implementation in classrooms, we have operated under the assumption that students are developing skills that they will be able to transfer to other settings. However, we recognized both the importance and the challenge of gathering reliable data to confirm this.

With this in mind, we have developed a series of short performance tasks around novel discipline-neutral contexts in which students can apply the constructs of thinking. Through these tasks, we have been able to provide an opportunity for students to demonstrate their ability to transfer the types of thinking beyond the original classroom setting. Once again, we have worked with SCALE to define tasks where students easily access the content but where the cognitive lift requires them to demonstrate their thinking abilities.

These assessments demonstrate that it is possible to capture meaningful data on students' critical thinking abilities. They are not intended to be high-stakes accountability measures. Instead, they are designed to give students, teachers, and school leaders discrete formative data on hard-to-measure skills.

While it is clearly difficult, and we have not solved all of the challenges to scaling assessments of critical thinking, we can define, teach, and assess these skills. In fact, knowing how important they are for the economy of the future and our democracy, it is essential that we do.

Jeff Heyck-Williams (He, His, Him)

Director of the Two Rivers Learning Institute

Jeff Heyck-Williams is the director of the Two Rivers Learning Institute and a founder of Two Rivers Public Charter School. He has led work around creating school-wide cultures of mathematics, developing assessments of critical thinking and problem-solving, and supporting project-based learning.


Critical Thinking Skills Toolbox

CTS Tools for Faculty and Student Assessment


A number of critical thinking skills inventories and measures have been developed:

  • Watson-Glaser Critical Thinking Appraisal (WGCTA)
  • Cornell Critical Thinking Test
  • California Critical Thinking Disposition Inventory (CCTDI)
  • California Critical Thinking Skills Test (CCTST)
  • Health Science Reasoning Test (HSRT)
  • Professional Judgment Rating Form (PJRF)
  • Teaching for Thinking Student Course Evaluation Form
  • Holistic Critical Thinking Scoring Rubric
  • Peer Evaluation of Group Presentation Form

Apart from the Watson-Glaser Critical Thinking Appraisal and the Cornell Critical Thinking Test, the instruments listed above were developed by Facione and Facione. It is important to point out, however, that all of these measures are of questionable utility for dental educators, because their content is general rather than specific to dental education. (See Critical Thinking and Assessment.)

Table 7. Purposes of Critical Thinking Skills Instruments

Reliability and Validity

Reliability means that individual scores from an instrument should be the same or nearly the same from one administration of the instrument to another, so that the instrument can be assumed to be free of bias and measurement error (68). Alpha coefficients are often used to report an estimate of internal consistency. Scores of .70 or higher indicate that the instrument has high reliability when the stakes are moderate. Scores of .80 and higher are appropriate when the stakes are high.
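As a concrete illustration, the alpha coefficient mentioned above can be computed directly from item-level scores. The sketch below uses invented data (not taken from any instrument discussed here) and a plain-Python implementation of Cronbach's alpha:

```python
# Cronbach's alpha: a common estimate of internal consistency.
# The scores below are made up purely for demonstration.

def cronbach_alpha(items):
    """items: one list of scores per item; each list covers the same respondents."""
    k = len(items)                        # number of items
    n = len(items[0])                     # number of respondents

    def variance(xs):                     # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

scores = [
    [3, 4, 5, 4, 3, 5, 4, 2],   # item 1, eight hypothetical respondents
    [2, 4, 5, 3, 3, 4, 4, 2],   # item 2
    [3, 5, 4, 4, 2, 5, 5, 3],   # item 3
]
print(round(cronbach_alpha(scores), 2))   # → 0.89
```

By the rule of thumb quoted above, an alpha of .89 would indicate high reliability even for high-stakes uses, though real instruments are evaluated on far more items and respondents than this toy example.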

Validity means that individual scores from a particular instrument are meaningful, make sense, and allow researchers to draw conclusions from the sample to the population that is being studied (69). Researchers often refer to "content" or "face" validity: the extent to which questions on an instrument are representative of the possible questions that a researcher could ask about that particular content or those skills.

Watson-Glaser Critical Thinking Appraisal-FS (WGCTA-FS)

The WGCTA-FS is a 40-item inventory created to replace Forms A and B of the original test, which participants reported were too long (70). This inventory assesses test takers' skills in:

  (a) Inference: whether an individual can judge the degree of truth or falsity of inferences drawn from given data
  (b) Recognition of assumptions: whether an individual recognizes whether assumptions are clearly stated
  (c) Deduction: whether an individual decides if certain conclusions follow from the information provided
  (d) Interpretation: whether an individual considers the evidence provided and determines whether generalizations from the data are warranted
  (e) Evaluation of arguments: whether an individual distinguishes strong and relevant arguments from weak and irrelevant ones

Researchers investigated the reliability and validity of the WGCTA-FS for subjects in academic fields. Participants included 586 university students. Internal consistencies for the total WGCTA-FS among students majoring in psychology, educational psychology, and special education, including undergraduates and graduates, ranged from .74 to .92. The correlations between course grades and total WGCTA-FS scores for all groups ranged from .24 to .62 and were significant at the p < .05 or p < .01 level. In addition, internal consistency and test-retest reliability for the WGCTA-FS have been measured at .81. The WGCTA-FS was found to be a reliable and valid instrument for measuring critical thinking (71).
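Validity correlations like those between course grades and WGCTA-FS totals are ordinary Pearson coefficients. As a sketch with invented scores (not the study's data), computed from the textbook definition:

```python
# Pearson correlation between two score lists, from first principles.
# Both data sets below are hypothetical, for demonstration only.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

test_totals = [28, 31, 25, 35, 30, 22, 33, 27]          # hypothetical inventory totals
grades = [2.8, 3.4, 2.5, 3.8, 3.1, 2.4, 3.5, 3.0]       # hypothetical course GPAs
print(round(pearson_r(test_totals, grades), 2))         # → 0.97
```

A coefficient this high would be unusual in practice; the real study's grade correlations of .24 to .62 are more typical of the modest relationships behavioral measures show with course outcomes.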

Cornell Critical Thinking Test (CCTT)

There are two forms of the CCTT: X and Z. Form X is for students in grades 4-14. Form Z is for advanced and gifted high school students, undergraduate and graduate students, and adults. Reliability estimates for Form Z range from .49 to .87 across the 42 groups that have been tested. Measures of validity were computed under standard conditions, roughly defined as conditions that do not adversely affect test performance. Correlations between Level Z and other measures of critical thinking are about .50 (72). The CCTT is reportedly as predictive of graduate school grades as the Graduate Record Exam (GRE), a measure of aptitude, and the Miller Analogies Test, with correlations between .2 and .4 (73).

California Critical Thinking Disposition Inventory (CCTDI)

Facione and Facione have reported significant relationships between the CCTDI and the CCTST. When faculty focus on critical thinking in planning curriculum development, modest cross-sectional and longitudinal gains have been demonstrated in students' CTS (74). The CCTDI consists of seven subscales and an overall score. The recommended cut-off score for each scale is 40, the suggested target score is 50, and the maximum score is 60. Scores below 40 on a scale indicate weakness in that CT disposition, and scores above 50 indicate strength in that dispositional aspect. An overall score below 280 suggests serious deficiency in disposition toward CT, while an overall score of 350 (while rare) shows across-the-board strength. The seven subscales are analyticity, self-confidence, inquisitiveness, maturity, open-mindedness, systematicity, and truth seeking (75).
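The cut-off and target scores above lend themselves to a simple interpretation helper. This sketch follows the thresholds as described; the label for the unnamed band between 40 and 50 is my own placeholder, not terminology from the instrument:

```python
# Interpret a single CCTDI subscale score using the published thresholds:
# below 40 = weak, above 50 = strong (per-scale maximum is 60).
# "middling" is a hypothetical label for the otherwise unnamed 40-50 band.

def interpret_subscale(score):
    if score < 40:
        return "weak"
    if score > 50:
        return "strong"
    return "middling"

print(interpret_subscale(38))   # → weak
print(interpret_subscale(52))   # → strong
```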

In a study of instructional strategies and their influence on the development of critical thinking among undergraduate nursing students, Tiwari, Lai, and Yuen found that, compared with lecture students, PBL students showed significantly greater improvement in the overall CCTDI (p = .0048) and in the Truth-seeking (p = .0008), Analyticity (p = .0368), and Critical Thinking Self-confidence (p = .0342) subscales from the first to the second time point; in the overall CCTDI (p = .0083) and the Truth-seeking (p = .0090) and Analyticity (p = .0354) subscales from the second to the third time point; and in the Truth-seeking (p = .0173) and Systematicity (p = .0440) subscale scores from the first to the fourth time point (76).

California Critical Thinking Skills Test (CCTST)

Studies have shown that the California Critical Thinking Skills Test captures gain scores in students' critical thinking over one quarter or one semester. Multiple health science programs have demonstrated significant gains in students' critical thinking using site-specific curricula. Studies conducted to control for re-test bias showed no testing effect from pre- to post-test means using two independent groups of CT students. Since behavioral science measures can be affected by social-desirability bias (the participant's desire to answer in ways that would please the researcher), researchers are urged to have participants take the Marlowe-Crowne Social Desirability Scale at the same time when measuring pre- and post-test changes in critical thinking skills.

The CCTST is a 34-item instrument. It has been correlated with the CCTDI in a sample of 1,557 nursing education students: r = .201, and the relationship between the CCTST and the CCTDI is significant at p < .001. Significant relationships between the CCTST and other measures, including the GRE total, GRE-Analytic, GRE-Verbal, GRE-Quantitative, the WGCTA, and the SAT Math and Verbal, have also been reported. The two forms of the CCTST, A and B, are considered statistically equivalent. Depending on the testing context, KR-20 alphas range from .70 to .75. The newest version is CCTST Form 2000; depending on the testing context, its KR-20 alphas range from .78 to .84 (77).

The Health Science Reasoning Test (HSRT)

Items within this inventory cover the domain of CT cognitive skills identified by a Delphi group of experts whose work resulted in the development of the CCTDI and CCTST. This test measures health science undergraduate and graduate students' CTS. Although test items are set in health sciences and clinical practice contexts, test takers are not required to have discipline-specific health sciences knowledge. For this reason, the test may have limited utility in dental education (78).

Preliminary estimates of internal consistency show that overall KR-20 coefficients range from .77 to .83 (79). The instrument has moderate reliability on the analysis and inference subscales, although the factor loadings appear adequate. The low KR-20 coefficients may be a result of small sample size, variance in item response, or both (see the following table).

Table 8. Estimates of Internal Consistency and Factor Loading by Subscale for HSRT

Professional Judgment Rating Form (PJRF)

The scale consists of two sets of descriptors. The first set relates primarily to the attitudinal (habits of mind) dimension of CT. The second set relates primarily to CTS.

A single rater should know the student well enough to respond to at least 17 of the 20 descriptors with confidence. If not, the validity of the ratings may be questionable. If a single rater is used and ratings over time show some consistency, comparisons between ratings may be used to assess changes. If more than one rater is used, then inter-rater reliability must be established among the raters to yield meaningful results. While the PJRF can be used to assess the effectiveness of training programs for individuals or groups, participants' actual skills are best measured by an objective tool such as the California Critical Thinking Skills Test.

Teaching for Thinking Student Course Evaluation Form

Course evaluations typically ask for responses of "agree" or "disagree" to items focusing on teacher behavior, and the questions typically do not solicit information about student learning. Because contemporary thinking about curriculum is concerned with student learning, this form was developed to address the differences in pedagogy, subject matter, learning outcomes, student demographics, and course level that characterize education today. It grew out of a recognition of the limitations of the "one size fits all" approach to teaching evaluations. The form offers information about how a particular course enhances student knowledge, sensitivities, and dispositions, and it gives students an opportunity to provide feedback that can be used to improve instruction.

Holistic Critical Thinking Scoring Rubric

This assessment tool uses a four-point classification schema that lists particular opposing reasoning skills for select criteria. One advantage of a rubric is that it offers clearly delineated components and scales for evaluating outcomes. This rubric explains how students' CTS will be evaluated, and it provides a consistent framework for the professor as evaluator. Users can add or delete any of the statements to reflect their institution's effort to measure CT. Like most rubrics, this form is likely to have high face validity, since the items tend to be relevant to or descriptive of the target concept. The rubric can be used to rate student work or to assess learning outcomes. Experienced evaluators should engage in a process leading to consensus regarding what kinds of things should be classified and in what ways (80). If used improperly or by inexperienced evaluators, it may produce unreliable results.

Peer Evaluation of Group Presentation Form

This form offers a common set of criteria to be used by peers and the instructor to evaluate student-led group presentations regarding concepts, analysis of arguments or positions, and conclusions (81). Users have an opportunity to rate the degree to which each component was demonstrated. Open-ended questions give users an opportunity to cite examples of how concepts, the analysis of arguments or positions, and conclusions were demonstrated.

Table 8. Proposed Universal Criteria for Evaluating Students' Critical Thinking Skills 

Aside from the use of the above-mentioned assessment tools, Dexter et al. recommended that all schools develop universal criteria for evaluating students' development of critical thinking skills.82

Their rationale for the proposed criteria is that if faculty give feedback using these criteria, graduates will internalize these skills and use them to monitor their own thinking and practice (see Table 4).


16 Critical Thinking Questions For Students


Critical thinking is an essential skill that empowers students to evaluate information and make informed decisions. It encourages them to explore different perspectives, analyze evidence, and develop logical reasoning. To foster critical thinking skills, it is crucial to ask students thought-provoking questions that challenge their assumptions and encourage deeper analysis. Here are 16 critical thinking questions to enhance students' problem-solving abilities:

  • What evidence supports this argument?
  • Can you identify any biases in this article?
  • How does this relate to what we have learned previously?
  • What alternative solutions can you propose to this problem?
  • How might different cultures perceive this situation?
  • What assumptions underlie this theory?
  • How reliable is the source of this information?
  • Can you identify any logical fallacies in this argument?
  • What impact does this decision have on various stakeholders?
  • What are the strengths and weaknesses of this argument?
  • How might you approach this problem differently?
  • What ethical considerations need to be taken into account?
  • Can you identify any gaps in the evidence provided?
  • How does this concept apply to real-world situations?
  • What are the potential consequences of this decision?
  • How might you evaluate the credibility of this research?

By incorporating these critical thinking questions, educators can help students develop essential skills such as analyzing information, evaluating arguments, and problem-solving. Encouraging students to think critically will not only benefit their academic performance but also prepare them for success in various aspects of their lives.

Remember, critical thinking is a skill that can be nurtured and strengthened with practice. By guiding students to ask and answer these thought-provoking questions, educators can create a learning environment that fosters independent thinking and creativity.


Writing quiz questions that assess student understanding and critical thinking

This article discusses the writing of effective multiple-choice style quiz questions that assess students' understanding and critical thinking skills, and guidance on how to design quiz questions that provide a reliable and valid measure of learning.

Designing questions to assess higher-order thinking

One of the greatest challenges of multiple-choice design is assessing higher-order thinking skills. Higher-order thinking is often used to refer to 'transfer', 'critical thinking' and 'problem-solving.' 

When designing a quiz that assesses higher-order thinking skills, it is necessary to write questions/problems that require students to:  

  • use information, methods, concepts, or theories in new situations  
  • predict sequences and outcomes  
  • solve problems in which students must select the approach to use  
  • see patterns and organization of parts (e.g. classify, order)  
  • determine the quality/importance of different pieces of information  
  • discriminate among ideas  
  • examine pieces of evidence to determine the likelihood of certain outcomes/scenarios  
  • make choices based on reasoned argument.

The number of questions included in a quiz will vary depending on the content and purpose of the quiz. Generally, 10-20 questions are appropriate to ensure that students are assessed only on terms and concepts directly connected to learning outcomes. It is also essential to consider including different question types, in order to design a quiz that evaluates different levels of cognition and aligns with principles of academic integrity.

Parts of a Quiz

A typical multiple-choice question comprises three parts: the stem, the distractors, and a single correct answer. For a simple, lower-order, knowledge-based question this usually suffices.


However, an effective quiz question that addresses critical thinking should be made up of the following components:  

  • Context/introduction  
  • Question/problem (also known as the 'stem')
  • Correct answer/s
  • Distractors (wrong answer/s)

Adding a context or introduction is important: because students are asked to examine more information relevant to the question, they are much less likely to guess or to depend on simple recall of facts.

Increase question complexity

Making students select multiple possible correct answers increases the number of potential answers, requires students to exercise discrimination, and reduces the risk that students can "guess" the right answer.
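The arithmetic behind that claim is easy to make concrete. As an illustration (the function below is hypothetical and not part of any quiz platform), compare the chance of a blind guess on a single-answer question with that on a "select all that apply" question with the same four options:

```python
from fractions import Fraction

def blind_guess_chance(n_options: int, multiple_answer: bool) -> Fraction:
    """Probability of guessing correctly with no knowledge at all."""
    if not multiple_answer:
        # Single-answer question: pick one of n_options.
        return Fraction(1, n_options)
    # Multiple-answer question: the student must pick exactly the right
    # subset out of every possible non-empty selection of the options.
    return Fraction(1, 2 ** n_options - 1)

print(blind_guess_chance(4, multiple_answer=False))  # 1/4
print(blind_guess_chance(4, multiple_answer=True))   # 1/15
```

Under this simple model, moving from one correct answer to "select all that apply" drops the blind-guess success rate from 1 in 4 to 1 in 15.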


The sections below outline how to write each component of an effective quiz question.

1. Writing the context/introduction

It is important to introduce the question and connect it to a 'real world' context that students can relate to. A well-written context sets the stage for the question and should be phrased concisely, using familiar vocabulary.

A recall-based version:

After making a documentation error, which action should the nurse take?

a) Use correcting liquid to cover the mistake and make a new entry
b) Draw a line through it and write error above the entry
c) Draw a line through it and write mistaken entry above
d) Draw a line through it and write mistaken entry and initials above

A context-rich version:

Imagine you are a nurse in the palliative care unit of a hospital. During your shift, one of your patients has an allergic reaction to a new medication. As you are documenting the allergic reaction, you accidentally put down the wrong medication.

How do you correct the documentation error?

a) Use liquid paper correction fluid to cover the mistake and write the correct medication on top
b) Strike through the error and write the correct medication above
c) Draw a line through the error, write mistaken entry and your initials, then write the correct medication above
d) Always write in pencil and use an eraser to correct the entry

Using visuals to provide context

Images can help to convey context in very few words. Where appropriate, you could consider using charts and graphs to provide the context for the question as this will require students to carefully interpret the information.

2. Writing the question 

A well-written question enables students to:  

  • Understand the question without reading it several times  
  • Answer the question without reading all the options  
  • Answer the question without knowing the answers to other quiz questions  

The question should be:  

  • A direct, complete question rather than an incomplete statement  
  • Concise, avoiding undue complexity, redundancy, and irrelevant information  
  • Stated in a positive form, as negatively phrased questions are easily misunderstood  
  • Phrased in familiar language, avoiding unfamiliar terminology  

To increase the validity and reliability of the quiz, it is also a good idea to vary the position of the correct responses throughout the quiz. One way to ensure this is to consistently order the answer options alphabetically or from lowest to highest number. Blackboard also offers options to randomize the order of questions and the order of answers for each student. Using these features when designing a quiz further aligns it with SCU values connected to academic integrity.
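Blackboard handles this randomization automatically, but the underlying idea is simple. A minimal sketch (a hypothetical helper, not part of any Blackboard API) of shuffling answer options per student while keeping track of the correct answer:

```python
import random

def shuffle_options(correct: str, distractors: list[str],
                    seed: int) -> tuple[list[str], int]:
    """Return the options in a per-student order plus the index of the
    correct answer. Seeding by something like a student ID keeps each
    student's order stable across page reloads."""
    rng = random.Random(seed)
    options = [correct] + distractors
    rng.shuffle(options)
    return options, options.index(correct)

options, key = shuffle_options(
    correct="Yield management",
    distractors=["Integrated management", "Sophisticated management",
                 "Reservations management"],
    seed=20240117,  # e.g. a student ID
)
```

Each student sees the same four options in a different order, so the position of the correct answer carries no information.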

  Example:

A recall-based version:

A hotel attempting to match customers’ purchase patterns and their demand for guest rooms with future occupancy forecasts is known as:

a) Integrated management
b) Yield management
c) Sophisticated management
d) Reservations management

A context-rich version:

Imagine you are the manager at a busy hotel. As the manager, you are asked to match customer purchase patterns and demand for guest rooms with future occupancy forecasts.

What is this process known as?

a) Integrated management
b) Yield management
c) Sophisticated management
d) Reservations management

3. Writing the answer 

A well-written answer enables students to select the right response without having to sort out complexities that have nothing to do with knowing the correct answer. At the same time, students should not be able to guess the correct answer from the way the responses are written.

Quiz questions often have around 4 options for the students to choose from. However, there may be situations where it may make sense to have fewer or additional response options.  

A version with awkwardly uneven options:

The best way to achieve more accurate and achievable budgets is to:

a) Have the budget committee monitor actual results on a frequent basis so that quick punitive action can be taken when actual results do not comply with budgeted expectations
b) Have the budget prepared by top executives only
c) Have all employees participate in the preparation of the budget
d) All of the above

A version with parallel options:

What is the best way to achieve more accurate and achievable budgets?

a) Have a budget committee that frequently monitors results
b) Have a budget that is prepared by executives only
c) Have all employees participate in monitoring results
d) Have a budget that is created by employees

4. Writing effective distractors 

It is important to write distractors of similar length and complexity that are neither so similar to the correct answer that the choice becomes vague or arguable, nor so obviously wrong that anyone could guess. An appropriate distractor:  

  • Mirrors the correct answer in length, complexity, phrasing, and style  
  • Is plausible, rather than exaggerated or unrealistic in a way that gives away the correct answer

Avoid the following

  • Distractors that differ from the correct answer only by minuscule or vague distinctions, as these may confuse students (except where such distinctions are significant to demonstrating the unit learning outcomes).
  • “All of the above” or “both a) & c)” options, which make it easier for students to guess the correct answer with only partial knowledge. Instead, use a multiple-answer question type or add more appropriate distractors.
  • Verbal or grammatical clues that give away the correct answer.
  • “None of the above,” unless there is an objectively correct answer (e.g., in a mathematics quiz).

A version with a giveaway combined option:

Breathing rate may increase as a result of:

a) A small decrease in oxygen levels in the body
b) A small increase in carbon dioxide levels in the body
c) A decrease in blood pH
d) Both b and c

A context-rich, multiple-answer version:

Imagine you are going for a run. As you run faster, you begin to breathe faster as well.

What reactions in the body cause the breathing rate to increase? (Select all that apply)

a) A decrease in oxygen levels in the body
b) An increase in carbon dioxide levels in the body
c) A decrease in blood pH
d) An increase in blood pH

A version with a grammatical clue (option b does not follow from the stem):

All other things being equal, an increased seat turnover will:

a) Increase total revenue
b) Has no impact on the average check
c) Increase the average check
d) Decrease total revenue

A version with parallel options:

If all other variables are unchanged, what will an increased seat turnover result in?

a) An increase in total revenue
b) An increase in the average check
c) A decrease in total revenue
d) A decrease in the average check

Reflecting on quiz questions used for assessing student learning

Once you have written a quiz question, reflect carefully and analytically to ensure that:

  • Students cannot simply recall the answer from a case study already covered in class (where the question requires critical thinking)
  • The distractors are plausible
  • The correct answer is not notably longer or shorter than the others
  • The spelling and grammar are correct

Also make sure to test the quiz out with others, especially those familiar with the unit content.

Analysing Blackboard Tests

Where you are using Blackboard tests for online quizzes, it is possible to analyse student responses to determine the discrimination and difficulty of particular quiz questions. Please see the following article: Analysing questions/responses for a Blackboard test (quiz)

Further Resources

  • Is This a Trick Question? A Short Guide to Writing Effective Test Questions
  • Multiple choice questions - Charles Sturt University

The Importance of Critical Thinking Skills for Students


If you’re moving toward the end of your high school career, you’ve likely heard a lot about college life and how different it is from high school. Classes are more intense, professors are stricter, and the curriculum is more complicated. All in all, it’s very different compared to high school.

Different doesn’t have to mean scary, though. If you’re nervous about beginning college and you’re worried about how you’ll learn in a place so different from high school, there are steps you can take to help you thrive in your college career.

If you’re wondering how to get accepted into college and how to succeed as a freshman in such a new environment, the answer is simple: harness the power of critical thinking skills for students.

What is critical thinking?

Critical thinking entails using reasoning and questioning assumptions to address problems, assess information, identify biases, and more. It is a skill set crucial for students navigating their academic journey and beyond, including getting accepted into college. At its core, critical thinking has everything to do with self-discipline and making active decisions to 'think outside the box,' allowing individuals to think beyond a concept alone in order to understand it better.

Critical thinking is highly encouraged in any and every educational setting, and with good reason. Possessing strong critical thinking skills will make you a better student and help you gain valuable life skills. Not only will you be more efficient in gathering knowledge and processing information, but you will also enhance your ability to analyse and comprehend it.

Importance of critical thinking for students

Developing critical thinking skills is essential for success at all academic levels, particularly in college. It introduces reflection and perspective while encouraging you to question what you’re learning, even when you’re presented with solid facts. Asking questions, considering other perspectives, and self-reflection cultivate resilient students with endless potential for learning, retention, and personal growth. A well-developed set of critical thinking skills will help students excel in many areas. Here are some examples:

1. Decision-making

If you’re thinking critically, you’re not making impulse decisions or snap judgments; you’re taking the time to weigh the pros and cons. You’re making informed decisions. Critical thinking skills for students can make all the difference.

2. Problem-solving

Students with critical thinking skills are more effective in problem-solving. This reflective thinking process helps you use your own experiences to ideate innovations, solutions, and decisions.

3. Communication

Strong communication skills are a vital aspect of critical thinking. How can you learn without asking questions? Critical thinking helps students produce the questions they may not have ever thought to ask. As a critical thinker, you’ll get better at expressing your ideas concisely and logically, facilitating thoughtful discussion, and learning from your teachers and peers.

4. Analytical skills

Developing analytical skills is a key component of strong critical thinking skills for students. It goes beyond study tips on reviewing data or learning a concept. It’s about the “Who? What? Where? Why? When? How?” When you’re thinking critically, these questions will come naturally, and you’ll be an expert learner because of it.

How can students develop critical thinking skills?

Although critical thinking is an important and necessary skill, it isn’t especially difficult to develop. All it takes is a conscious effort and a little bit of practice. Here are a few tips to get you started:

1. Never stop asking questions

This is the best way to build critical thinking skills. As stated earlier, ask questions, even if you’re presented with facts to begin with. When you’re examining a problem or learning a concept, ask as many questions as you can. Not only will you be better acquainted with what you’re learning, but it’ll soon become second nature to follow this process in every class you take, which can also help you improve your GPA.

2. Practice active listening

As important as asking questions is, it is equally vital to be a good listener to your peers. It is astounding how much we can learn from each other in a collaborative environment! Diverse perspectives are key to fostering critical thinking skills for students. Keep an open mind and view every discussion as an opportunity to learn.

3. Dive into your creativity

Although a college environment is vastly different from high school classrooms, one thing remains constant through all levels of education: the importance of creativity. Creativity is a guiding factor through all facets of critical thinking skills for students. It fosters collaborative discussion, innovative solutions, and thoughtful analyses.

4. Engage in debates and discussions

Participating in debates and discussions helps you articulate your thoughts clearly and consider opposing viewpoints. It challenges students to think critically about the evidence presented, to decode arguments, and to construct logical reasoning. Look for debate and discussion opportunities in class, online forums, or extracurricular activities.

5. Look out for diverse sources of information 

In today's digital age, information is easily available from a variety of sources. Make it a habit to explore different opinions, perspectives, and sources of information. This not only broadens one's understanding of a subject but also helps in distinguishing between reliable and biased sources, honing the critical thinking skills of students.


6. Practice problem-solving

Try engaging with challenging problems, riddles, or puzzles that require critical thinking to solve. Whether it's solving mathematical equations, tackling complex scenarios in literature, or analysing data in science experiments, regular practice of problem-solving tasks sharpens your analytical skills and enhances your ability to think critically under pressure.

Nurturing critical thinking skills equips students with the tools to navigate the complexities of academia and beyond. By practicing active listening, curiosity, creativity, and problem-solving, students can create a sturdy foundation for lifelong learning. By building upon all these skills, you’ll be an expert critical thinker in no time, and you’ll be ready to conquer all that college has to offer!


Constructing a critical thinking evaluation framework for college students majoring in the humanities

Associated Data

The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author/s.

Introduction

Education for sustainable development (ESD) has focused on the promotion of sustainable thinking skills, capacities, or abilities for learners of different educational stages. Critical thinking (CT) plays an important role in the lifelong development of college students, which is also one of the key competencies in ESD. The development of a valuable framework for assessing college students’ CT is important for understanding their level of CT. Therefore, this study aimed to construct a reliable self-evaluation CT framework for college students majoring in the humanities.

Exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and item analysis were conducted to explore the reliability and validity of the CT evaluation framework. Data were collected from 642 college students majoring in the humanities, and the sample was randomly divided into two subsamples (n1 = 321, n2 = 321).

The Cronbach’s alpha coefficient for the whole scale was 0.909, and the Cronbach’s alpha coefficients for individual factors of the scale ranged from 0.724 to 0.878. CFA was then conducted as part of the validity study of the scale, confirming the structure of the seven-factor scale. Results indicated that the constructed evaluation framework was consistent with the collected data. CFA also confirmed a good model fit for the relevant 22 items of the college students’ CT framework (χ²/df = 3.110, RMSEA = 0.056, GFI = 0.927, AGFI = 0.902, NFI = 0.923, and CFI = 0.946).
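For readers who want to check reliability figures like these on their own questionnaire data, Cronbach's alpha is straightforward to compute. A minimal sketch with NumPy (illustrative only; the paper does not publish its computation code):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # sample variance per item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: three respondents answering two perfectly consistent items,
# which yields the maximum alpha of 1.0.
toy = np.array([[1, 1], [2, 2], [3, 3]])
print(cronbach_alpha(toy))  # 1.0
```

Values like the paper's 0.909 for the whole scale indicate that the items covary strongly, i.e. they appear to measure the same underlying construct.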

These findings revealed that the CT abilities self-evaluation scale is a valid and reliable instrument for measuring the CT abilities of college students in the humanities. The college students’ CT self-evaluation framework includes three dimensions: discipline cognition (DC), CT disposition, and CT skills. CT disposition consists of motivation (MO), attention (AT), and open-mindedness (OM), while CT skills include clarification skills (CS), organization skills (OS), and reflection (RE). This framework can thus be an effective instrument to support the measurement of college students’ CT. Suggestions are also put forward regarding how to apply the instrument in future studies.

Nowadays, individuals should be equipped with the abilities of identifying problems, in-depth thinking, and generating effective solutions to cope with various risks and challenges caused by the rapid development of science and technology ( Arisoy and Aybek, 2021 ). In this context, critical thinking (CT) is gaining increasing attention. Promoting college students’ CT is an important way of improving their abilities of problem solving and decision making to further enhance their lifelong development ( Feng et al., 2010 ). Although human beings are not born with CT abilities ( Scriven and Paul, 2005 ), they can be acquired through learning and training, and are always sustainable ( Barta et al., 2022 ).

Especially in the field of education, CT should be valued ( Pnevmatikos et al., 2019 ). Students should be good thinkers who possess the abilities of applying critical evaluation, finding, and collating evidence for their views, as well as maintaining a doubting attitude regarding the validity of facts provided by their teachers or other students ( Sulaiman et al., 2010 ). Many countries have regarded the development of students’ CT as one of the fundamental educational goals ( Flores et al., 2012 ; Ennis, 2018 ). CT is helpful for students to develop their constructive, creative, and productive thinking, as well as to foster their independence ( Wechsler et al., 2018 ; Odebiyi and Odebiyi, 2021 ). It also provides the power to broaden their horizons ( Les and Moroz, 2021 ). Meanwhile, when college students have a high level of CT abilities, they will likely perform better in their future careers ( Stone et al., 2017 ; Cáceres et al., 2020 ). Therefore, college students should be capable of learning to access knowledge, solve problems, and embrace different ideas to develop their CT ability ( Ulger, 2018 ; Arisoy and Aybek, 2021 ).

Due to the significant meaningfulness of CT abilities at all education levels and in various disciplines, how to cultivate students’ CT abilities has been the focus of CT-related research ( Fernández-Santín and Feliu-Torruella, 2020 ). Many studies have shown that inquiry-based learning activities or programs are an effective way to exercise and enhance students’ CT abilities ( Thaiposri and Wannapiroon, 2015 ; Liang and Fung, 2020 ; Boso et al., 2021 ; Chen et al., 2022 ). Students not only need the motivation and belief to actively participate in such learning activities and to commit to problem solving, but also need the learning skills to cope with the problems that may be encountered in problem-solving oriented learning activities. These requirements are in line with the cultivation of students’ CT abilities. Meanwhile, research has also indicated that there is an interrelationship between problem solving and CT ( Dunne, 2015 ; Kanbay and Okanlı, 2017 ).

However, another important issue is how to test whether learning activities contribute to improving the level of students’ CT abilities. It is effective to measure students’ CT abilities through using CT measurement instruments. Some CT measurement frameworks have been developed to cope with the need to cultivate CT abilities in teaching and learning activities ( Saad and Zainudin, 2022 ). However, there are still some imperfections in these existing CT evaluation frameworks. For example, most studies on college students’ CT are in the field of science, with very little research on students in the humanities, and even less on specifically developing CT assessment frameworks for college students in the humanities. Only Khandaghi et al. (2011) conducted a study on the CT disposition of college students in the humanities, and the result indicated that their CT abilities were at an intermediate level. However, there are few descriptions of college students’ CT with a background in humanities disciplines. Compared to humanities disciplines, science disciplines seem to place more emphasis on logical and rational thinking, which might cater more to the development of CT abilities ( Li, 2021 ). However, it is also vital for college students in the humanities to engage in rational thinking processes ( Al-Khatib, 2019 ). Hence, it is worth performing CT abilities evaluations of college students in the humanities by constructing a CT evaluation framework specifically for such students. In addition, previous measurements of CT have tended to be constructed according to one dimension of CT only, either CT skills or CT disposition. CT skills and disposition are equally important factors, and the level of CT abilities can be assessed more comprehensively and accurately by measuring both dimensions simultaneously. 
Therefore, the purpose of this study was to develop a self-evaluation CT framework for college students that integrates both CT skills and disposition dimensions to comprehensively evaluate the CT ability of college students in the humanities.

Literature review

CT of college students in the humanities

CT is hardly a new concept, as it can be traced back 2,500 years to the dialogs of Socrates ( Giannouli and Giannoulis, 2021 ). In How We Think, Dewey (1933, p. 9; first edition, 1910) noted that thinking critically can help us move forward in our thinking. Subsequently, researchers have explained CT from different perspectives. Some think that CT means thinking with logic and reasonableness ( Mulnix and Mulnix, 2010 ), while others suggest that CT refers to the specific learning process in which learners need to think critically to achieve learning objectives through decision making and problem solving ( Ennis, 1987 ).

Generally, for a consensus, CT involves two aspects: CT skills and CT disposition ( Bensley et al., 2010 ; Sosu, 2013 ). CT skills refer to the abilities to understand problems and produce reasonable solutions to problems, such as analysis, interpretation, and the drawing of conclusions ( Chan, 2019 ; Ahmady and Shahbazi, 2020 ). CT disposition emphasizes the willingness of individuals to apply the skills mentioned above when there is a problem or issue that needs to be solved ( Chen et al., 2020 ). People are urged by CT disposition to engage in a reflective, inferential thinking process about the information they receive ( Álvarez-Huerta et al., 2022 ), and then in specific problem-solving processes, specific CT skills would be applied. CT disposition is the motivation for critical behavior and an important quality for the learning and use of critical skills ( Lederer, 2007 ; Jiang et al., 2018 ).

For college students, the cultivation of CT abilities is usually based on specific curricula ( O'Reilly et al., 2022 ). Hence, many studies of students' CT have been conducted across disciplines. For example, in science education, Ma et al.'s (2021) study confirmed a significant relationship between CT and science achievement, suggesting that it might be valuable to treat fostering CT as an important outcome of science education. In political science, when developing college students' CT, teachers should focus not only on the development of skills but also on meta-awareness ( Berdahl et al., 2021 ), which emphasizes the importance of CT disposition: learners not only need to acquire CT skills, such as analysis, inference, and interpretation, but also need a clear cognition of how to apply these skills at a cognitive level. Duro et al. (2013) found that psychology students valued explicit CT training. For students majoring in mathematics, Basri and Rahman (2019) developed an assessment framework to investigate students' CT when solving mathematical problems. This body of work reflects the significant importance of CT for the development of students in various disciplines. However, most studies on CT have been conducted in science-oriented subjects, such as mathematics, business, and nursing ( Kim et al., 2014 ; Siew and Mapeala, 2016 ; Basri and Rahman, 2019 ), and there have been few studies on the CT of students in the humanities ( Ennis, 2018 ).

There is a widespread stereotype that, compared with humanities students, science majors are more logical, so more attention should be paid to their CT ( Lin, 2016 ). This begs the question: are all students in the humanities (e.g., history, pedagogy, Chinese language and literature, and so on) purely intuitive or "romantic"? Do they not also need to develop independent, logical, critical thinking? Can they depend on "romantic" thinking alone? This may be a prejudice. In fact, the humanities are subjects that focus on human culture and society ( Lin, 2020 ), and they should be seen as an end rather than as a tool. Academic literacy in the humanities needs to be developed and enhanced through a long-term, subtle learning process ( Bhatt and Samanhudi, 2022 ), and its significance for individuals is profound. Hence, the humanities and the sciences play equally important roles in an individual's lifelong development. What, then, should students majoring in humanities subjects do to develop and enhance their professional competence? Chen and Wei (2021) suggested that individuals in the humanities should be able to identify and tackle unstructured problems in order to adapt to changing environments, a suggestion in line with a developmental pathway for fostering CT. Therefore, developing CT abilities is an important way to foster the humanistic literacy of students in the humanities. Specifically, such students need the abilities to think independently and questioningly, to read independently, and to interpret texts in depth and in multiple senses. They also need to learn and understand the content of texts and evaluate the views of others in order to expand the breadth of their thinking ( Barrett, 2005 ). Moreover, they need the ability to analyze issues dialectically and rationally, and to continually reflect on themselves and offer constructive comments ( Klugman, 2018 ; Dumitru, 2019 ).
Collegiate CT skills are taught via independent courses or embedded modules ( Zhang et al., 2022 ), and the humanities are no exception. Yang (2007) designed thematic history projects, delivered as independent courses, to foster students' disposition toward CT in the subject of history, and the results showed that the projects supported learners' development of historical literacy and CT. In short, the humanities also play an important role in fostering the development and enhancement of college students' CT, esthetic appreciation and creativity, and cultural heritage and understanding ( Jomli et al., 2021 ). Good CT therefore also plays a crucial role in the lifelong development of students in the humanities.

An accurate assessment of the level of CT abilities is an important prerequisite for targeted improvement of students’ CT abilities in special disciplines ( Braeuning et al., 2021 ). Therefore, it might be meaningful to construct a self-evaluation CT framework for college students in the humanities according to their professional traits.

Evaluating college students’ CT

Given that CT can be cultivated ( Butler et al., 2017 ), much attention has been paid to improving the level of students' CT abilities in instruction and learning ( Araya, 2020 ; Suh et al., 2021 ). However, it is also important to examine how CT can be better assessed, as the evaluation of thinking helps students think at higher levels ( Kilic et al., 2020 ). Although the definitions of CT are contested ( Hashemi and Ghanizadeh, 2012 ), many researchers have reached a consensus on its main components, skills and disposition ( Bensley et al., 2016 ), and different CT evaluation frameworks have been developed according to one of these two dimensions. For example, Li and Liu (2021) developed a five-skill framework for high school students that included analysis, inference, evaluation, construction, and self-reflection. Meanwhile, in recent years, the assessment of CT disposition has attracted the interest of a growing number of researchers. Sosu (2013) developed the Critical Thinking Disposition Scale (CTDS), which includes two dimensions: critical openness and reflective skepticism. The specific taxonomies of the evaluation frameworks of CT skills and dispositions are shown in Table 1 . As Table 1 illustrates, some universal core items describe CT skills: for the CT skills dimension, the sub-dimensions of interpretation, analysis, inference, and evaluation are important components. These CT skills are usually applied along with the general process of learning activities ( Hsu et al., 2022 ). For instance, at the beginning of learning activities, students should gain a clear understanding of the issues raised and the knowledge required by applying interpretation skills. Likewise, some universal core items describe CT dispositions, such as open-mindedness, attentiveness, flexibility, and curiosity.

Taxonomies of the evaluation framework of CT skills and dispositions.

For a good critical thinker, it is equally important to have both CT disposition and CT skills. Students need the awareness to apply CT abilities when thinking about problem solving, and subsequently must be able to utilize a variety of CT skills in specific problem-solving processes. Therefore, we argue that designing a CT self-evaluation framework that integrates the two dimensions will provide a more comprehensive assessment of college students' CT. In terms of CT disposition, motivation, attentiveness, and open-mindedness were included as the three sub-dimensions. Motivation is an important prerequisite for all thinking activities ( Rodríguez-Sabiote et al., 2022 ); especially in problem-solving-oriented learning activities, the development of CT abilities is significantly influenced by the level of motivation ( Berestova et al., 2021 ). Attentiveness refers to the learner's state of concentration during the learning process, which reflects the learner's commitment to learning and plays a crucial role in the development of CT abilities. Open-mindedness requires learners to stay open to the views of others when engaging in learning activities. These three sub-dimensions reflect learners' disposition to think critically. Especially in the humanities, it is only through in-depth communication between learners that a clash of minds and an improvement in abilities can take place ( Liu et al., 2022 ); it is therefore essential that learners maintain a high level of motivation, attentiveness, and open-mindedness in this process to develop their CT abilities. In terms of CT skills, three sub-dimensions were also selected to measure the level of learners' CT skills, namely clarification skills, organization skills, and reflection.
In the humanities, the abilities to understand, analyze, and describe literature and problems comprehensively and exactly are essential for students ( Chen and Wei, 2021 ). These are followed by the ability to extract key information about the problem, to organize and process it, and to structure the information with the help of organizational tools such as diagrams and mind maps. Finally, the whole process of problem solving is reflected upon and evaluated ( Ghanizadeh, 2016 ); research has shown that reflection learning interventions can significantly improve learners' CT abilities ( Chen et al., 2019 ).

Research purpose

CT plays an important role in college students' academic and lifelong career development ( Din, 2020 ). Current approaches to measuring college students' CT can be improved in two main ways.

Firstly, previous studies have paid insufficient attention to the discipline cognition related to CT. Generally, students' CT abilities can be cultivated in two contexts: the subject-specific instructional context and the general-skills instructional context ( Ennis, 1989 ; Swartz, 2018 ). In authentic teaching and learning contexts, the generation and development of CT usually takes place in problem-oriented learning activities ( Liang and Fung, 2020 ), in which students need to achieve their learning objectives by identifying problems and solving them. According to Willingham (2007) , to think critically one must have a sound knowledge base of the problem or topic of enquiry and view it from multiple perspectives. Because disciplines differ in nature, the format of specific learning activities should also vary. Hence, adequate cognition of the discipline is an important prerequisite for learning activities; meanwhile, college students' level of cognition regarding their discipline should also be an important assessment criterion for understanding their own level of CT abilities. Cognition refers to the acquisition of knowledge through mental activity (e.g., forming concepts, perceptions, judgments, or imagination; Colling et al., 2022 ), and learners' thinking, beliefs, and feelings affect how they behave ( Han et al., 2021 ). By analogy, discipline cognition refers to an individual's understanding of their discipline's backgrounds and knowledge ( Flynn et al., 2021 ), and cognition should be an important variable in CT instruction ( Ma and Luo, 2020 ). In the current study, we therefore added the dimension of discipline cognition to the self-evaluation CT framework of college students in the humanities.
What is more, in order to represent the learning contexts of humanities disciplines, the specific descriptions of items draw on knowledge of the humanities (e.g., "I can recognize the strengths and limitations of the discipline I am majoring in" and "Through studying this subject, my understanding of the world and life is constantly developing").

Secondly, the measurement factors of CT skills and disposition should be made more specific to the humanities background. In previous studies, researchers tended to measure students' CT in terms of only one of its two dimensions. CT skills have typically been measured from perspectives such as analysis, interpretation, inference, self-regulation, and evaluation. However, in specific learning processes, how should students concretely analyze and interpret the problems they encounter, and how can they self-regulate their learning processes and evaluate their learning outcomes? These issues should also be considered in order to evaluate college students' levels of CT abilities more accurately. The current study therefore attempted to construct a CT framework in a more specific way, integrating both the disposition and skills dimensions of CT. What specific factors, then, would work well as dimensions for evaluating the CT abilities of college students in the humanities? In the current study, students' disposition to think critically is first assessed in terms of three sub-dimensions, motivation, attentiveness, and open-mindedness, to help students understand the strength of their own awareness to engage in CT ( Bravo et al., 2020 ). Motivation is an important prerequisite for all thinking activities ( Rodríguez-Sabiote et al., 2022 ) and contributes to the development of engagement, behavior, and the analysis of problems ( Berestova et al., 2021 ); a positive relationship between academic motivation and CT has also been found. Therefore, in the current study, motivation remains one of the crucial factors. The sub-dimension of attentiveness was also an important measurement factor, aimed at investigating the persistence of attention.
Attentiveness also has a positive influence on a variety of student behaviors ( Reynolds, 2008 ), while the sub-dimension of open-mindedness mainly assesses college students' flexibility of thinking, which is also an important factor of CT ( Southworth, 2020 ). A good critical thinker should be receptive to views that might challenge their own prior beliefs, with an open-minded attitude ( Southworth, 2022 ). Secondly, college students' CT skills were assessed in the three sub-dimensions of clarification skills, organization skills, and reflection, with the aim of understanding how well students use CT skills in the problem-solving process ( Tumkaya et al., 2009 ). The three sub-dimensions of CT skills selected in this framework are consistent with the specific learning process of problem solving, which begins with a clear description and understanding of the problem, i.e., clarification skills. In the humanities, it is an essential competence for students to understand, analyze, and describe literature and problems comprehensively and exactly ( Chen and Wei, 2021 ).

We thus constructed a model for evaluating the CT of college students in the humanities (see Figure 1 ). The proposed evaluation framework incorporates three dimensions: discipline cognition (DC), CT disposition, and CT skills. Among them, CT disposition includes the three sub-dimensions of motivation (MO), attention (AT), and open-mindedness (OM), while CT skills include the three sub-dimensions of clarification skills (CS), organization skills (OS), and reflection (RE). In other words, this study aimed to construct a seven-dimensional evaluation framework and to test whether it is an effective instrument for measuring the CT of college students in the humanities.


A model for evaluating the CT abilities of college students in the humanities.

Materials and methods

Research design

In order to address the two problems of the existing college students' CT evaluation frameworks mentioned above, a CT self-evaluation framework for college students in the humanities was preliminarily developed in this study, including the following seven factors: discipline cognition (2 items), motivation (5 items), attentiveness (5 items), open-mindedness (5 items), clarification skills (3 items), organization skills (3 items), and reflection (4 items).

Then, to ensure the content validity of the measurement framework, four experts who have studied CT and five teachers working in the humanities were invited to review all items and give feedback. The research team compared the similarities and differences in the expert opinions and made joint decisions. Meanwhile, to ensure the accessibility, accuracy, and objectivity of the items, 25 college students majoring in the humanities participated in a pretest, and the presentation and description of the items were improved according to their feedback. Finally, a questionnaire consisting of 30 items was constructed, including three items on participants' socio-demographic information (gender, grade, and subject), two on discipline cognition, five on motivation, five on attentiveness, five on open-mindedness, three on clarification skills, three on organization skills, and four on reflection (as shown in Table 2 ). Each item used a 5-point Likert scale (5 = strongly agree, 4 = agree, 3 = neutral, 2 = disagree, 1 = strongly disagree).

Dimensions and items of the college students’ CTS evaluation framework.

Participants and data collection

In the current study, simple random sampling was adopted, and the online questionnaire was uploaded to Questionnaire Star (accessed on 18 March 2022), a professional online survey tool widely used in China ( Sarjinder, 2003 ). The link to the online questionnaire was sent to humanities teachers at several colleges in Jiangsu, China, and the teachers then forwarded the link to their students. In the first part of the questionnaire, students were told that they were participating in an anonymous study, the content of which might be published without any commercial use; if they did not want to participate, they could leave the online questionnaire's website. Students who agreed to participate filled in the questionnaire. In addition, to ensure the reliability of the subsequent data analysis, the ratio of the number of questionnaire items to the number of participants should be at least 1:5, and the larger the sample the better ( Gorsuch, 1983 ). Eventually, 654 college students agreed to take part in the study and completed the online questionnaire. After deleting questionnaires with the same answer for all items or overly short response times, 642 valid samples remained, an effective rate of 98.2%.
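The screening rule described above (dropping questionnaires that give the same answer to every item or are finished implausibly fast) can be sketched as follows. The record fields and the 60-second threshold are illustrative assumptions; the paper does not report its exact cutoff.

```python
# Hypothetical screening of raw Likert responses: drop questionnaires in which
# every item received the same answer (straight-lining), or whose completion
# time falls below a plausibility threshold.

def screen_responses(responses, min_seconds=60):
    """responses: list of dicts with 'answers' (list of ints) and 'seconds'."""
    valid = []
    for r in responses:
        straight_lined = len(set(r["answers"])) == 1  # same answer everywhere
        too_fast = r["seconds"] < min_seconds
        if not (straight_lined or too_fast):
            valid.append(r)
    return valid

raw = [
    {"answers": [5, 4, 3, 4, 5], "seconds": 310},  # keep
    {"answers": [3, 3, 3, 3, 3], "seconds": 280},  # straight-lined -> drop
    {"answers": [4, 5, 4, 3, 4], "seconds": 20},   # too fast -> drop
]
clean = screen_responses(raw)
print(len(clean))  # 1
```

The retained records divided by the raw count gives the effective rate the paper reports (98.2% for its 654 submissions).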

The recruited effective sample comprised 642 participants, of whom 67.4% were female ( n  = 433), and 32.6% were male ( n  = 209). Sophomores ( n  = 215, 33.5%) and juniors ( n  = 249, 38.8%) made up most of the total number of participants. Meanwhile, the current study aimed to construct a CT framework for college students in the humanities field; hence, all participants were students in humanities disciplines, such as history ( n  = 187, 29.1%), educational history ( n  = 78, 12.2%), philosophy ( n  = 97, 15.1%), Chinese language and literature ( n  = 221, 34.4%), and pedagogy ( n  = 59, 9.2%). The specific socio-demographic information is shown in Table 3 .

Socio-demographic profile of respondents.

Data analysis

To construct an evaluation framework of college students' CT and to confirm its reliability and validity, exploratory factor analysis (EFA), confirmatory factor analysis (CFA), and item analysis were carried out. The 642 samples were first randomly assigned to two groups of 321 each ( Yurdakul et al., 2012 ) to avoid inflation of the Cronbach's alpha value or other effects ( Devellis, 2011 ). EFA was applied to the first group in order to determine the underlying factor structure of the CT evaluation framework and to make decisions about item retention ( Kieffer, 1988 ); during this process, principal component analysis (PCA) was applied as the factor extraction technique ( Vogel et al., 2009 ). CFA was then used to confirm the factor structure of the scale with the second group of 321 samples ( Kline, 2005 ). Lastly, all samples were analyzed to test the differentiation and suitability of the items ( Yurdakul et al., 2012 ). SPSS 18.0 and AMOS 24.0 were applied to analyze the collected data.
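The random split into EFA and CFA halves can be sketched as follows; the sample sizes follow the paper's design, while the seed and shuffling are illustrative.

```python
import random

# Illustrative reproduction of the sample-splitting step: 642 case indices are
# shuffled and divided into two disjoint groups of 321, one reserved for EFA
# and the other for CFA. The seed is arbitrary.
random.seed(42)
indices = list(range(642))
random.shuffle(indices)
efa_idx, cfa_idx = indices[:321], indices[321:]
print(len(efa_idx), len(cfa_idx), len(set(efa_idx) & set(cfa_idx)))  # 321 321 0
```

Splitting before any psychometric analysis keeps the EFA and CFA samples independent, which is the point of the two-group design.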

SPSS 22.0 was used for conducting EFA, and the maximum variance method was adopted for factor rotation.

Reliability analysis of the scale

Prior to the EFA, sample variance and sample size were evaluated. Bartlett's Test of Sphericity was significant ( χ 2  = 9162.198; p  < 0.001), confirming that the correlation matrix was suitable for factor analysis. The Cronbach's alpha value ( Pallant, 2007 ) was then used to evaluate the reliability of the scale, and the results showed that the whole scale had good reliability ( α  = 0.909). Specifically, the Cronbach's alpha values of the seven factors were 0.724 (DC), 0.771 (MO), 0.878 (AT), 0.839 (OM), 0.819 (CS), 0.755 (OS), and 0.878 (RE), indicating their reliability. The Kaiser–Meyer–Olkin (KMO) value of the questionnaire was 0.907, showing the appropriateness of the EFA ( Kaiser, 1974 ).
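As a rough illustration of the reliability statistic used here, Cronbach's alpha can be computed directly from its definition; the demo responses below are invented, not the study's data.

```python
from statistics import variance

# Cronbach's alpha from its definition:
# alpha = (k/(k-1)) * (1 - sum(item variances) / variance(total score)).

def cronbach_alpha(items):
    """items: list of respondent rows, each a list of k item scores."""
    k = len(items[0])
    item_vars = sum(variance(col) for col in zip(*items))
    total_var = variance([sum(row) for row in items])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented 5-point responses from five respondents on a four-item factor.
demo = [[5, 4, 5, 4], [4, 4, 4, 3], [2, 2, 3, 2], [5, 5, 4, 5], [3, 2, 2, 3]]
print(round(cronbach_alpha(demo), 3))  # 0.94
```

Values around 0.7 or above are conventionally read as acceptable internal consistency, which is the benchmark the per-factor alphas above are judged against.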

Validity analysis of the scale

To confirm the validity of the evaluation dimensions, PCA was applied to extract factors, and maximum variance (varimax) rotation was used for the EFA; seven factors were finally obtained. Kieffer (1988) suggested applying two rotation strategies for EFA, so both oblique and orthogonal rotation were used; if the two methods yield similar results, the results of the orthogonal rotation can be adopted. In the current study, the results of the two methods showed no significant difference, so the results of the varimax orthogonal rotation were used. MO5, OM4, and OM5 were removed because their maximum factor loadings were not in line with their initial evaluation dimension ( Conway and Huffcutt, 2016 ). In addition, factors with an eigenvalue higher than 1 were retained, and items with a factor loading of less than 0.4 or with inconsistent content were removed through multiple orthogonal rotations ( Zhao et al., 2021 ). In the end, 25 items with independent factor loadings greater than 0.5 were retained ( Fabrigar et al., 1999 ). Table 4 presents the results of the component transformation matrix. Finally, seven factors were selected, with a cumulative variance contribution of 71.413% ( Conway and Huffcutt, 2016 ). The eigenvalues and cumulative variance contributions of the seven factors are shown in Table 5 .
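The eigenvalue-greater-than-1 (Kaiser) retention rule can be illustrated on a small hypothetical correlation matrix with two clear item clusters; the matrix below is invented for demonstration and is unrelated to Table 4.

```python
import numpy as np

# Kaiser criterion sketch: retain factors whose eigenvalues of the item
# correlation matrix exceed 1. This 6x6 matrix has two three-item clusters
# (within-cluster r = 0.6, between-cluster r = 0.1), so two factors survive.
corr = np.array([
    [1.0, 0.6, 0.6, 0.1, 0.1, 0.1],
    [0.6, 1.0, 0.6, 0.1, 0.1, 0.1],
    [0.6, 0.6, 1.0, 0.1, 0.1, 0.1],
    [0.1, 0.1, 0.1, 1.0, 0.6, 0.6],
    [0.1, 0.1, 0.1, 0.6, 1.0, 0.6],
    [0.1, 0.1, 0.1, 0.6, 0.6, 1.0],
])
eigvals = sorted(np.linalg.eigvalsh(corr), reverse=True)
n_retained = sum(v > 1 for v in eigvals)
print(n_retained)  # 2
```

In the study itself, the same rule applied to the 27-item correlation matrix yields the seven retained factors.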

The factor analysis of college students’ CT framework ( N  = 321).

The eigenvalues and contribution rates of the seven factors in the model.

First-order CFA was adopted to determine the fit, convergent validity, and discriminant validity of the framework ( Kline, 2005 ). CFA was used to explore the relationships between the factors and thereby construct the evaluation framework of humanities college students' CT.

Fitting validity analysis for the framework

As shown in Figure 2 , first-order CFA was conducted. According to Hair et al. (2014) , items that do not meet the standard loading (<0.5) must be eliminated. Absolute and relative fit indexes were applied to verify the framework fit. The chi-square/ df in this research was 3.651, and the RMSEA was 0.044 (<0.08; Liu et al., 2021 ). In addition, the goodness-of-fit index (GFI) and adjusted goodness-of-fit index (AGFI) were 0.923 and 0.906, respectively, both meeting the reference standard proposed by Foster et al. (1993) . Moreover, consistent with the recommendations of Hair et al. (2014) , the normed fit index (NFI), comparative fit index (CFI), incremental fit index (IFI), and relative fit index (RFI) were 0.975, 0.982, and 0.972 (all >0.9). In addition, the parsimony normed fit index (PNFI) and parsimony goodness-of-fit index (PGFI) were both above 0.5. These results indicated the good fit of the framework ( Table 6 ).
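For readers who want to sanity-check such indices, the chi-square/df ratio and RMSEA follow directly from the chi-square statistic, its degrees of freedom, and the sample size. The inputs below are hypothetical, not the values from this model.

```python
import math

# RMSEA = sqrt(max(chi2 - df, 0) / (df * (N - 1))); values <= 0.08 are
# conventionally read as acceptable fit.

def rmsea(chi2, df, n):
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

chi2, df, n = 300.0, 100, 321  # hypothetical CFA output
print(round(chi2 / df, 3), round(rmsea(chi2, df, n), 3))  # 3.0 0.079
```

When the chi-square statistic falls below its degrees of freedom, the formula clamps at zero, which is why well-fitting models can report RMSEA = 0.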


The first-order CFA model.

The fitting index of the evaluation framework.

Convergence validity analysis for the framework

The CFA results are shown in Table 7 . Composite reliability (CR) and average variance extracted (AVE) were used to test the construct validity of the framework. According to Hair et al. (2014) , the CR values of all constructs should be more than 0.7; thus, the CR of the 22 remaining items was good. Furthermore, Fornell and Larcker (1981) pointed out that if the AVE is higher than 0.5, the framework shows good convergence validity. The results in Table 7 therefore show that this evaluation framework has high validity and is reasonable.
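CR and AVE follow standard formulas over a construct's standardized loadings; the loadings below are hypothetical placeholders, not the values in Table 7.

```python
# CR  = (sum λ)^2 / ((sum λ)^2 + sum(1 - λ^2))
# AVE = sum(λ^2) / n, for standardized loadings λ of one construct.

def cr_ave(loadings):
    s = sum(loadings)
    sq = sum(l * l for l in loadings)
    cr = s * s / (s * s + (len(loadings) - sq))
    ave = sq / len(loadings)
    return cr, ave

# Hypothetical standardized loadings for a four-item construct.
cr, ave = cr_ave([0.78, 0.81, 0.74, 0.69])
print(round(cr, 3), round(ave, 3))  # 0.842 0.572
```

Both example values clear the conventional cutoffs (CR > 0.7, AVE > 0.5) that the paper applies.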

Results of the confirmatory factor analysis.

Discriminant validity analysis of the framework

The discriminant validity of the framework was tested via the correlation matrix among dimensions. Schumacker and Lomax (2016) proposed that, in the structural discriminant validity analysis of instruments, the square root of each factor's AVE must exceed the absolute value of the Pearson correlation coefficient between that factor and every other factor for discriminant validity to be established. As shown in Table 8 , the results of the structural discriminant validity analysis indicated that this framework had good discriminant validity.
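The Fornell–Larcker comparison reduces to a simple check: each factor's √AVE must exceed its absolute correlations with every other factor. The AVEs and correlation matrix below are illustrative, not the values in Table 8.

```python
from math import sqrt

# Fornell-Larcker discriminant validity check: sqrt(AVE) of each factor must
# exceed that factor's absolute correlations with all other factors.

def fornell_larcker_ok(ave, corr):
    for i in range(len(ave)):
        root = sqrt(ave[i])
        if any(abs(corr[i][j]) >= root for j in range(len(ave)) if j != i):
            return False
    return True

ave = [0.58, 0.62, 0.55]          # hypothetical AVEs for three factors
corr = [[1.00, 0.41, 0.37],       # hypothetical inter-factor correlations
        [0.41, 1.00, 0.44],
        [0.37, 0.44, 1.00]]
print(fornell_larcker_ok(ave, corr))  # True
```

If any inter-factor correlation exceeded a factor's √AVE, the two factors would share more variance with each other than with their own items, undermining discriminant validity.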

The results of interrelated coefficient matrix and square roots of AVE.

***Significant at the 0.001 level; **Significant at the 0.01 level; *Significant at the 0.05 level.

Item analysis

Item analysis was conducted to determine how well the items discriminate between college students with high and low CT abilities. To this end, item discrimination statistics were calculated from the differences between the lowest-scoring 27% and the highest-scoring 27% of participants, determined according to the scores of each item and the total scores of the scale ( Aridag and Yüksel, 2010 ). First, total scale scores were calculated for every participant and ranked from highest to lowest. Of all the participants in the study group ( N  = 642), the 27% (174) with the highest scores formed the high group, and the 27% with the lowest scores formed the low group. An independent-samples t -test was applied to test the difference between the mean scores of the two groups; the results are presented in Table 9 . Further, items whose dimensional Pearson correlation coefficients or standardized factor loadings did not reach the standard values (0.4 and 0.45, respectively) were eliminated. Finally, for the remaining 22 items, the decision values (critical ratios) were higher than 0.3, and the item-total correlations were higher than 0.4. Overall, the item analysis results showed that the remaining 22 items reached the standard.
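A miniature sketch of the high–low 27% procedure follows, with a Welch-form t statistic standing in for the independent-samples t-test; all scores are invented.

```python
from statistics import mean, variance
from math import sqrt

# High-low 27% item discrimination: rank respondents by total score, take the
# top and bottom 27%, and compare the item means of the two groups.

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

totals = [55, 80, 62, 91, 47, 73, 66, 88, 52, 77, 60, 84]   # 12 total scores
item = [2, 4, 3, 5, 1, 4, 3, 5, 2, 4, 3, 4]                 # one item's scores
order = sorted(range(len(totals)), key=lambda i: totals[i])
n27 = round(0.27 * len(totals))                              # group size
low = [item[i] for i in order[:n27]]
high = [item[i] for i in order[-n27:]]
print(n27, round(welch_t(high, low), 2))  # 3 6.36
```

A large positive t means high scorers also endorse the item more strongly, i.e., the item discriminates well; the paper's threshold of 0.3 for the decision values is applied to such statistics over all 642 respondents.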

t -test results for the item means of the high-low-27% group.

CT is one of the key competencies that college students need to acquire ( Bandyopadhyay and Szostek, 2019 ). This study aimed to construct a self-evaluation CT framework for college students in the humanities. In the initial framework, three dimensions and 27 items were conceived; EFA was then conducted, and items with independent factor loadings below 0.5 were excluded ( Fabrigar et al., 1999 ). As a result, 25 items were retained for CFA, which showed that three further items should be eliminated because of their low standardized loadings (less than 0.5). The resulting 22-item evaluation model had an acceptable fit, and good convergence and discriminant validity were demonstrated by calculating CR, AVE, and the square roots of AVE. Finally, to verify the suitability and distinctiveness of the constructed items, item analysis was conducted: for the remaining 22 items, the decision values (critical ratios) were higher than 0.3, and the item-total correlations were higher than 0.4, so all 22 items reached the standard. The final self-evaluation CT framework is therefore a 22-item instrument measuring three dimensions and six sub-dimensions: discipline cognition, CT disposition (open-mindedness, motivation, and attentiveness), and CT skills (reflection, organization skills, and clarification skills).

Compared to previous studies on the construction of CT assessment frameworks, this study focused on three important points: the CT abilities of college students majoring in the humanities were the focus; both CT skills and CT dispositions were included; and more specific dimensions of CT were the core measurement factors. In previous CT assessment frameworks, students in science-oriented disciplines (mathematics, business, nursing, engineering, etc.) were often the main subjects of study ( Kim et al., 2014 ; Michaluk et al., 2016 ; Siew and Mapeala, 2016 ; Basri and Rahman, 2019 ), while college students majoring in the humanities have received less attention. However, CT, as a guide of belief and action ( Gyenes, 2021 ), is an important ability for college students in all fields ( Davies, 2013 ; Zhang et al., 2022 ). Research has shown that independent thinking skills are valuable indicators of students' discipline-specific abilities in humanities subjects ( Bertram et al., 2021 ), and college students in the humanities need CT abilities to identify problems and find critical solutions ( Baş et al., 2022 ). Meanwhile, the assessment instrument developed in this study added the dimension of discipline cognition, considered a prerequisite for helping college students form a clear idea of their subject background. The CT assessment framework therefore provides a practical method for teachers and learners in the humanities to investigate the level of their CT abilities. For example, in the discipline of history, thematic history projects could be applied to foster students' CT abilities in authentic history teaching contexts ( Yang, 2007 ). To verify whether such projects help to improve learners' CT abilities, this CT evaluation framework can be applied before and at the end of a project to determine whether learners' levels of CT abilities differ before and after learning.
Likewise, in the philosophy classroom, philosophical whole-class dialog can be a useful teaching strategy to activate learners to think critically about moral values ( Rombout et al., 2021 ). Learners in dialogs must take others' perspectives into account ( Kim and Wilkinson, 2019 ), which is in line with the sub-dimension of open-mindedness in the current CT evaluation framework. Hence, the CT evaluation framework can also be applied in specific disciplines.

In addition, in the current CT evaluation framework, both CT skills and CT dispositions were included, and more specific dimensions of CT were the core measurement factors. CT disposition reflects the strength of students' conviction to think and act critically; in the current evaluation instrument, the three sub-dimensions of motivation, open-mindedness, and attentiveness are its evaluation factors. The cultivation of college students' CT abilities is usually based on specific educational activities, and when college students get involved in learning activities, there are opportunities for them to foster their CT abilities ( Liu, 2014 ; Huang et al., 2022 ). An important factor influencing student engagement is motivation ( Singh et al., 2022 ), which has an important effect on college students' behavior, emotion, and cognitive processes ( Gao et al., 2022 ). Hence, it makes sense to regard motivation as a measurement factor of CT disposition, and it is crucial for college students to self-assess their motivation level first to gain a clear insight into their overall level of CT. The sub-dimension of attentiveness was also an important measurement factor, aimed at investigating the persistence of attention. Attentiveness has a positive influence on a variety of student behaviors ( Reynolds, 2008 ), while the sub-dimension of open-mindedness mainly assesses college students' flexibility of thinking, which is also an important factor of CT ( Southworth, 2020 ). A good critical thinker should be receptive to views that might challenge their own prior beliefs, with an open-minded attitude ( Southworth, 2022 ). CT skills were then assessed in the three sub-dimensions of clarification skills, organization skills, and reflection, with the aim of understanding how well students use CT skills in the problem-solving process ( Tumkaya et al., 2009 ).
The three sub-dimensions of CT skills selected in this framework are consistent with the specific learning process of problem solving, which begins with a clear description and understanding of the problem, i.e., clarification skills, followed by the ability to extract key information about the problem, to organize and process it, and to organize the information with the help of organizational tools such as diagrams and mind maps. Finally, the whole process of problem solving is reflected upon and evaluated, and research has shown that reflection learning intervention could significantly improve learners’ CT abilities ( Chen et al., 2019 ).

In summary, the self-evaluation framework of college students' CT constructed in this study focused on college students in the humanities, and the descriptions of specific items reflect the characteristics of the humanities. Moreover, because students differ in the extent to which they apply specific CT skills and are aware of how to use CT to solve problems depending on their disciplinary backgrounds (Belluigi and Cundill, 2017), the framework provides a practical pathway and a more comprehensive instrument for assessing the CT abilities of college students majoring in the humanities, as well as an entry point for researchers to further study these students' CT.

Building on a literature review of CT, this study examined the need to assess college students' CT, constructed a framework for evaluating the CT of college students in the humanities, and tested its effectiveness. EFA, CFA, and item analysis were conducted to construct a three-dimensional self-evaluation framework for college students' CT. The results indicate that the framework has good reliability and validity. The final framework comprises three dimensions (discipline cognition, CT disposition, and CT skills) and seven sub-dimensions (discipline cognition, motivation, attentiveness, open-mindedness, reflection, organization skills, and clarification skills), totaling 22 items.
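Scale-validation workflows of the kind summarized above typically report internal-consistency reliability alongside the factor analyses. As an illustrative sketch only (not the authors' actual analysis; the data below are synthetic and the function name is ours), Cronbach's alpha for a set of scale items can be computed directly from the item variances and the variance of the summed score:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for a (respondents x items) matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulate 200 respondents answering 6 items driven by one latent trait,
# so the items are correlated and alpha should be high.
rng = np.random.default_rng(42)
latent = rng.normal(size=(200, 1))
responses = latent + rng.normal(scale=0.5, size=(200, 6))
alpha = cronbach_alpha(responses)
```

A threshold of roughly 0.7 is a common rule of thumb for acceptable reliability; a full validation would pair this with EFA/CFA fit indices (e.g., CFI, RMSEA) as the study describes.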

Implications

The significance of this study is threefold. First, it constructed a CT evaluation framework for college students majoring in the humanities. The results of the EFA, CFA, and item analysis supported the reliability and validity of the three-dimensional framework, which consists of discipline cognition, CT disposition, and CT skills. The specific assessment factors not only integrate the two classic dimensions of CT (skills and disposition), making the framework more comprehensive, but also add the dimension of discipline cognition, so that measures can be developed for specific disciplinary contexts and CT can be assessed more accurately and relevantly. Second, the framework can be applied in specific instruction and learning contexts. CT is widely recognized as one of the essential abilities of the 21st century, and instructional strategies and learning activities should be purposefully designed for specific humanistic backgrounds. Before undertaking teaching activities, it is worth establishing a baseline understanding of students' CT abilities by inviting them to complete the self-evaluation instrument; after the learning activities, the same instrument can be used to evaluate how effectively those activities cultivated students' CT. Finally, the framework provides a practical pathway for assessing the CT abilities of college students majoring in the humanities and offers researchers an entry point for studying these students' CT in the future.

Limitations and future work

There are two main limitations of this study. First, the sample was drawn from one area by random sampling and cannot cover all college students in the major; larger and more representative samples will be needed in the future to assess how far the findings generalize to other populations. Second, the evaluation framework is still at the theoretical research stage and has not yet been put into practice; it should therefore be applied in further research and refined for applicability and usability according to practical feedback.

Data availability statement

Ethics statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. Written informed consent for participation was not required for this study in accordance with the national legislation and the institutional requirements.

Author contributions

QL: conceptualization. SL: methodology. SL and ST: writing—original draft preparation. SL, XG, and QL: writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the School Curriculum Ideological and Political Construction Project (no. 1812200046KCSZ2211).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

1 www.wjx.cn

  • Abrami P., Bernard R. M., Borokhovski E., Wade A., Surkes M. A., Tamim R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev. Educ. Res. 78, 1102–1134. doi: 10.3102/0034654308326084
  • Ahmady S., Shahbazi S. (2020). Impact of social problem-solving training on critical thinking and decision making of nursing students. BMC Nurs. 19:94. doi: 10.1186/s12912-020-00487-x
  • Al-Khatib O. (2019). A Framework for Implementing Higher-Order Thinking Skills (Problem-Solving, Critical Thinking, Creative Thinking, and Decision-Making) in Engineering & Humanities. Advances in Science and Engineering Technology International Conferences (ASET).
  • Álvarez-Huerta A., Muela A., Larrea I. (2022). Disposition toward critical thinking and creative confidence beliefs in higher education students: the mediating role of openness to diversity and challenge. Think. Skills Creat. 43:101003. doi: 10.1016/j.tsc.2022.101003
  • Araya A. E. M. (2020). Critical thinking for civic life in elementary education: combining storytelling and thinking tools/Pensamiento critico para la vida ciudadana en educacion primaria: combinando narrativa y herramientas de pensamiento. Educacion 44, 23–43.
  • Aridag N. C., Yüksel A. (2010). Analysis of the relationship between moral judgment competences and empathic skills of university students. Kuram ve Uygulamada Egitim Bilimleri 10, 707–724.
  • Arisoy B., Aybek B. (2021). The effects of subject-based critical thinking education in mathematics on students' critical thinking skills and virtues. Eur. J. Educ. Res. 21, 99–120. doi: 10.14689/ejer.2021.92.6
  • Bandyopadhyay S., Szostek J. (2019). Thinking critically about critical thinking: assessing critical thinking of business students using multiple measures. J. Educ. Bus. 94, 259–270. doi: 10.1080/08832323.2018.1524355
  • Barrett A. (2005). The information-seeking habits of graduate student researchers in the humanities. J. Acad. Libr. 31, 324–331. doi: 10.1016/j.acalib.2005.04.005
  • Barta A., Fodor L. A., Tamas B., Szamoskozi I. (2022). The development of students' critical thinking abilities and dispositions through the concept mapping learning method – a meta-analysis. Educ. Res. Rev. 37:100481. doi: 10.1016/j.edurev.2022.100481
  • Baş M. T., Özpulat F., Molu B., Dönmez H. (2022). The effect of decorative arts course on nursing students' creativity and critical thinking dispositions. Nurse Educ. Today:105584. doi: 10.1016/j.nedt.2022.105584
  • Basri H., Rahman A. A. (2019). Investigating critical thinking skill of junior high school in solving mathematical problem. Int. J. Instr. 12, 745–758. doi: 10.29333/iji.2019.12345a
  • Bellaera L., Weinstein-Jones Y., Ilie S., Baker S. T. (2021). Critical thinking in practice: the priorities and practices of instructors teaching in higher education. Think. Skills Creat. 41:100856. doi: 10.1016/j.tsc.2021.100856
  • Belluigi D. Z., Cundill G. (2017). Establishing enabling conditions to develop critical thinking skills: a case of innovative curriculum design in environmental science. Environ. Educ. Res. 23, 950–971. doi: 10.1080/13504622.2015.1072802
  • Bensley D. A., Crowe D. S., Bernhardt P., Buckner C., Allman A. L. (2010). Teaching and assessing critical thinking skills for argument analysis in psychology. Teach. Psychol. 37, 91–96. doi: 10.1080/00986281003626656
  • Bensley D. A., Rainey C., Murtagh M. P., Flinn J. A., Maschiocchi C., Bernhardt P. C., et al. (2016). Closing the assessment loop on critical thinking: the challenges of multidimensional testing and low test-taking motivation. Think. Skills Creat. 21, 158–168. doi: 10.1016/j.tsc.2016.06.006
  • Bentler P. M. (1990). Comparative fit indexes in structural models. Psychol. Bull. 107, 238–246. doi: 10.1037/0033-2909.107.2.238
  • Berdahl L., Hoessler C., Mulhall S., Matheson K. (2021). Teaching critical thinking in political science: a case study. J. Political Sci. Educ. 17, 910–925. doi: 10.1080/15512169.2020.1744158
  • Berestova A., Kolosov S., Tsvetkova M., Grib E. (2021). Academic motivation as a predictor of the development of critical thinking in students. J. Appl. Res. High. Educ. 14, 1041–1054. doi: 10.1108/JARHE-02-2021-0081
  • Bertram C., Weiss Z., Zachrich L., Ziai R. (2021). Artificial intelligence in history education. Linguistic content and complexity analyses of student writings in the CAHisT project (computational assessment of historical thinking). Comput. Educ. Artif. Intell. 100038. doi: 10.1016/j.caeai.2021.100038
  • Bhatt I., Samanhudi U. (2022). From academic writing to academics writing: transitioning towards literacies for research productivity. Int. J. Educ. Res. 111:101917. doi: 10.1016/j.ijer.2021.101917
  • Boso C. M., Van Der Merwe A. S., Gross J. (2021). Students' and educators' experiences with instructional activities towards critical thinking skills acquisition in a nursing school. Int. J. Afr. Nurs. Sci. 14:100293. doi: 10.1016/j.ijans.2021.100293
  • Braeuning D., Hornung C., Hoffmann D., Lambert K., Ugen S., Fischbach A., et al. (2021). Cognitive Development 58:101008. doi: 10.1016/j.cogdev.2021.101008
  • Bravo M. J., Galiana L., Rodrigo M. F., Navarro-Pérez J. J., Oliver A. (2020). An adaptation of the critical thinking disposition scale in Spanish youth. Think. Skills Creat. 38:100748. doi: 10.1016/j.tsc.2020.100748
  • Butler H. A., Pentoney C., Bong M. P. (2017). Predicting real-world outcomes: critical thinking ability is a better predictor of life decisions than intelligence. Think. Skills Creat. 25, 38–46. doi: 10.1016/j.tsc.2017.06.005
  • Cáceres M., Nussbaum M., Ortiz J. (2020). Integrating critical thinking into the classroom: a teacher's perspective. Think. Skills Creat. 37:100674. doi: 10.1016/j.tsc.2020.100674
  • Chan C. (2019). Using digital storytelling to facilitate critical thinking disposition in youth civic engagement: a randomized control trial. Child Youth Serv. Rev. 107:104522. doi: 10.1016/j.childyouth.2019.104522
  • Chen F., Chen S., Pai H. (2019). Self-reflection and critical thinking: the influence of professional qualifications on registered nurses. Contemp. Nurse 55, 59–70. doi: 10.1080/10376178.2019.1590154
  • Chen Q., Liu D., Zhou C., Tang S. (2020). Relationship between critical thinking disposition and research competence among clinical nurses: a cross-sectional study. J. Clin. Nurs. 29, 1332–1340. doi: 10.1111/jocn.15201
  • Chen K. L., Wei X. (2021). Boya education in China: lessons from liberal arts education in the U.S. and Hong Kong. Int. J. Educ. Dev. 84:102419. doi: 10.1016/j.ijedudev.2021.102419
  • Chen X., Zhai X., Zhu Y., Li Y. (2022). Exploring debaters and audiences' depth of critical thinking and its relationship with their participation in debate activities. Think. Skills Creat. 44:101035. doi: 10.1016/j.tsc.2022.101035
  • Colling J., Wollschlager R., Keller U., Preckel F., Fischbach A. (2022). Need for cognition and its relation to academic achievement in different learning environments. Learn. Individ. Differ. 93:102110. doi: 10.1016/j.lindif.2021.102110
  • Conway J. M., Huffcutt A. I. (2016). A review and evaluation of exploratory factor analysis practices in organizational research. Organ. Res. Methods 6, 147–168. doi: 10.1177/1094428103251541
  • Davies M. (2013). Critical thinking and the disciplines reconsidered. High. Educ. Res. Dev. 32, 529–544. doi: 10.1080/07294360.2012.697878
  • Devellis R. F. (2011). Scale Development. New York: SAGE Publications, Inc.
  • Dewey J. (1933). How We Think. Boston: D. C. Heath.
  • Din M. (2020). Evaluating university students' critical thinking ability as reflected in their critical reading skill: a study at bachelor level in Pakistan. Think. Skills Creat. 35:100627. doi: 10.1016/j.tsc.2020.100627
  • Dumitru D. (2019). Creating meaning. The importance of arts, humanities and culture for critical thinking development. Stud. High. Educ. 44, 870–879. doi: 10.1080/03075079.2019.1586345
  • Dunne G. (2015). Beyond critical thinking to critical being: criticality in higher education and life. Int. J. Educ. Res. 71, 86–99. doi: 10.1016/j.ijer.2015.03.003
  • Duro E., Elander J., Maratos F. A., Stupple E. J. N., Aubeeluck A. (2013). In search of critical thinking in psychology: an exploration of student and lecturer understandings in higher education. Psychol. Learn. Teach. 12, 275–281. doi: 10.2304/plat.2013.12.3.275
  • Dwyer C. P., Hogan M. J., Stewart I. (2014). An integrated critical thinking framework for the 21st century. Think. Skills Creat. 12, 43–52. doi: 10.1016/j.tsc.2013.12.004
  • Ennis R. H. (1962). A concept of critical thinking. Harvard Educ. Rev. 32, 81–111.
  • Ennis R. H. (1987). Critical Thinking and the Curriculum. Think. Skills Ins: Con. Tec., 40–48.
  • Ennis R. H. (1989). Critical thinking and subject specificity: clarification and needed research. Educ. Res. 18, 4–10. doi: 10.3102/0013189X018003004
  • Ennis R. H. (2018). Critical thinking across the curriculum: a vision. Topoi 37, 165–184. doi: 10.1007/s11245-016-9401-4
  • Fabrigar L. R., Wegener D. T., MacCallum R. C., Strahan E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychol. Methods 4, 272–299. doi: 10.1037/1082-989X.4.3.272
  • Facione P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations. Available at: http://files.eric.ed.gov/fulltext/ED315423.pdf (Accessed November 3, 2022).
  • Facione N. C., Facione P. A., Sanchez C. A. (1994). Critical thinking disposition as a measure of competent clinical judgment: the development of the California critical thinking disposition inventory. J. Nurs. Educ. 33, 345–350. doi: 10.3928/0148-4834-19941001-05
  • Feng R. C., Chen M. J., Chen M. C., Pai Y. C. (2010). Critical thinking competence and disposition of clinical nurses in a medical center. J. Nurs. Res. 18, 77–87. doi: 10.1097/JNR.0b013e3181dda6f6
  • Fernández-Santín M., Feliu-Torruella M. (2020). Developing critical thinking in early childhood through the philosophy of Reggio Emilia. Think. Skills Creat. 37:100686. doi: 10.1016/j.tsc.2020.100686
  • Flores K. L., Matkin G. S., Burbach M. E., Quinn C. E., Harding H. (2012). Deficient critical thinking skills among college graduates: implications for leadership. Educ. Philos. Theory 44, 212–230. doi: 10.1111/j.1469-5812.2010.00672.x
  • Flynn R. M., Kleinknecht E., Ricker A. A., Blumberg F. C. (2021). A narrative review of methods used to examine digital gaming impacts on learning and cognition during middle childhood. Int. J. Child Comput. Int. 30:100325. doi: 10.1016/j.ijcci.2021.100325
  • Fornell C., Larcker D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 18, 39–50. doi: 10.1177/002224378101800312
  • Foster J., Barkus E., Yavorsky C. (1993). Understanding and Using Advanced Statistics. New York: SAGE Publications.
  • Gao Q. Q., Cao B. W., Guan X., Gu T. Y., Bao X., Wu J. Y., et al. (2022). Emotion recognition in conversations with emotion shift detection based on multi-task learning. Knowl.-Based Syst. 248:108861. doi: 10.1016/j.knosys.2022.108861
  • Ghanizadeh A. (2016). The interplay between reflective thinking, critical thinking, self-monitoring, and academic achievement in higher education. High. Educ. 74, 101–114. doi: 10.1007/s10734-016-0031-y
  • Giannouli V., Giannoulis K. (2021). Critical thinking and leadership: can we escape modern Circe's spells in nursing? Nurs. Leadersh. (Toronto, Ont.) 34, 38–44. doi: 10.12927/cjnl.2021.26456
  • Gorsuch R. (1983). Factor Analysis. 2nd Edn. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Gyenes A. (2021). Student perceptions of critical thinking in EMI programs at Japanese universities: a Q-methodology study. J. Engl. Acad. Purp. 54:101053. doi: 10.1016/j.jeap.2021.101053
  • Hair J. F., Black W. C., Babin B. J., Anderson R. E. (2014). Multivariate Data Analysis. 7th Edn. Upper Saddle River, NJ: Pearson Prentice Hall.
  • Halpern D. F. (1998). Teaching critical thinking for transfer across domains: disposition, skills, structure training, and metacognitive monitoring. Am. Psychol. 53, 449–455. doi: 10.1037/0003-066X.53.4.449
  • Han J., Usher E. L., Brown C. S. (2021). Trajectories in quantitative and humanities self-efficacy during the first year of college. Learn. Individ. Differ. 91:102054. doi: 10.1016/j.lindif.2021.102054
  • Hashemi M. R., Ghanizadeh A. (2012). Critical discourse analysis and critical thinking: an experimental study in an EFL context. System 40, 37–47. doi: 10.1016/j.system.2012.01.009
  • Hsu F. H., Lin I. H., Yeh H. C., Chen N. S. (2022). Effect of Socratic reflection prompts via video-based learning system on elementary school students' critical thinking skills. Comput. Educ. 183:104497. doi: 10.1016/j.compedu.2022.104497
  • Huang Y. M., Silitonga L. M., Wu T. T. (2022). Applying a business simulation game in a flipped classroom to enhance engagement, learning achievement, and higher-order thinking skills. Comput. Educ. 183:104494. doi: 10.1016/j.compedu.2022.104494
  • Jiang J., Gao A., Yang B. Y. (2018). Employees' critical thinking, leaders' inspirational motivation, and voice behavior: the mediating role of voice efficacy. J. Pers. Psychol. 17, 33–41. doi: 10.1027/1866-5888/a000193
  • Jomli R., Ouertani J., Jemli H., Ouali U., Zgueb Y., Nacef F. (2021). Comparative study of affective temperaments between medical students and humanities students (evaluation by validated TEMPS-A). Eur. Psychiatry 64:S199. doi: 10.1192/j.eurpsy.2021.529
  • Kaiser H. F. (1974). An index of factorial simplicity. Psychometrika 39, 31–36. doi: 10.1007/BF02291575
  • Kanbay Y., Okanlı A. (2017). The effect of critical thinking education on nursing students' problem-solving skills. Contemp. Nurse 53, 313–321. doi: 10.1080/10376178.2017.1339567
  • Kember D., Leung D. Y. P., Jones A., Loke A. Y., Mckay J., Sinclair K., et al. (2010). Development of a questionnaire to measure the level of reflective thinking. Assess. Eval. High. Educ. 25, 381–395. doi: 10.1080/713611442
  • Khandaghi M. A., Pakmehr H., Amiri E. (2011). The status of college students' critical thinking disposition in humanities. Proc. Soc. Behav. Sci. 15, 1866–1869. doi: 10.1016/j.sbspro.2011.04.017
  • Kieffer K. M. (1998). Orthogonal Versus Oblique Factor Rotation: A Review of the Literature Regarding the Pros and Cons. In Proceedings of the Annual Meeting of the 27th Mid-South Educational Research Association, New Orleans, LA, 4 November 1998. Available at: https://files.eric.ed.gov/fulltext/ED427031.pdf (Accessed November 3, 2022).
  • Kilic S., Gokoglu S., Ozturk M. A. (2020). Valid and reliable scale for developing programming-oriented computational thinking. J. Educ. Comput. Res. 59, 257–286. doi: 10.1177/0735633120964402
  • Kim D. H., Moon S., Kim E. J., Kim Y. J., Lee S. (2014). Nursing students' critical thinking disposition according to academic level and satisfaction with nursing. Nurse Educ. Today 34, 78–82. doi: 10.1016/j.nedt.2013.03.012
  • Kim M.-Y., Wilkinson I. A. G. (2019). What is dialogic teaching? Constructing, deconstructing, and reconstructing a pedagogy of classroom talk. Learn. Cult. Soc. Inter. 21, 70–86. doi: 10.1016/j.lcsi.2019.02.003
  • Kline T. J. B. (2005). Psychological Testing: A Practical Approach to Design and Evaluation. Thousand Oaks, London, New Delhi: Sage Publications.
  • Klugman C. M. (2018). How health humanities will save the life of the humanities. J. Med. Humanit. 38, 419–430. doi: 10.1007/s10912-017-9453-5
  • Lederer J. M. (2007). Disposition toward critical thinking among occupational therapy students. Am. J. Occup. Ther. 61, 519–526. doi: 10.5014/ajot.61.5.519
  • Les T., Moroz J. (2021). More critical thinking in critical thinking concepts (?) A constructivist point of view. J. Crit. Educ. Policy Stud. 19, 98–124.
  • Li N. (2021). Reasonable or unwarranted? Benevolent gender prejudice in education in China. Asia Pac. Educ. Res. 31, 155–163. doi: 10.1007/s40299-020-00546-6
  • Li X. Y., Liu J. D. (2021). Mapping the taxonomy of critical thinking ability in EFL. Think. Skills Creat. 41:100880. doi: 10.1016/j.tsc.2021.100880
  • Liang W., Fung D. (2020). Development and evaluation of a WebQuest-based teaching programme: students' use of exploratory talk to exercise critical thinking. Int. J. Educ. Res. 104:101652. doi: 10.1016/j.ijer.2020.101652
  • Lin S. S. (2016). Science and non-science undergraduate students' critical thinking and argumentation performance in reading a science news report. Int. J. Sci. Math. Educ. 12, 1023–1046. doi: 10.1007/s10763-013-9451-7
  • Lin L. (2020). The future of "useless" liberal arts. Univ. Mon. Rev. Philos. Cult. 47, 93–110.
  • Liu O. L., Frankel L., Roohr K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessment. ETS Res. Rep. Ser. 2014, 1–23. doi: 10.1002/ets2.12009
  • Liu H., Shao M., Liu X., Zhao L. (2021). Exploring the influential factors on readers' continuance intentions of e-book APPs: personalization, usefulness, playfulness, and satisfaction. Front. Psychol. 12:640110. doi: 10.3389/fpsyg.2021.640110
  • Liu T., Zhao R., Lam K.-M., Kong J. (2022). Visual-semantic graph neural network with pose-position attentive learning for group activity recognition. Neurocomputing 491, 217–231. doi: 10.1016/j.neucom.2022.03.066
  • Ma L., Luo H. (2020). Chinese pre-service teachers' cognitions about cultivating critical thinking in teaching English as a foreign language. Asia Pac. J. Educ. 41, 543–557. doi: 10.1080/02188791.2020.1793733
  • Michaluk L. M., Martens J., Damron R. L., High K. A. (2016). Developing a methodology for teaching and evaluating critical thinking skills in first-year engineering students. Int. J. Eng. Educ. 32, 84–99.
  • Mulnix J. W., Mulnix M. J. (2010). Using a writing portfolio project to teach critical thinking skills. Teach. Philos. 33, 27–54. doi: 10.5840/teachphil20103313
  • Murphy E. (2004). An instrument to support thinking critically about critical thinking in online asynchronous discussions. Aust. J. Educ. Technol. 20, 295–315. doi: 10.14742/ajet.1349
  • Nair G. G., Stamler L. L. (2013). A conceptual framework for developing a critical thinking self-assessment scale. J. Nurs. Educ. 52, 131–138. doi: 10.3928/01484834-20120215-01
  • O'Reilly C., Devitt A., Hayes N. (2022). Critical thinking in the preschool classroom - a systematic literature review. Think. Skills Creat. 46:101110. doi: 10.1016/j.tsc.2022.101110
  • Odebiyi O. M., Odebiyi A. T. (2021). Critical thinking in social contexts: a trajectory analysis of states' K-5 social studies content standards. J. Soc. Stud. Res. 45, 277–288. doi: 10.1016/j.jssr.2021.05.002
  • Pallant J. F. (2007). SPSS Survival Manual: A Step by Step Guide to Data Analysis Using SPSS. 3rd Edn. Routledge.
  • Perkins D. N., Jay E., Tishman S. (1993). Beyond abilities: a dispositional theory of thinking. Merrill-Palmer Quarterly, 1–21. Available at: http://www.jstor.org/stable/23087298 (Accessed November 3, 2022).
  • Pnevmatikos D., Christodoulou P., Georgiadou T. (2019). Promoting critical thinking in higher education through the values and knowledge education (VaKE) method. Stud. High. Educ. 44, 892–901. doi: 10.1080/03075079.2019.1586340
  • Quinn S., Hogan M. J., Dwyer C., Finn P. (2020). Development and validation of the student-educator negotiated critical thinking dispositions scale (SENCTDS). Think. Skills Creat. 38:100710. doi: 10.1016/j.tsc.2020.100710
  • Reynolds S. J. (2008). Moral attentiveness: who pays attention to the moral aspects of life? J. Appl. Psychol. 93, 1027–1041. doi: 10.1037/0021-9010.93.5.1027
  • Rodríguez-Sabiote C., Olmedo-Moreno E. M., Expósito-López J. (2022). The effects of teamwork on critical thinking: a serial mediation analysis of the influence of work skills and educational motivation in secondary school students. Think. Skills Creat. 45:101063. doi: 10.1016/j.tsc.2022.101063
  • Rombout F., Schuitema J. A., Volman M. L. L. (2021). Teaching strategies for value-loaded critical thinking in philosophy classroom dialogues. Think. Skills Creat. 43:100991. doi: 10.1016/j.tsc.2021.100991
  • Saad A., Zainudin S. (2022). A review of project-based learning (PBL) and computational thinking (CT) in teaching and learning. Learn. Motiv. 78:101802. doi: 10.1016/j.lmot.2022.101802
  • Sarjinder S. (2003). "Simple random sampling," in Advanced Sampling Theory with Application. Dordrecht: Springer.
  • Schumacker R. E., Lomax R. G. (2016). A Beginner's Guide to Structural Equation Modeling. 4th Edn. New York: Routledge.
  • Scriven M., Paul R. (2005). The Critical Thinking Community. Available at: http://www.criticalthinking.org (Accessed November 3, 2022).
  • Siew M., Mapeala R. (2016). The effects of problem-based learning with thinking maps on fifth graders' science critical thinking. J. Balt. Sci. Educ. 15, 602–616. doi: 10.33225/jbse/16.15.602
  • Simpson E., Courtney M. (2002). Critical thinking in nursing education: literature review. Int. J. Nurs. Pract. 8, 89–98. doi: 10.1046/j.1440-172x.2002.00340.x
  • Singh M., James P. S., Paul H., Bolar K. (2022). Impact of cognitive-behavioral motivation on student engagement. Heliyon 8:e09843. doi: 10.1016/j.heliyon.2022.e09843
  • Sosu E. M. (2013). The development and psychometric validation of a critical thinking disposition scale. Think. Skills Creat. 9, 107–119. doi: 10.1016/j.tsc.2012.09.002
  • Southworth J. (2020). How argumentative writing stifles open-mindedness. Arts Hum. High. Educ. 20, 207–227. doi: 10.1177/1474022220903426
  • Southworth J. (2022). A perspective-taking theory of open-mindedness: confronting the challenge of motivated reasoning. Educ. Theory 93, 1027–1041. doi: 10.1037/0021-9010.93.5.1027
  • Stone G. A., Duffy L. N., Pinckney H. P., Templeton-Bradley R. (2017). Teaching for critical thinking: preparing hospitality and tourism students for careers in the twenty-first century. J. Teach. Travel Tour. 17, 67–84. doi: 10.1080/15313220.2017.1279036
  • Suh J., Matson K., Seshaiyer P., Jamieson S., Tate H. (2021). Mathematical modeling as a catalyst for equitable mathematics instruction: preparing teachers and young learners with 21st century skills. Mathematics 9:162. doi: 10.3390/math9020162
  • Sulaiman W. S. W., Rahman W. R. A., Dzulkifli M. A. (2010). Examining the construct validity of the adapted California critical thinking dispositions (CCTDI) among university students in Malaysia. Procedia Soc. Behav. Sci. 7, 282–288. doi: 10.1016/j.sbspro.2010.10.039
  • Swartz R. J. (2018). "Critical thinking, the curriculum, and the problem of transfer," in Thinking: The Second International Conference. eds. Perkins D. N., Lochhead J., Bishop J. (New York: Routledge), 261–284.
  • Thaiposri P., Wannapiroon P. (2015). Enhancing students' critical thinking skills through teaching and learning by inquiry-based learning activities using social network and cloud computing. Procedia Soc. Behav. Sci. 174, 2137–2144. doi: 10.1016/j.sbspro.2015.02.013
  • Thomas K., Lok B. (2015). "Teaching critical thinking: an operational framework," in The Palgrave Handbook of Critical Thinking in Higher Education. eds. Davies M., Barnett R. (New York: Palgrave Handbooks), 93–106.
  • Tumkaya S., Aybek B., Aldag H. (2009). An investigation of university students' critical thinking disposition and perceived problem-solving skills. Eurasian J. Educ. Res. 9, 57–74.
  • Ulger K. (2018). The effect of problem-based learning on the creative thinking and critical thinking disposition of students in visual arts education. Interdiscip. J. Probl. Based Learn. 12:10. doi: 10.7771/1541-5015.1649
  • Vogel D. L., Wade N. G., Ascheman P. L. (2009). Measuring perceptions of stigmatization by others for seeking psychological help: reliability and validity of a new stigma scale with college students. J. Couns. Psychol. 56, 301–308. doi: 10.1037/a0014903
  • Wechsler S. M., Saiz C., Rivas S. F., Vendramini C. M. M., Almeida L. S., Mundim M. C., et al.. (2018). Creative and critical thinking: independent or overlapping components? Think. Skills Creat. 27 , 114–122. doi: 10.1016/j.tsc.2017.12.003 [ CrossRef ] [ Google Scholar ]
  • Willingham D. T. (2007). Critical thinking: why it is so hard to teach? Am. Fed. Teach. Summer 2007 , 8–19. [ Google Scholar ]
  • Yang S. (2007). E-critical/thematic doing history project: integrating the critical thinking approach with computer-mediated history learning . Comput. Hum. Behav. 23 , 2095–2112. doi: 10.1016/j.chb.2006.02.012 [ CrossRef ] [ Google Scholar ]
  • Yurdakul I. K., Odabasi H. F., Kiliçer K., Çoklar A. N., Birinci G., Kurt A. A. (2012). The development, validity and reliability of TPACK-deep: A technological pedagogical content knowledge scale . Comput. Educ. 58 , 964–977. doi: 10.1016/j.compedu.2011.10.012 [ CrossRef ] [ Google Scholar ]
  • Zhang Q., Tang H., Xu X. (2022). Analyzing collegiate critical thinking course effectiveness: evidence from a quasi-experimental study in China . Think. Skills Creat. 45 :101105. doi: 10.1016/j.tsc.2022.101105 [ CrossRef ] [ Google Scholar ]
  • Zhao L., He W., Su Y. S. (2021). Innovative pedagogy and design-based research on flipped learning in higher education . Front. Psychol. 12 :577002. doi: 10.3389/fpsyg.2021.577002, PMID: [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]


Supplement to Critical Thinking

How can one assess, for purposes of instruction or research, the degree to which a person possesses the dispositions, skills and knowledge of a critical thinker?

In psychometrics, assessment instruments are judged according to their validity and reliability.

Roughly speaking, an instrument is valid if it measures accurately what it purports to measure, given standard conditions. More precisely, the degree of validity is “the degree to which evidence and theory support the interpretations of test scores for proposed uses of tests” (American Educational Research Association 2014: 11). In other words, a test is not valid or invalid in itself. Rather, validity is a property of an interpretation of a given score on a given test for a specified use. Determining the degree of validity of such an interpretation requires collection and integration of the relevant evidence, which may be based on test content, test takers’ response processes, a test’s internal structure, relationship of test scores to other variables, and consequences of the interpretation (American Educational Research Association 2014: 13–21). Criterion-related evidence consists of correlations between scores on the test and performance on another test of the same construct; its weight depends on how well supported is the assumption that the other test can be used as a criterion. Content-related evidence is evidence that the test covers the full range of abilities that it claims to test. Construct-related evidence is evidence that a correct answer reflects good performance of the kind being measured and an incorrect answer reflects poor performance.

An instrument is reliable if it consistently produces the same result, whether across different forms of the same test (parallel-forms reliability), across different items (internal consistency), across different administrations to the same person (test-retest reliability), or across ratings of the same answer by different people (inter-rater reliability). Internal consistency should be expected only if the instrument purports to measure a single undifferentiated construct, and thus should not be expected of a test that measures a suite of critical thinking dispositions or critical thinking abilities, assuming that some people are better in some of the respects measured than in others (for example, very willing to inquire but rather closed-minded). Otherwise, reliability is a necessary but not a sufficient condition of validity; a standard example of a reliable instrument that is not valid is a bathroom scale that consistently under-reports a person’s weight.

Assessing dispositions is difficult if one uses a multiple-choice format with known adverse consequences of a low score. It is pretty easy to tell what answer to the question “How open-minded are you?” will get the highest score and to give that answer, even if one knows that the answer is incorrect. If an item probes less directly for a critical thinking disposition, for example by asking how often the test taker pays close attention to views with which the test taker disagrees, the answer may differ from reality because of self-deception or simple lack of awareness of one’s personal thinking style, and its interpretation is problematic, even if factor analysis enables one to identify a distinct factor measured by a group of questions that includes this one (Ennis 1996). Nevertheless, Facione, Sánchez, and Facione (1994) used this approach to develop the California Critical Thinking Dispositions Inventory (CCTDI). They began with 225 statements expressive of a disposition towards or away from critical thinking (using the long list of dispositions in Facione 1990a), validated the statements with talk-aloud and conversational strategies in focus groups to determine whether people in the target population understood the items in the way intended, administered a pilot version of the test with 150 items, and eliminated items that failed to discriminate among test takers or were inversely correlated with overall results or added little refinement to overall scores (Facione 2000). They used item analysis and factor analysis to group the measured dispositions into seven broad constructs: open-mindedness, analyticity, cognitive maturity, truth-seeking, systematicity, inquisitiveness, and self-confidence (Facione, Sánchez, and Facione 1994). The resulting test consists of 75 agree-disagree statements and takes 20 minutes to administer. 
A repeated disturbing finding is that North American students taking the test tend to score low on the truth-seeking sub-scale (on which a low score results from agreeing to such statements as the following: “To get people to agree with me I would give any reason that worked”. “Everyone always argues from their own self-interest, including me”. “If there are four reasons in favor and one against, I’ll go with the four”.) Development of the CCTDI made it possible to test whether good critical thinking abilities and good critical thinking dispositions go together, in which case it might be enough to teach one without the other. Facione (2000) reports that administration of the CCTDI and the California Critical Thinking Skills Test (CCTST) to almost 8,000 post-secondary students in the United States revealed a statistically significant but weak correlation between total scores on the two tests, and also between paired sub-scores from the two tests. The implication is that both abilities and dispositions need to be taught, that one cannot expect improvement in one to bring with it improvement in the other.
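The contrast between “statistically significant” and “weak” is easy to make concrete: with a sample near 8,000, even a small correlation clears conventional significance thresholds while explaining little variance. A sketch using an assumed r of 0.2 (an illustrative value, not the figure Facione reports):

```python
import math

# Illustrative numbers only: assume r = 0.2 on n = 8000 paired test scores.
r, n = 0.2, 8000

# t-statistic for the null hypothesis that the true correlation is zero.
t = r * math.sqrt(n - 2) / math.sqrt(1 - r**2)

# Proportion of variance in one test's scores explained by the other's.
shared_variance = r**2
```

Here t is roughly 18, far beyond any conventional cutoff, yet the two scores share only 4% of their variance; hence the conclusion that neither abilities nor dispositions can be taught as a proxy for the other.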

A more direct way of assessing critical thinking dispositions would be to see what people do when put in a situation where the dispositions would reveal themselves. Ennis (1996) reports promising initial work with guided open-ended opportunities to give evidence of dispositions, but no standardized test seems to have emerged from this work. There are however standardized aspect-specific tests of critical thinking dispositions. The Critical Problem Solving Scale (Berman et al. 2001: 518) takes as a measure of the disposition to suspend judgment the number of distinct good aspects attributed to an option judged to be the worst among those generated by the test taker. Stanovich, West and Toplak (2011: 800–810) list tests developed by cognitive psychologists of the following dispositions: resistance to miserly information processing, resistance to myside thinking, absence of irrelevant context effects in decision-making, actively open-minded thinking, valuing reason and truth, tendency to seek information, objective reasoning style, tendency to seek consistency, sense of self-efficacy, prudent discounting of the future, self-control skills, and emotional regulation.

It is easier to measure critical thinking skills or abilities than to measure dispositions. The following eight currently available standardized tests purport to measure them: the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), the Cornell Critical Thinking Tests Level X and Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), the Ennis-Weir Critical Thinking Essay Test (Ennis & Weir 1985), the California Critical Thinking Skills Test (Facione 1990b, 1992), the Halpern Critical Thinking Assessment (Halpern 2016), the Critical Thinking Assessment Test (Center for Assessment & Improvement of Learning 2017), the Collegiate Learning Assessment (Council for Aid to Education 2017), the HEIghten Critical Thinking Assessment (https://territorium.com/heighten/), and a suite of critical thinking assessments for different groups and purposes offered by Insight Assessment (https://www.insightassessment.com/products). The Critical Thinking Assessment Test (CAT) is unique among them in being designed for use by college faculty to help them improve their development of students’ critical thinking skills (Haynes et al. 2015; Haynes & Stein 2021). Also, for some years the United Kingdom body OCR (Oxford Cambridge and RSA Examinations) awarded AS and A Level certificates in critical thinking on the basis of an examination (OCR 2011). Many of these standardized tests have received scholarly evaluations at the hands of, among others, Ennis (1958), McPeck (1981), Norris and Ennis (1989), Fisher and Scriven (1997), Possin (2008, 2013a, 2013b, 2013c, 2014, 2020) and Hatcher and Possin (2021). 
Their evaluations provide a useful set of criteria that such tests ideally should meet, as does the description by Ennis (1984) of problems in testing for competence in critical thinking: the soundness of multiple-choice items, the clarity and soundness of instructions to test takers, the information and mental processing used in selecting an answer to a multiple-choice item, the role of background beliefs and ideological commitments in selecting an answer to a multiple-choice item, the tenability of a test’s underlying conception of critical thinking and its component abilities, the set of abilities that the test manual claims are covered by the test, the extent to which the test actually covers these abilities, the appropriateness of the weighting given to various abilities in the scoring system, the accuracy and intellectual honesty of the test manual, the interest of the test to the target population of test takers, the scope for guessing, the scope for choosing a keyed answer by being test-wise, precautions against cheating in the administration of the test, clarity and soundness of materials for training essay graders, inter-rater reliability in grading essays, and clarity and soundness of advance guidance to test takers on what is required in an essay. Rear (2019) has challenged the use of standardized tests of critical thinking as a way to measure educational outcomes, on the grounds that  they (1) fail to take into account disputes about conceptions of critical thinking, (2) are not completely valid or reliable, and (3) fail to evaluate skills used in real academic tasks. He proposes instead assessments based on discipline-specific content.

There are also aspect-specific standardized tests of critical thinking abilities. Stanovich, West and Toplak (2011: 800–810) list tests of probabilistic reasoning, insights into qualitative decision theory, knowledge of scientific reasoning, knowledge of rules of logical consistency and validity, and economic thinking. They also list instruments that probe for irrational thinking, such as superstitious thinking, belief in the superiority of intuition, over-reliance on folk wisdom and folk psychology, belief in “special” expertise, financial misconceptions, overestimation of one’s introspective powers, dysfunctional beliefs, and a notion of self that encourages egocentric processing. They regard these tests along with the previously mentioned tests of critical thinking dispositions as the building blocks for a comprehensive test of rationality, whose development (they write) may be logistically difficult and would require millions of dollars.

A superb example of assessment of an aspect of critical thinking ability is the Test on Appraising Observations (Norris & King 1983, 1985, 1990a, 1990b), which was designed for classroom administration to senior high school students. The test focuses entirely on the ability to appraise observation statements and in particular on the ability to determine in a specified context which of two statements there is more reason to believe. According to the test manual (Norris & King 1985, 1990b), a person’s score on the multiple-choice version of the test, which is the number of items that are answered correctly, can justifiably be given either a criterion-referenced or a norm-referenced interpretation.

On a criterion-referenced interpretation, those who do well on the test have a firm grasp of the principles for appraising observation statements, and those who do poorly have a weak grasp of them. This interpretation can be justified by the content of the test and the way it was developed, which incorporated a method of controlling for background beliefs articulated and defended by Norris (1985). Norris and King synthesized from judicial practice, psychological research and common-sense psychology 31 principles for appraising observation statements, in the form of empirical generalizations about tendencies, such as the principle that observation statements tend to be more believable than inferences based on them (Norris & King 1984). They constructed items in which exactly one of the 31 principles determined which of two statements was more believable. Using a carefully constructed protocol, they interviewed about 100 students who responded to these items in order to determine the thinking that led them to choose the answers they did (Norris & King 1984). In several iterations of the test, they adjusted items so that selection of the correct answer generally reflected good thinking and selection of an incorrect answer reflected poor thinking. Thus they have good evidence that good performance on the test is due to good thinking about observation statements and that poor performance is due to poor thinking about observation statements. Collectively, the 50 items on the final version of the test require application of 29 of the 31 principles for appraising observation statements, with 13 principles tested by one item, 12 by two items, three by three items, and one by four items. Thus there is comprehensive coverage of the principles for appraising observation statements. Fisher and Scriven (1997: 135–136) judge the items to be well worked and sound, with one exception. 
The test is clearly written at a grade 6 reading level, meaning that poor performance cannot be attributed to difficulties in reading comprehension by the intended adolescent test takers. The stories that frame the items are realistic, and are engaging enough to stimulate test takers’ interest. Thus the most plausible explanation of a given score on the test is that it reflects roughly the degree to which the test taker can apply principles for appraising observations in real situations. In other words, there is good justification of the proposed interpretation that those who do well on the test have a firm grasp of the principles for appraising observation statements and those who do poorly have a weak grasp of them.

To get norms for performance on the test, Norris and King arranged for seven groups of high school students in different types of communities and with different levels of academic ability to take the test. The test manual includes percentiles, means, and standard deviations for each of these seven groups. These norms allow teachers to compare the performance of their class on the test to that of a similar group of students.
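A norm-referenced interpretation simply locates a raw score within a comparison group’s distribution. A minimal sketch with hypothetical norm-group scores (not Norris and King’s actual norms):

```python
import statistics

# Hypothetical norm group: raw scores (out of 50) for one comparison group.
norm_scores = [22, 25, 27, 28, 30, 31, 31, 33, 35, 38, 40, 41, 43, 44, 46]

mean = statistics.mean(norm_scores)
sd = statistics.stdev(norm_scores)

def percentile_rank(score: float, norms: list) -> float:
    """Percentage of the norm group scoring at or below the given score."""
    return 100 * sum(s <= score for s in norms) / len(norms)

# A student scoring 38 sits at roughly the 67th percentile of this group.
rank = percentile_rank(38, norm_scores)
```

A teacher would compare a class against the published percentiles for the most similar of the seven norm groups, not against a pooled distribution.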

Copyright © 2022 by David Hitchcock &lt;hitchckd@mcmaster.ca&gt;

The Stanford Encyclopedia of Philosophy is copyright © 2023 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

Warren Berger

A Crash Course in Critical Thinking

What you need to know—and read—about one of the essential skills needed today.

Posted April 8, 2024 | Reviewed by Michelle Quirk

  • In research for "A More Beautiful Question," I did a deep dive into the current crisis in critical thinking.
  • Many people may think of themselves as critical thinkers, but they actually are not.
  • Here is a series of questions you can ask yourself to try to ensure that you are thinking critically.

Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion about who and what to believe.

These are some of the hallmarks of the current crisis in critical thinking—which just might be the issue of our times. Because if people aren’t willing or able to think critically as they choose potential leaders, they’re apt to choose bad ones. And if they can’t judge whether the information they’re receiving is sound, they may follow faulty advice while ignoring recommendations that are science-based and solid (and perhaps life-saving).

Moreover, as a society, if we can’t think critically about the many serious challenges we face, it becomes more difficult to agree on what those challenges are—much less solve them.

On a personal level, critical thinking can enable you to make better everyday decisions. It can help you make sense of an increasingly complex and confusing world.

In the new expanded edition of my book A More Beautiful Question (AMBQ), I took a deep dive into critical thinking. Here are a few key things I learned.

First off, before you can get better at critical thinking, you should understand what it is. It’s not just about being a skeptic. When thinking critically, we are thoughtfully reasoning, evaluating, and making decisions based on evidence and logic. And—perhaps most important—while doing this, a critical thinker always strives to be open-minded and fair-minded. That’s not easy: It demands that you constantly question your assumptions and biases and that you always remain open to considering opposing views.

In today’s polarized environment, many people think of themselves as critical thinkers simply because they ask skeptical questions—often directed at, say, certain government policies or ideas espoused by those on the “other side” of the political divide. The problem is, they may not be asking these questions with an open mind or a willingness to fairly consider opposing views.

When people do this, they’re engaging in “weak-sense critical thinking”—a term popularized by the late Richard Paul, a co-founder of The Foundation for Critical Thinking. “Weak-sense critical thinking” means applying the tools and practices of critical thinking—questioning, investigating, evaluating—but with the sole purpose of confirming one’s own bias or serving an agenda.

In AMBQ, I lay out a series of questions you can ask yourself to try to ensure that you’re thinking critically. Here are some of the questions to consider:

  • Why do I believe what I believe?
  • Are my views based on evidence?
  • Have I fairly and thoughtfully considered differing viewpoints?
  • Am I truly open to changing my mind?

Of course, becoming a better critical thinker is not as simple as just asking yourself a few questions. Critical thinking is a habit of mind that must be developed and strengthened over time. In effect, you must train yourself to think in a manner that is more effortful, aware, grounded, and balanced.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question, but beyond that, here are the top eight critical-thinking books I’d recommend.

The Demon-Haunted World: Science as a Candle in the Dark, by Carl Sagan

This book simply must top the list, because the late scientist and author Carl Sagan continues to be such a bright shining light in the critical thinking universe. Chapter 12 includes the details on Sagan’s famous “baloney detection kit,” a collection of lessons and tips on how to deal with bogus arguments and logical fallacies.


Clear Thinking: Turning Ordinary Moments Into Extraordinary Results, by Shane Parrish

The creator of the Farnam Street website and host of the “Knowledge Project” podcast explains how to contend with biases and unconscious reactions so you can make better everyday decisions. It contains insights from many of the brilliant thinkers Shane has studied.

Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World, by David Robert Grimes

A brilliant, comprehensive 2021 book on critical thinking that, to my mind, hasn’t received nearly enough attention. The scientist Grimes dissects bad thinking, shows why it persists, and offers the tools to defeat it.

Think Again: The Power of Knowing What You Don't Know, by Adam Grant

Intellectual humility—being willing to admit that you might be wrong—is what this book is primarily about. But Adam, the renowned Wharton psychology professor and bestselling author, takes the reader on a mind-opening journey with colorful stories and characters.

Think Like a Detective: A Kid's Guide to Critical Thinking, by David Pakman

The popular YouTuber and podcast host Pakman—normally known for talking politics—has written a terrific primer on critical thinking for children. The illustrated book presents critical thinking as a “superpower” that enables kids to unlock mysteries and dig for truth. (I also recommend Pakman’s second kids’ book, Think Like a Scientist.)

Rationality: What It Is, Why It Seems Scarce, Why It Matters, by Steven Pinker

The Harvard psychology professor Pinker tackles conspiracy theories head-on but also explores concepts involving risk/reward, probability and randomness, and correlation/causation. And if that strikes you as daunting, be assured that Pinker makes it lively and accessible.

How Minds Change: The Surprising Science of Belief, Opinion and Persuasion, by David McRaney

David is a science writer who hosts the popular podcast “You Are Not So Smart” (and his ideas are featured in A More Beautiful Question). His well-written book looks at ways you can actually get through to people who see the world very differently than you (hint: bludgeoning them with facts definitely won’t work).

A Healthy Democracy's Best Hope: Building the Critical Thinking Habit, by M. Neil Browne and Chelsea Kulhanek

Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the world around us. His newest book, co-authored with Chelsea Kulhanek, breaks down critical thinking into “11 explosive questions”—including the “priors question” (which challenges us to question assumptions), the “evidence question” (focusing on how to evaluate and weigh evidence), and the “humility question” (which reminds us that a critical thinker must be humble enough to consider the possibility of being wrong).

Warren Berger

Warren Berger is a longtime journalist and author of A More Beautiful Question.


COMMENTS

  1. Critical Thinking Questionnaire (CThQ) -construction and application of critical thinking test tool

    Thesis. The article presents the construction process and psychometric properties of the Critical Thinking Questionnaire (CThQ). The questionnaire is a critical thinking test tool designed for ...

  2. PDF Student Self-Assessment Critical Thinking Questionnaire

    The questions follow the student's critical thinking process during a given activity or a project. The questionnaire will be done individually. The teacher will not check the answers but may ask the student to give general feedback about their critical thinking process. The Student Self-Assessment Critical Thinking Questionnaire is not a test.

  3. Assessing Critical Thinking in Higher Education: Current State and

    The CAAP Critical Thinking measures students' skills in analyzing elements of an argument, evaluating an argument, and extending arguments (CAAP Program Management, ... In the spring 2013 survey of the current state of student learning outcomes assessment in U.S. higher education by the National Institute for Learning Outcomes Assessment ...

  4. Helping Students Hone Their Critical Thinking Skills

    Teach Reasoning Skills. Reasoning skills are another key component of critical thinking, involving the abilities to think logically, evaluate evidence, identify assumptions, and analyze arguments. Students who learn how to use reasoning skills will be better equipped to make informed decisions, form and defend opinions, and solve problems.

  5. Teaching, Measuring & Assessing Critical Thinking Skills

    Yes, We Can Define, Teach, and Assess Critical Thinking Skills. Critical thinking is a thing. We can define it; we can teach it; and we can assess it. While the idea of teaching critical thinking has been bandied around in education circles since at least the time of John Dewey, it has taken greater prominence in the education debates with the ...

  6. PDF Deeper Learning through Questioning

    Deeper Learning through Questioning. Asking good questions is central to learning and sometimes can be more important than getting the answers, particularly when the questions en-courage students to think critically. "Skill in the art of questioning lies at the basis of all good teach-ing " (Betts, 1910, p. 55).

  7. CTS Tools for Faculty and Student Assessment

    Cornell Critical Thinking Test (CCTT) There are two forms of the CCTT, X and Z. Form X is for students in grades 4-14. Form Z is for advanced and gifted high school students, undergraduate and graduate students, and adults. Reliability estimates for Form Z range from .49 to .87 across the 42 groups who have been tested.

  8. Assessment of the High School Students' Critical Thinking Skills

    This study is conducted to determine the high school students' critical thinking skills. The study is descriptive and done with the survey model. In order to measure the critical thinking skills of the students a 5 point Likert-type questionnaire composed of 21 questions is developed by the researcher. The sample of the study is 722 high school ...

  9. 16 Critical Thinking Questions For Students

    16 Critical Thinking Questions For Students. Critical thinking is an essential skill that empowers students to think critically and make informed decisions. It encourages them to explore different perspectives, analyze information, and develop logical reasoning. To foster critical thinking skills, it is crucial to ask students thought-provoking ...

  10. PDF Moving Beyond Assessment to Improving Students' Critical Thinking

    greater willingness to place more emphasis on critical thinking assessments and less on factual knowledge assessments in their courses as a result of participation in training. Keywords: critical thinking, assessment, faculty development, teaching, learning Research shows a consensus in the need for students to develop critical thinking skills.

  11. Writing quiz questions that assess student understanding and critical

    Higher-order thinking is often used to refer to 'transfer', 'critical thinking' and 'problem-solving.' When designing a quiz that assesses higher-order thinking skills, it is necessary to write questions/problems that require students to: use information, methods, concepts, or theories in new situations ; predict sequences and outcomes ; solve ...

  12. Global critical thinking survey: The results

    According to a survey by the Times Education Supplement, 85% of teachers worldwide feel their students don't have the critical thinking skills they need when they start university. The ability to think clearly and rationally and engage in independent and reflective thinking, empowers students to form their own opinions and make better choices.

  13. The Importance of Critical Thinking Skills for Students

    Importance of critical thinking for students: 1. Decision-making; 2. Problem-solving; 3. Communication; 4. Analytical skills. How can students develop critical thinking skills: 1. Never stop asking questions; 2. Practice active listening; 3. Dive into your creativity; 4. Engage in debates and discussions; 5.

  14. PDF Measuring Student Success Skills: a Review of The Literature on

    an important question, to be sure. If critical thinking is generic, then it arguably could be taught independently in separate courses, with the sole focus being on the development of critical thinking skills. But if critical thinking is regarded as particular to a discipline, then it should be taught embedded within subject-matter content.

  15. PDF An examination of high school students' critical thinking dispositions

    The purpose of this correlational survey study is to examine the critical thinking dispositions and ... thinking skills of students in high-performing schools were higher than those in low-performing schools. ... their critical thinking skills (Murphy et al., 2014). Facione et al. (2000) and Profetto-McGrath (2003) state that one of the ...

  16. Constructing a critical thinking evaluation framework for college

    An investigation of university students' critical thinking disposition and perceived problem-solving skills. Eurasian J. Educ. Res. 9, 57-74. Ulger K. (2018). The effect of problem-based learning on the creative thinking and critical thinking disposition of students in visual arts education. Interdis. J.

  17. Critical Thinking

    Facione (2000) reports that administration of the CCTDI and the California Critical Thinking Skills Test (CCTST) to almost 8,000 post-secondary students in the United States revealed a statistically significant but weak correlation between total scores on the two tests, and also between paired sub-scores from the two tests.

  18. PDF "Students' Critical Thinking Skills and Students' Perceptions of the

    Table of contents excerpt: Student Outcomes; 1.2.5 Associations between ICT Learning Environments and Student Outcomes; 1.3 Students' Perceptions of Classroom Learning Environments, Students' Critical Thinking Skills, and

  19. PDF Developing Critical Thinking through Questioning Strategy among ...

    A critical thinking questionnaire was adapted from Paul's (1994) model. The purpose of the questionnaire was to analyze any change in the students' thinking resulting from the intervention. Therefore, role-playing can be a productive strategy in the teaching of English in Pakistani schools at the elementary level if it brings about change in students ...

  20. The role of student's critical asking question in developing student's

    Research that studies the relationship between questioning and students' critical thinking skills is scarce. The aim of this study is to examine how students' questioning skills correlate with their critical thinking skills in the learning of chemistry. The research design used was a one-group pretest-posttest design.

  21. A Crash Course in Critical Thinking

    Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the ...

  22. ChatGPT effects on cognitive skills of undergraduate students

    For instance, multiple studies (Liang, 2022; Long et al., 2016) found that leveraging technology-based learning interventions can improve university students' critical thinking skills. Notably, the improvement in critical thinking skills was observed not only in the overall score of the critical thinking scale but also in its two dimensions.