In This Article: Critical Thinking

  • Introduction
  • General Overviews
  • Importance of Thinking Critically
  • Defining Critical Thinking
  • General Skills
  • Specific Skills
  • Metacognitive Monitoring Skills
  • Critical Thinking Dispositions
  • Teaching Specific Skills
  • Encouraging a Disposition toward Thinking Critically
  • Transfer to Other Domains
  • Metacognitive Monitoring
  • General or Comprehensive Assessments
  • Metacognition Assessments
  • Critical Thinking Disposition Assessments
  • Thinking Critically about Critical Thinking

Related Articles

  • Artificial Intelligence, Machine Learning, and Psychology
  • Assessment of Thinking in Educational Settings
  • Human Memory
  • Learning Theory
  • Mindfulness
  • Problem Solving and Decision Making
  • Procrastination
  • Student Success in College
  • Teaching of Psychology
  • Thinking Skills in Educational Settings
  • Women and Science, Technology, Engineering, and Math (STEM)

Critical Thinking
by Heather Butler and Diane Halpern
LAST REVIEWED: 26 August 2022 | LAST MODIFIED: 29 November 2011 | DOI: 10.1093/obo/9780199828340-0019

Critical thinking has been described in many ways, but researchers generally agree that it involves rational, purposeful, and goal-directed thinking (see Defining Critical Thinking). Diane F. Halpern defined critical thinking as an attempt to increase the probability of a desired outcome (e.g., making a sound decision, successfully solving a problem) by using certain cognitive skills and strategies. Critical thinking is more than a collection of skills and strategies: it is a disposition toward engaging with problems. Critical thinkers are flexible, open-minded, persistent, and willing to exert mental energy on tough problems. Unlike poor thinkers, they are willing to admit an error in judgment when confronted with contradictory evidence, and they operate on autopilot far less often (see Critical Thinking Dispositions). There is good evidence that critical thinking skills and dispositions can be taught (see Teaching Critical Thinking). This guide includes (a) sources that extol the importance of critical thinking, (b) research that identifies specific critical thinking skills and conceptualizations of critical thinking dispositions, (c) a list of best practices for teaching critical thinking skills and dispositions, and (d) a review of research into ways of assessing critical thinking skills and dispositions (see Assessments).

The sources highlighted here include textbooks, literature reviews, and meta-analyses related to critical thinking. These contributions come from both psychological (Halpern 2003; Nisbett 1993; Sternberg, et al. 2007) and philosophical (Ennis 1962; Facione 1990) perspectives. Many of these general overviews are textbooks (Facione 2011b; Halpern 2003; Nisbett 1993; Sternberg, et al. 2007), while the other sources are review articles or commentaries. Most resources were intended for a general audience, but Sternberg, et al. 2007 was written specifically to address critical thinking in psychology. Those interested in a historical reference are referred to Ennis 1962, which is credited by some as renewing contemporary interest in critical thinking. Those interested in a more recent conceptualization of critical thinking are referred to Facione 2011a, a short introduction to the field appropriate for those new to it, or Facione 1990, which summarizes a collaborative definition of critical thinking developed by philosophers using the Delphi method. Facione 2011b would be a valuable resource for philosophers teaching critical thinking or logic courses to general audiences. For psychologists teaching critical thinking courses to a general audience, Halpern 2003, an empirically based textbook, covers a wide range of topics; a new edition is expected soon. Fisher 2001 is also intended for general audiences and teaches a wide variety of critical thinking skills. Nisbett 1993 tackles the question of whether critical thinking skills can be taught and provides ample empirical evidence that they can. Sternberg, et al. 2007 is a good resource for psychology students interested in improving their scientific reasoning skills, a specific set of thinking skills needed by psychology and other science students.

Ennis, Robert H. 1962. A concept of critical thinking: A proposed basis of research in the teaching and evaluation of critical thinking. Harvard Educational Review 32:81–111.

A discussion of how critical thinking is conceptualized from a philosopher’s perspective. Critical of psychology’s definition of critical thinking at the time. Emphasizes twelve aspects of critical thinking.

Facione, Peter A. 1990. Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction; Executive Summary of The Delphi Report. Millbrae, CA: California Academic Press.

Describes the critical thinking movement, definitions of critical thinking agreed upon by philosophers using the Delphi method, the assessment of critical thinking, and how critical thinking can be taught.

Facione, Peter A. 2011a. Critical thinking: What it is and why it counts. Millbrae, CA: Insight Assessment.

This accessible paper defines critical thinking, elaborates on specific critical thinking skills, and discusses what it means to have (or not have) a critical thinking disposition. A distinction is made between system 1 (shallow processing) and system 2 (deeper processing) thinking. Good resource for students new to the field.

Facione, Peter A. 2011b. THINK critically. Upper Saddle River, NJ: Prentice Hall.

Written from a philosophical perspective, this critical thinking textbook emphasizes the application of critical thinking to the real world and offers positive examples of critical thinking. Chapters cover inductive, deductive, comparative, ideological, and empirical reasoning.

Fisher, Alec. 2001. Critical thinking: An introduction. Cambridge, UK: Cambridge Univ. Press.

This textbook, intended for college students, discusses various types of reasoning, causality, argument analysis, and decision making. Includes exercises for students and teachers.

Halpern, Diane F. 2003. Thought & knowledge: An introduction to critical thinking. 4th ed. Mahwah, NJ: Lawrence Erlbaum.

This textbook, written by a cognitive psychologist, is grounded in theory and research from the learning sciences and offers practical examples. Chapters include an introduction to the topic and the correlates of critical thinking, memory, thought and language, reasoning, analyzing arguments, thinking as hypothesis testing, likelihood and uncertainty, decision making, development of problem-solving skills, and creative thinking.

Nisbett, Richard E. 1993. Rules for reasoning . Hillsdale, NJ: Lawrence Erlbaum.

This text is rich with empirical evidence that critical thinking skills can be taught to undergraduate and graduate students. Each chapter discusses research on an aspect of reasoning (e.g., statistical reasoning, heuristics, inductive reasoning) with special emphasis on teaching the application of these skills to everyday problems.

Sternberg, Robert J., Henry L. Roediger III, and Diane F. Halpern, eds. 2007. Critical thinking in psychology . New York: Cambridge Univ. Press.

This edited book explores several aspects of critical thinking that are needed to fully understand key topics in psychology, such as experimental research, statistical inference, case studies, logical fallacies, and ethical judgments. Experts discuss the critical thinking strategies they engage in. Includes an interesting discussion of historical breakthroughs due to critical thinking.

CBE Life Sciences Education, Vol. 17, No. 1, Spring 2018

Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

Jason E. Dowd

† Department of Biology, Duke University, Durham, NC 27708

Robert J. Thompson, Jr.

‡ Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Leslie A. Schiff

§ Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Julie A. Reynolds

This study empirically examines the relationship between students’ critical-thinking skills and scientific reasoning as reflected in undergraduate thesis writing in biology. Writing offers a unique window into studying this relationship, and the findings raise potential implications for instruction.

Developing critical-thinking and scientific reasoning skills is a core learning objective of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students' development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference, while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students' writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference) can actually improve students' scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning (National Research Council, 2000), the pedagogical practice of writing has been found to be effective not only in fostering the development of students' conceptual and procedural knowledge (Gerdeman et al., 2007) and communication skills (Clase et al., 2010), but also in fostering scientific reasoning (Reynolds et al., 2012) and critical-thinking skills (Quitadamo and Kurtz, 2007).

Critical thinking and scientific reasoning are similar but distinct constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct (Holyoak and Morrison, 2005), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, limited coherent frameworks and empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunities to both foster and examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as "purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based" (Facione, 1990, p. 3). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced on this consensus expert view (Blattner and Frazier, 2002; Condon and Kelly-Riley, 2004; Bissell and Lemons, 2006; Quitadamo and Kurtz, 2007) and the corresponding measures of critical-thinking skills (August, 2016; Stephenson and Sadler-McKnight, 2016).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction (Hand and Keys, 1999), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—has been linked to critical thinking (Stephenson and Sadler-McKnight, 2016). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework (Pukkila, 2004) and developing explicit examples of how critical thinking relates to the scientific method (Miri et al., 2007).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the importance of the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students' scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics (Reynolds and Thompson, 2011; Dowd et al., 2015a, b). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students' written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

Table 1. Theses assessment protocol dimensions

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.
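
To make the factor-analysis step concrete, the sketch below shows one way such an exploratory one-factor model could be fit in Python. It uses scikit-learn and randomly generated placeholder scores, so the array shapes and variable names are illustrative assumptions rather than the authors' actual code or data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical (n_students x 9) matrix of BioTAP ratings on questions 1-9.
# A real analysis would use the observed thesis scores, not random placeholders.
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(65, 9)).astype(float)

# Fit a single-factor model and inspect how strongly each dimension loads on it.
fa = FactorAnalysis(n_components=1)
fa.fit(scores)
loadings = fa.components_[0]
for q, loading in enumerate(loadings, start=1):
    print(f"BioTAP question {q}: loading = {loading:.2f}")
```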

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning (Reynolds et al., 2012) and critical thinking (Quitadamo and Kurtz, 2007; Quitadamo et al., 2008; Stephenson and Sadler-McKnight, 2016). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students' critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students' scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students' individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though it is required for students to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two did not consent to participate in the study; additionally, five did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid and invalid CCTST scores. Therefore, in most of this study we focus on the 65 students who consented to participate and for whom we have complete and valid data. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on those 52 students. We note that the 13 students who participated but declined to share additional data scored slightly lower on the CCTST than the other 52 (perhaps suggesting that they differ on other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level (August, 2016; Stephenson and Sadler-McKnight, 2016). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students' performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and the claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students' overall strength in using reasoning to form reflective judgments about what to believe or what to do (August, 2016). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate stronger performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
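
As a concrete reading of the scale just described, the short sketch below (a hypothetical helper, not part of the CCTST materials) maps a 0–100 score onto those qualitative bands.

```python
def cctst_band(score: float) -> str:
    """Map a CCTST score on the 0-100 scale to the qualitative bands above."""
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"

print(cctst_band(84))  # -> "strong"
```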

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
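
A minimal sketch of how the within-1-point agreement figure could be computed is shown below; the two rating arrays are invented placeholders, not the study's actual ratings.

```python
import numpy as np

# Hypothetical independent ratings (1-5) from two raters on the same dimensions.
rater_a = np.array([5, 4, 3, 5, 2, 4, 3, 5, 4])
rater_b = np.array([4, 4, 5, 5, 2, 3, 3, 4, 4])

# Percentage of dimensions on which the raters differ by at most 1 point.
within_one_point = np.mean(np.abs(rater_a - rater_b) <= 1) * 100
print(f"Agreement within 1 point: {within_one_point:.1f}%")
```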

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
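
The sketch below illustrates, under stated assumptions, how the two cumulative BioTAP measures could be computed for each student: the partial sum of questions 1–5 and a loading-weighted composite standing in for the factor score. The loadings shown are placeholders, not the values published by Dowd et al. (2016).

```python
import numpy as np

# Hypothetical (n_students x 9) matrix of BioTAP ratings on questions 1-9.
thesis_scores = np.array([
    [5, 4, 4, 3, 5, 4, 5, 3, 4],
    [3, 3, 4, 2, 3, 4, 4, 3, 3],
], dtype=float)

# Placeholder factor loadings for the nine dimensions (one value per question).
loadings = np.full(9, 1.0 / 3.0)

partial_sum = thesis_scores[:, :5].sum(axis=1)  # questions 1-5 only
factor_score = thesis_scores @ loadings         # simple loading-weighted composite
print(partial_sum, factor_score)
```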

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson's correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student's t tests to compare the mean scores of the two groups on each of the seven dimensions and the overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore how much the observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
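
For the correlation step, a hedged sketch using SciPy is shown below; the score arrays are random placeholders standing in for the students' actual BioTAP and CCTST scores.

```python
import numpy as np
from scipy import stats

# Placeholder data: one BioTAP composite and one CCTST dimension per student.
rng = np.random.default_rng(1)
biotap_partial_sum = rng.normal(18, 3, size=65)
cctst_inference = rng.normal(82, 8, size=65)

# Pearson's correlation between the two measures, with its p value.
r, p = stats.pearsonr(biotap_partial_sum, cctst_inference)
print(f"r = {r:.2f}, p = {p:.3f}")
```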

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous (Rhemtulla et al., 2012)—and therefore using continuous-variable statistical methods (e.g., Pearson's correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
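
The Holm-Bonferroni step-down procedure itself is simple to implement; the sketch below is a generic version with placeholder p values, consistent with the cutoffs quoted later (e.g., roughly 0.05/35 ≈ 0.00143 for the smallest p value in a family of 35 tests).

```python
import numpy as np

def holm_bonferroni(p_values, alpha=0.05):
    """Return a boolean array marking which null hypotheses are rejected."""
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)           # indices of p values, smallest first
    reject = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(order):
        # The k-th smallest p value is compared with alpha / (m - k + 1).
        if p[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break                   # once one test fails, all larger p values fail too
    return reject

print(holm_bonferroni([0.001, 0.01, 0.03, 0.04]))  # [ True  True False False]
```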

RESULTS

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

Table 2. Descriptive statistics of CCTST dimensions a

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson's correlations between students' cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al., 2016, and the partial sum of scores on questions 1–5) and students' overall scores on the CCTST are presented in Table 3. We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking (r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to the overall CCTST score (r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST (Table 3), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference (r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

Table 3. Correlations between dimensions of CCTST and dimensions of BioTAP a

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded t tests to include all 35 comparisons—and, like above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.

Table 4. The t statistics and effect sizes of differences in dimensions of CCTST across dimensions of BioTAP a

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.
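
A sketch of the type of comparison summarized in Table 4 — a Student's t test on CCTST scores between mastery and nonmastery groups, together with a pooled-standard-deviation Cohen's d — is given below; the group data are hypothetical placeholders, not the study's scores.

```python
import numpy as np
from scipy import stats

# Placeholder CCTST inference scores for mastery vs. nonmastery groups.
rng = np.random.default_rng(2)
mastery = rng.normal(88, 8, size=30)
nonmastery = rng.normal(80, 9, size=35)

t, p = stats.ttest_ind(mastery, nonmastery)  # Student's t test (equal variances)

def cohens_d(a, b):
    """Cohen's d using a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

print(f"t = {t:.2f}, p = {p:.4f}, d = {cohens_d(mastery, nonmastery):.2f}")
```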

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

Table 5. Partial sum (questions 1–5) of BioTAP scores (n = 52)

** p < 0.01.

*** p < 0.001.
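
The regression models described above could be set up along the following lines with statsmodels; the data frame, covariate names, and values here are purely illustrative assumptions, not the authors' variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder data for 52 students; the covariate columns are hypothetical.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "biotap_partial_sum": rng.normal(18, 3, size=52),
    "inference": rng.normal(82, 8, size=52),
    "gpa": rng.normal(3.5, 0.3, size=52),
    "institution": rng.integers(0, 2, size=52),
})

y = df["biotap_partial_sum"]
m1 = sm.OLS(y, sm.add_constant(df[["inference"]])).fit()                        # model 1
m2 = sm.OLS(y, sm.add_constant(df[["gpa", "institution"]])).fit()               # model 2
m3 = sm.OLS(y, sm.add_constant(df[["inference", "gpa", "institution"]])).fit()  # model 3

# Compare the inference coefficient with and without background covariates.
print(m1.params["inference"], m3.params["inference"])
```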

DISCUSSION

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using the CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines (Kuhn et al., 2008). Moreover, science reasoning in writing, captured in students' undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals' critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students' overall critical-thinking skills (Table 3). The particularly significant roles of inference skills (Table 3) and the discussion of implications of results (BioTAP question 5; Table 4) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al., 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al., 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy because these prior studies differ from the current study: the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing . Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

Taking the complementary view and focusing on BioTAP, we found that, when comparing students who exhibit mastery with those who do not, the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and the results of other studies (with different populations) suggest that the observed relationships may hold more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that they may extend to all students, a possibility that warrants further study.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step toward achieving more integrated learning outcomes. We hope this work will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

Supplementary Material

Acknowledgments.

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-­Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer C. H., Taylor E., Gillmore G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell A. N., Lemons P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, (1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner N. H., Frazier C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, (1), 47–64.
  • Clase K. L., Gundlach E., Pelaez N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, (5), 290–295.
  • Condon W., Kelly-Riley D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, (1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding L., Wei X., Liu X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, (5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd J. E., Connolly M. P., Thompson R. J., Jr., Reynolds J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, (1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd J. E., Roy C. P., Thompson R. J., Jr., Reynolds J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, (1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd J. E., Thompson R. J., Jr., Reynolds J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 36–51.
  • Facione P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman R. D., Russell A. A., Worden K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, (5), 46–52.
  • Greenhoot A. F., Semb G., Colombo J., Schreiber T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, (2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, (1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen J. S., Bosack T., Clay S., McCarthy M., Dunn D. S., Hill G. W., Whitlock K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, (3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand B., Keys C. W. (1999). Inquiry investigation. Science Teacher, (4), 27–29.
  • Holm S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, (2), 65–70.
  • Holyoak K. J., Morrison R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly G. J., Takao A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, (3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn D., Dean D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, (2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn D., Iordanou K., Pease M., Wirkala C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, (2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish D., LaVaque-Manty D., Silver N., Kaplan M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri B., David B.-C., Uri Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, (4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Pukkila P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, (1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo I. J., Faiola C. L., Johnson J. E., Kurtz M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, (3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo I. J., Kurtz M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, (2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds J. A., Smith R., Moskovitz C., Sayle A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, (10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds J. A., Thaiss C., Katkin W., Thompson R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, (1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds J. A., Thompson R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, (2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla M., Brosseau-Liard P. E., Savalei V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, (3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson N. S., Sadler-McKnight N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, (1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq V. N., Stefani L. A. J., Butcher A. C., Heylings D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, (3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman B. E. C., Strickland D. C., Johnson R. L., Payne J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, (5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping K. J., Smith E. F., Swanson I., Elliot A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, (2), 149–169. https://doi.org/10.1080/713611428
  • Willison J., O’Regan K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, (4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin T., Carter V. C., Fletcher L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, (2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin A., Abd-El-Khalick F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, (9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman C. (2000). The development of scientific reasoning skills. Developmental Review, (1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, (2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001


Writing and Critical Thinking Through Literature (Ringo and Kashyap)


  • Heather Ringo & Athena Kashyap
  • City College of San Francisco via ASCCC Open Educational Resources Initiative

This text offers instruction in analytical, critical, and argumentative writing, critical thinking, research strategies, information literacy, and proper documentation through the study of literary works from major genres, while developing students’ close reading skills and promoting an appreciation of the aesthetic qualities of literature.


Scientific Thinking and Critical Thinking in Science Education 

Two Distinct but Symbiotically Related Intellectual Processes

  • Open access
  • Published: 05 September 2023


  • Antonio García-Carmona   ORCID: orcid.org/0000-0001-5952-0340 1  


Scientific thinking and critical thinking are two intellectual processes considered key to the basic and comprehensive education of citizens. For this reason, their development is also regarded as one of the main objectives of science education. However, in the science education literature on the two types of thinking, the terms are frequently used interchangeably to refer to the same cognitive and metacognitive skills, usually leaving unclear how they differ and what they have in common. The present work therefore aimed to elucidate the differences and relationships between these two types of thinking. The conclusion reached was that, while they differ in the purposes of their application and in some skills or processes, they share others and are related symbiotically in a metaphorical sense; that is, each one makes sense or develops appropriately when it is nourished or enriched by the other. Finally, a guiding proposal is presented for the integrated development of the two types of thinking in science classes.


Education is not the learning of facts, but the training of the mind to think. Albert Einstein

1 Introduction

In consulting technical reports, theoretical frameworks, research, and curricular reforms related to science education, one commonly finds appeals to scientific thinking and critical thinking as essential educational processes or objectives. This is confirmed in studies that include exhaustive reviews of the literature in this regard, such as those of Bailin ( 2002 ), Costa et al. ( 2020 ), and Santos ( 2017 ) on critical thinking, and of Klahr et al. ( 2019 ) and Lehrer and Schauble ( 2006 ) on scientific thinking. However, it is generally difficult to conceptualize and differentiate the two types of thinking on the basis of these science education documents. In many cases, they are referred to without being defined, or they are used interchangeably to represent virtually the same thing. Thus, for example, the document A Framework for K-12 Science Education points out that “Critical thinking is required, whether in developing and refining an idea (an explanation or design) or in conducting an investigation” (National Research Council (NRC), 2012 , p. 46). The same document also refers to scientific thinking when it suggests that basic scientific education should “provide students with opportunities for a range of scientific activities and scientific thinking , including, but not limited to inquiry and investigation, collection and analysis of evidence, logical reasoning, and communication and application of information” (NRC, 2012 , p. 251).

A few years earlier, the report Science Teaching in Schools in Europe: Policies and Research (European Commission/Eurydice, 2006 ) included the dimension “scientific thinking” as part of standardized national science tests in European countries. This dimension consisted of three basic abilities: (i) to solve problems formulated in theoretical terms , (ii) to frame a problem in scientific terms , and (iii) to formulate scientific hypotheses . In contrast, critical thinking was not even mentioned in such a report. However, in subsequent similar reports by the European Commission/Eurydice ( 2011 , 2022 ), there are some references to the fact that the development of critical thinking should be a basic objective of science teaching, although these reports do not define it at any point.

The ENCIENDE report on early-year science education in Spain also includes an explicit allusion to critical thinking among its recommendations: “Providing students with learning tools means helping them to develop critical thinking , to form their own opinions, to distinguish between knowledge founded on the evidence available at a certain moment (evidence which can change) and unfounded beliefs” (Confederation of Scientific Societies in Spain (COSCE), 2011 , p. 62). However, the report makes no explicit mention of scientific thinking. More recently, the document “Enseñando ciencia con ciencia” (Teaching science with science) (Couso et al., 2020 ), sponsored by Spain’s Ministry of Education, also addresses critical thinking:

(…) with the teaching approach through guided inquiry students learn scientific content, learn to do science (procedures), learn what science is and how it is built, and this (...) helps to develop critical thinking , that is, to question any statement that is not supported by evidence. (Couso et al., 2020 , p. 54)

On the other hand, in referring to what is practically the same thing, the European report Science Education for Responsible Citizenship speaks of scientific thinking when it establishes that one of the challenges of scientific education should be: “To promote a culture of scientific thinking and inspire citizens to use evidence-based reasoning for decision making” (European Commission, 2015 , p. 14). However, the Pisa 2024 Strategic Vision and Direction for Science report does not mention scientific thinking but does mention critical thinking in noting that “More generally, (students) should be able to recognize the limitations of scientific inquiry and apply critical thinking when engaging with its results” (Organization for Economic Co-operation and Development (OECD), 2020 , p. 9).

The new Spanish science curriculum for basic education (Royal Decree 217/ 2022 ) does make explicit reference to scientific thinking. For example, one of the STEM (Science, Technology, Engineering, and Mathematics) competency descriptors for compulsory secondary education reads:

Use scientific thinking to understand and explain the phenomena that occur around them, trusting in knowledge as a motor for development, asking questions and checking hypotheses through experimentation and inquiry (...) showing a critical attitude about the scope and limitations of science. (p. 41,599)

Furthermore, when developing the curriculum for the subjects of physics and chemistry, the same provision clarifies that “The essence of scientific thinking is to understand what are the reasons for the phenomena that occur in the natural environment to then try to explain them through the appropriate laws of physics and chemistry” (Royal Decree 217/ 2022 , p. 41,659). However, within the science subjects (i.e., Biology and Geology, and Physics and Chemistry), critical thinking is not mentioned as such. Footnote 1 It is only more or less directly alluded to with such expressions as “critical analysis”, “critical assessment”, “critical reflection”, “critical attitude”, and “critical spirit”, with no attempt to conceptualize it as is done with regard to scientific thinking.

The above is just a small sample showing that the concepts of scientific thinking and critical thinking are differentiated only in some cases, while in others they are treated as interchangeable, with one or the other used indistinctly to refer to the same cognitive/metacognitive processes or practices. In fairness, however, it has to be acknowledged—as said at the beginning—that it is far from easy to conceptualize these two types of thinking (Bailin, 2002 ; Dwyer et al., 2014 ; Ennis, 2018 ; Lehrer & Schauble, 2006 ; Kuhn, 1993 , 1999 ), since they feed back on each other, partially overlap, and share certain features (Cáceres et al., 2020 ; Vázquez-Alonso & Manassero-Mas, 2018 ). Nor is there unanimity in the literature on how to characterize each of them, and they have rarely been analyzed comparatively (e.g., Hyytinen et al., 2019 ). For these reasons, I considered it necessary to address this issue in the present work in order to offer some guidelines for science teachers interested in exploring these two intellectual processes in greater depth so as to promote them in their classes.

2 An Attempt to Delimit Scientific Thinking in Science Education

For many years, cognitive science has been interested in studying what scientific thinking is and how it can be taught in order to improve students’ science learning (Klahr et al., 2019 ; Zimmerman & Klahr, 2018 ). To this end, Kuhn et al. propose characterizing science as argument (Kuhn, 1993 ; Kuhn et al., 2008 ). They argue that this is a suitable way of linking how scientists think with how students and the public in general think, since science is a social activity that is subject to ongoing debate, in which the construction of arguments plays a key role. Lehrer and Schauble ( 2006 ) link scientific thinking with scientific literacy, paying special attention to the different images of science. According to those authors, these images would guide the development of that literacy in class. The images of science that Lehrer and Schauble highlight as characterizing scientific thinking are: (i) science-as-logical reasoning (the role of domain-general forms of scientific reasoning, including formal logic, heuristics, and strategies applied in different fields of science), (ii) science-as-theory change (science is subject to permanent revision and change), and (iii) science-as-practice (scientific knowledge and reasoning are components of a larger set of activities that include rules of participation, procedural skills, epistemological knowledge, etc.).

Based on a literature review, Jirout ( 2020 ) defines scientific thinking as an intellectual process whose purpose is the intentional search for information about a phenomenon or facts by formulating questions, checking hypotheses, carrying out observations, recognizing patterns, and making inferences (a detailed description of all these scientific practices or competencies can be found, for example, in NRC, 2012 ; OECD, 2019 ). Therefore, for Jirout, the development of scientific thinking would involve bringing into play the basic science skills/practices common to the inquiry-based approach to learning science (García-Carmona, 2020 ; Harlen, 2014 ). For other authors, scientific thinking would include a whole spectrum of scientific reasoning competencies (Krell et al., 2022 ; Moore, 2019 ; Tytler & Peterson, 2004 ). However, these competencies usually cover the same science skills/practices mentioned above. Indeed, a conceptual overlap between scientific thinking, scientific reasoning, and scientific inquiry is often found in science education goals (Krell et al., 2022 ), although, according to Lehrer and Schauble ( 2006 ), scientific thinking is a broader construct that encompasses the other two.

It could be said that scientific thinking is a particular way of searching for information using science practices Footnote 2 (Klahr et al., 2019 ; Zimmerman & Klahr, 2018 ; Vázquez-Alonso & Manassero-Mas, 2018 ). This intellectual process provides the individual with the ability to evaluate the robustness of evidence for or against a certain idea in order to explain a phenomenon (Clouse, 2017 ). But the development of scientific thinking also requires metacognitive processes. As Kuhn ( 2022 ) argues, metacognition is fundamental to the ongoing monitoring or revision of what an individual thinks and knows, as well as of what the other individuals with whom they interact think and know, when engaging in scientific practices. In short, scientific thinking demands a good connection between reasoning and metacognition (Kuhn, 2022 ). Footnote 3

From that perspective, Zimmerman and Klahr ( 2018 ) have synthesized a taxonomy categorizing scientific thinking, relating cognitive processes to the corresponding science practices (Table 1 ). It has to be noted that this taxonomy was prepared in line with the categorization of scientific practices proposed in the document A Framework for K-12 Science Education (NRC, 2012 ). This is why one needs to understand that, for example, the cognitive process of elaborating and refining hypotheses is not explicitly associated with the scientific practice of hypothesizing but only with the formulation of questions. Indeed, the K-12 Framework document does not establish hypothesis formulation as a basic scientific practice. Lederman et al. ( 2014 ) justify this by arguing that not all scientific research necessarily allows or requires the verification of hypotheses, for example, in cases of exploratory or descriptive research. However, the aforementioned document (NRC, 2012 , p. 50) does refer to hypotheses when describing the practice of developing and using models , noting that models facilitate the testing of hypothetical explanations .

In the literature, there are also other interesting taxonomies characterizing scientific thinking for educational purposes. One of them is that of Vázquez-Alonso and Manassero-Mas ( 2018 ) who, instead of science practices, refer to skills associated with scientific thinking . Their characterization basically consists of breaking down in greater detail the content of those science practices related to the different cognitive and metacognitive processes of scientific thinking. Also, unlike Zimmerman and Klahr’s ( 2018 ) proposal, Vázquez-Alonso and Manassero-Mas’s ( 2018 ) proposal explicitly mentions metacognition as one of the aspects of scientific thinking, which they call meta-process . In my opinion, the latter proposal, which unpacks scientific thinking into a broader range of skills/practices, may be more conducive to addressing it in science classes, as teachers would have more options to choose from when addressing components of this intellectual process, depending on their teaching interests, the educational needs of their students, and/or the learning objectives pursued. Table 2 presents an adapted characterization of Vázquez-Alonso and Manassero-Mas’s ( 2018 ) proposal for addressing scientific thinking in science education.

3 Contextualization of Critical Thinking in Science Education

Theorization and research about critical thinking also have a long tradition in the field of the psychology of learning (Ennis, 2018 ; Kuhn, 1999 ), and the application of critical thinking extends far beyond science education (Dwyer et al., 2014 ). Indeed, the development of critical thinking is commonly accepted as an essential goal of people’s overall education (Ennis, 2018 ; Hitchcock, 2017 ; Kuhn, 1999 ; Willingham, 2008 ). However, its conceptualization is not simple, and there is no unanimous position on it in the literature (Costa et al., 2020 ; Dwyer et al., 2014 ), especially when trying to relate it to scientific thinking. Thus, while Tena-Sánchez and León-Medina ( 2022 ) Footnote 4 and McBain et al. ( 2020 ) consider critical thinking to be the basis of, or to form part of, scientific thinking, Dowd et al. ( 2018 ) understand scientific thinking to be just a subset of critical thinking. However, Vázquez-Alonso and Manassero-Mas ( 2018 ) do not seek to determine whether critical thinking encompasses scientific thinking or vice versa. They consider that the two types of thinking share numerous skills/practices and that the progressive development of one fosters the development of the other in a virtuous circle of improvement. Other authors, such as Schafersman ( 1991 ), even go so far as to say that critical thinking and scientific thinking are the same thing. In addition, some views on the relationship between critical thinking and scientific thinking seem to be context-dependent. For example, Hyytinen et al. ( 2019 ) point out that, when scientific thinking is viewed as a component of critical thinking, the former is often used to designate evidence-based thinking in the sciences, a view that tends to dominate in Europe but not in the US context. Perhaps because of this lack of consensus, the two types of thinking are often confused, overlapping, or conceived of as interchangeable in education.

Even with such a lack of unanimity or consensus, there are some interesting theoretical frameworks and definitions for the development of critical thinking in education. One of the most popular definitions of critical thinking is that proposed by The National Council for Excellence in Critical Thinking (1987, cited in Inter-American Teacher Education Network, 2015 , p. 6). This conceives of it as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action”. In other words, critical thinking can be regarded as a reflective and reasonable kind of thinking that provides people with the ability to evaluate multiple defensible statements or positions and then decide which is the most defensible (Clouse, 2017 ; Ennis, 2018 ). It thus requires, in addition to basic scientific competency, notions about epistemology (Kuhn, 1999 ) to understand how knowledge is constructed. Similarly, it requires skills for metacognition (Hyytinen et al., 2019 ; Kuhn, 1999 ; Magno, 2010 ), since critical thinking “entails awareness of one’s own thinking and reflection on the thinking of self and others as objects of cognition” (Dean & Kuhn, 2003 , p. 3).

In science education, one of the most suitable scenarios or resources, but not the only one, Footnote 5 for addressing all these aspects of critical thinking is the analysis of socioscientific issues (SSI) (Taylor et al., 2006 ; Zeidler & Nichols, 2009 ). Without wishing to expand on this here, I will only say that interesting works can be found in the literature analyzing how the discussion of SSIs can favor the development of critical thinking skills (see, e.g., López-Fernández et al., 2022 ; Solbes et al., 2018 ). For example, López-Fernández et al. ( 2022 ) focused their teaching-learning sequence on the following critical thinking skills: information analysis, argumentation, decision making, and communication of decisions. Some authors even add the nature of science (NOS) to this framework (i.e., SSI-NOS-critical thinking), as, for example, Yacoubian and Khishfe ( 2018 ) do in order to develop critical thinking, which can also favor the understanding of NOS (Yacoubian, 2020 ). In effect, as I argued in another work on the COVID-19 pandemic as an SSI, in which special emphasis was placed on critical thinking, an informed understanding of how science works would have helped the public understand why scientists were changing their criteria for dealing with the pandemic in the light of new data and its reinterpretations, or why it was not possible to obtain an effective and safe medical treatment for the disease any faster (García-Carmona, 2021b ).

In the recent literature, there have also been some proposals intended to characterize critical thinking in the context of science education. Table 3 presents two of these by way of example. As can be seen, both proposals share various components for the development of critical thinking (respect for evidence, critically analyzing/assessing the validity/reliability of information, adoption of independent opinions/decisions, participation, etc.), but that of Blanco et al. ( 2017 ) is more clearly contextualized in science education. Likewise, their proposal includes additional aspects (or at least makes them more explicit), such as developing epistemological Footnote 6 knowledge of science (vision of science…) and of its interactions with technology, society, and environment (STSA relationships), as well as communication skills. It therefore offers a wider range of options for choosing critical thinking skills/processes to promote in science classes. However, neither proposal refers to metacognitive skills, which are also essential for developing critical thinking (Kuhn, 1999 ).

3.1 Critical thinking vs. scientific thinking in science education: differences and similarities

In accordance with the above, it could be said that scientific thinking is nourished by critical thinking, especially when deciding between several possible interpretations and explanations of the same phenomenon, since this generally takes place in a context of debate in the scientific community (Acevedo-Díaz & García-Carmona, 2017 ). Thus, the scientific attitude that is perhaps most clearly linked to critical thinking is the skepticism with which scientists tend to greet new ideas (Normand, 2008 ; Sagan, 1987 ; Tena-Sánchez and León-Medina, 2022 ), especially if they are contrary to well-established scientific knowledge (Bell, 2009 ). A good example of this was the OPERA experiment (García-Carmona & Acevedo-Díaz, 2016a ), which initially seemed to find that neutrinos could move faster than the speed of light. Had it stood, this finding would have invalidated Albert Einstein’s theory of relativity (the finding was later proved wrong). In response, Nobel laureate in physics Sheldon L. Glashow went so far as to state that:

the result obtained by the OPERA collaboration cannot be correct. If it were, we would have to give up so many things, it would be such a huge sacrifice... But if it is, I am officially announcing it: I will shout to Mother Nature: I’m giving up! And I will give up Physics. (BBVA Foundation, 2011 )

Indeed, scientific thinking is ultimately focused on obtaining evidence that may support an idea or explanation about a phenomenon, and consequently allow others that are less convincing or precise to be discarded. Therefore, when, with the evidence available, science has more than one equally defensible position with respect to a problem, the investigation is considered inconclusive (Clouse, 2017 ). In certain cases, this gives rise to scientific controversies (Acevedo-Díaz & García-Carmona, 2017 ) which are not always resolved exclusively on the basis of epistemic or rational factors (Elliott & McKaughan, 2014 ; Vallverdú, 2005 ). Hence, it is also necessary to integrate non-epistemic practices into the framework of scientific thinking (García-Carmona, 2021a ; García-Carmona & Acevedo-Díaz, 2018 ), practices that transcend purely rational or cognitive processes, including, for example, those related to emotional or affective issues (Sinatra & Hofer, 2021 ). From an educational point of view, this suggests that for students to become more authentically immersed in the way of working or thinking scientifically, they should also learn to feel as scientists do when they carry out their work (Davidson et al., 2020 ). Davidson et al. ( 2020 ) call this epistemic affect , and they suggest that it could be approached in science classes by teaching students to manage their frustrations when they fail to achieve the expected results, Footnote 7 or, for example, to moderate their enthusiasm over favorable results in a scientific inquiry by activating a certain skepticism that encourages them to do more testing. And, as mentioned above, for some authors, having a skeptical attitude is one of the actions that best exemplifies the application of critical thinking within the framework of scientific thinking (Normand, 2008 ; Sagan, 1987 ; Tena-Sánchez and León-Medina, 2022 ).

On the other hand, critical thinking also draws on many of the skills or practices of scientific thinking, as discussed above. However, in contrast to scientific thinking, the coexistence of two or more defensible ideas is not, in principle, a problem for critical thinking, since its purpose is not so much to invalidate some ideas or explanations with respect to others, but rather to provide individuals with the foundations on which to align themselves with the idea/argument they find most defensible among several possible ones (Ennis, 2018 ). For example, science with its methods has managed to explain the greenhouse effect, the phenomenon of the tides, and the transmission mechanism of the coronavirus. To do so, it had to discard other possible explanations because they proved less valid in the investigations carried out. These are therefore issues resolved by the scientific community that generate hardly any discussion at the present time. However, taking a position for or against the production of energy in nuclear power plants transcends the scope of scientific thinking, since both positions are, in principle, equally defensible. Indeed, within the scientific community itself there are supporters and detractors of the two positions, based on the same scientific knowledge. Consequently, it is critical thinking, which requires the command of scientific knowledge and skills, a basic understanding of epistemic (rational or cognitive) and non-epistemic (social, ethical/moral, economic, psychological, cultural, ...) aspects of the nature of science, as well as metacognitive skills, that helps individuals forge a personal foundation on which to position themselves in one place or another, or to maintain an uncertain, undecided opinion.

In view of the above, one can summarize that scientific thinking and critical thinking are two different intellectual processes in terms of purpose, but that they are related symbiotically (i.e., one would make no sense without the other, and each feeds on the other) and that, in their performance, they share a fair number of features, actions, or mental skills. According to Cáceres et al. ( 2020 ) and Hyytinen et al. ( 2019 ), the intellectual skills most clearly common to both types of thinking would be searching for relationships between evidence and explanations , as well as investigating and thinking logically to make inferences . To this common space, I would also add skills for metacognition, in accordance with what has been discussed about both types of thinking (Kuhn, 1999 , 2022 ).

In order to compile in a compact way all that has been argued so far, Table 4 presents my overview of the relationship between scientific thinking and critical thinking. I would like to point out that the compilation is not intended to be exhaustive, in the sense that more elements could possibly be added in the different sections; rather, it aims above all to represent the aspects that distinguish the two types of thinking, the aspects they share, and the mutual enrichment (or symbiosis) between them.

4 A Proposal for the Integrated Development of Critical Thinking and Scientific Thinking in Science Classes

Once the differences, common aspects, and relationships between critical thinking and scientific thinking have been discussed, it is relevant to establish some type of specific proposal for fostering them in science classes. Table 5 includes a possible script for addressing various skills or processes of both types of thinking in an integrated manner. However, before giving guidance on how such skills/processes could be approached, I would like to clarify that, while all of them could be dealt with in the context of a single school activity, I will not present them that way. First, because doing so could give the impression that the proposal is only valid if it is applied all at once in a specific learning situation, which could also discourage science teachers from implementing it in class due to a lack of time or training. Second, I think it is more useful to conceive of the proposal as a set of thinking skills or actions that can be dealt with across different science content, selecting only some of them (if so decided) according to the educational needs or characteristics of the learning situation posed in each case. Therefore, in the orientations for each point of the script, or for groupings of points, I will use different examples and/or contexts. Likewise, these orientations, in the form of comments, although grounded in the literature, should be considered only as possibilities among many others.

Motivation and predisposition to reflect and discuss (point i ) demands, on the one hand, that issues be chosen which are attractive to the students. This can be achieved, for example, by asking the students directly what current issues related to science and its impact or repercussions they would like to learn about, and then deciding which issue to focus on (García-Carmona, 2008 ). Alternatively, the teacher can put forward the issue directly in class, trying to ensure that it is current, present in the media and social networks, etc., or that, based on their teaching experience, it is likely to interest their students. In this way, each student is encouraged to feel addressed or concerned as a citizen by the issue that is going to be discussed (García-Carmona, 2008 ). Also of possible interest is the analysis of contemporary, as yet unresolved socioscientific issues (Solbes et al., 2018 ), such as climate change, science and social justice, transgenic foods, homeopathy, and alcohol and drug use in society. Everyday questions that demand a decision can also be investigated, such as “What car to buy?” (Moreno-Fontiveros et al., 2022 ) or “How can we prevent the arrival of another pandemic?” (Ushola & Puig, 2023 ).

On the other hand, it is essential that the discussion about the chosen issue is planned through an instructional process that generates an environment conducive to reflection and debate, with a view to engaging the students’ participation in it. This can be achieved, for example, by setting up a role-play game (Blanco-López et al., 2017 ), especially if the issue is socioscientific, or by critical and reflective reading of advertisements with scientific content (Campanario et al., 2001 ) or of science-related news in the daily media (García-Carmona, 2014 , 2021a ; Guerrero-Márquez & García-Carmona, 2020 ; Oliveras et al., 2013 ), etc., for subsequent discussion—all this, in a collaborative learning setting and with a clear democratic spirit.

Respect for scientific evidence (point ii ) should be the indispensable condition in any analysis and discussion carried out through the prisms of scientific and critical thinking (Erduran, 2021 ). Although scientific knowledge may be tinged with subjectivity during its construction and is revisable in the light of new evidence ( tentativeness of scientific knowledge), once it is accepted by the scientific community it is as objective as possible (García-Carmona & Acevedo-Díaz, 2016b ). Therefore, promoting trust in and respect for scientific evidence should be one of the primary educational challenges in combating pseudoscientists and science deniers (Díaz & Cabrera, 2022 ), whose arguments are based on false beliefs and assumptions, anecdotes, and conspiracy theories (Normand, 2008 ). Nevertheless, promoting trust in and respect for scientific evidence is no simple task (Fackler, 2021 ), since science deniers, for example, consider science unreliable because it is imperfect (McIntyre, 2021 ). Hence the need to promote a basic understanding of NOS (point iii ) as a fundamental pillar for the development of both scientific thinking and critical thinking. A good way to do this would be through explicit and reflective discussion of controversies from the history of science (Acevedo-Díaz & García-Carmona, 2017 ) or contemporary controversies (García-Carmona, 2021b ; García-Carmona & Acevedo-Díaz, 2016a ).

Also, with respect to point iii of the proposal, it is necessary to draw on basic scientific knowledge when developing scientific and critical thinking skills (Willingham, 2008 ). Without it, it will be impossible to develop a minimally serious and convincing argument on the issue being analyzed. For example, if one does not know the transmission mechanism of a certain disease, it is likely to be very difficult to understand or justify certain patterns of social behavior in the face of it. In general, possessing appropriate scientific knowledge on the issue in question helps one make the best interpretation of the data and evidence available on that issue (OECD, 2019 ).

Searching for information from reliable sources, together with analyzing and interpreting it (points iv to vi ), are essential practices both in purely scientific contexts (e.g., learning about the behavior of a given physical phenomenon from the literature or through inquiry) and in the application of critical thinking (e.g., when one wishes to take a personal, but informed, position on a particular socioscientific issue). With regard to determining the credibility of information with scientific content on the Internet, Osborne et al. ( 2022 ) propose, among other strategies, checking whether the source is free of conflicts of interest, i.e., whether or not it is biased by ideological, political, or economic motives. It should also be checked whether the source and the author(s) of the information are sufficiently reputable.

Regarding the interpretation of data and evidence, several studies have shown the difficulties that students often have with this practice in the context of inquiry activities (e.g., Gobert et al., 2018 ; Kanari & Millar, 2004 ; Pols et al., 2021 ) or when analyzing science news in the press (Norris et al., 2003 ). Students have also been found to have significant difficulties in choosing the most appropriate data to support their arguments in causal analyses (Kuhn & Modrek, 2022 ). However, it must be recognized that making interpretations or inferences from data is not a simple task, among other reasons because their construction is influenced by multiple factors, both epistemic (prior knowledge, experimental designs, etc.) and non-epistemic (personal expectations, ideology, sociopolitical context, etc.), which means that such interpretations are not always the same for all scientists (García-Carmona, 2021a ; García-Carmona & Acevedo-Díaz, 2018 ). For this reason, this scientific practice constitutes one of the phases or processes that generates the most debate or discussion in a scientific community until consensus is reached. In order to improve students’ practice of making inferences, Kuhn and Lerman ( 2021 ) propose activities that help them develop their own epistemological norms for connecting their statements causally with the available evidence.

Point vii refers, on the one hand, to an essential scientific practice: the elaboration of evidence-based scientific explanations which, in a reasoned way, generally account for the causality, properties, and/or behavior of phenomena (Brigandt, 2016). On the other hand, point vii concerns the practice of argumentation. Unlike scientific explanation, argumentation tries to justify an idea, explanation, or position with the clear purpose of persuading those who defend different ones (Osborne & Patterson, 2011). As noted above, the complexity of most socioscientific issues implies that they have no single valid solution or response. Therefore, the content of the arguments used to defend one position or another is not always based solely on purely rational factors such as data and scientific evidence. Some authors defend the need to also address non-epistemic aspects of the nature of science when teaching it (García-Carmona, 2021a; García-Carmona & Acevedo-Díaz, 2018), since many scientific and socioscientific controversies are resolved by factors that go beyond the purely epistemic (Vallverdú, 2005).

To defend an idea or position taken on an issue, it is not enough to have scientific evidence that supports it; skills for communicating and discussing ideas are also essential (point viii). The history of science shows how the difficulties some scientists had in communicating their ideas scientifically led to those ideas not being accepted at the time. A good example for students to become aware of this is the historical case of Semmelweis and puerperal fever (Aragón-Méndez et al., 2019). Its reflective reading makes it possible to conclude that this doctor's proposal that gynecologists disinfect their hands when passing from one parturient to another, in order to avoid the contagions that provoked the fever, was rejected by the medical community not only for epistemic reasons but also because of the difficulties he had in communicating his idea. The history of science also reveals that, at certain historical moments, some scientific interpretations were imposed on others thanks to the rhetorical skills of their proponents, even though none of the explanations accounted convincingly for the phenomenon under study. An example is the controversy between Pasteur and Liebig about the phenomenon of fermentation (García-Carmona & Acevedo-Díaz, 2017), whose reading and discussion in science class would also be recommended in the context of this critical and scientific thinking skill. With the COVID-19 pandemic, for example, the arguments of some charlatans in the media and on social networks managed to gain a certain influence over the population, even though scientifically they were muddled nonsense (García-Carmona, 2021b). The reflective reading of news on current SSIs such as this therefore also constitutes a good resource for the same educational purpose. In general, according to Spektor-Levy et al. (2009), scientific communication skills should be addressed explicitly in class, in a progressive and continuous manner, including tasks of information seeking, reading, scientific writing, representation of information, and representation of the knowledge acquired.

Finally (point ix), a good scientific/critical thinker must be aware of what they know and of what they doubt or do not know, and to this end must continuously practice metacognitive exercises (Dean & Kuhn, 2003; Hyytinen et al., 2019; Magno, 2010; Willingham, 2008). At the same time, they must recognize the weaknesses and strengths of their peers' arguments in a debate, be self-critical when necessary, and revise their own ideas and arguments in order to improve and reorient them (self-regulation). I see one of the keys to both scientific and critical thinking as the capacity, and the willingness, to change one's mind without this being frowned upon; quite the opposite, since one assumes that such a change occurs because the arguments have become enriched and more solidly founded. In other words, scientific and critical thinking do not sit well with arrogance or haughtiness towards the rectification of ideas or opinions.

5 Final Remarks

For decades, scientific thinking and critical thinking have received particular attention from different disciplines, such as psychology, philosophy, and pedagogy, and from specific areas of the latter such as science education. The two types of thinking represent intellectual processes whose development in students, and in society in general, is considered indispensable for the exercise of responsible citizenship in accord with the demands of today's society (European Commission, 2006, 2015; NRC, 2012; OECD, 2020). As has been shown, however, the task of conceptualizing them is complex, and teaching students to think scientifically and critically is a difficult educational challenge (Willingham, 2008).

Aware of this, and after many years dedicated to science education, I felt the need to organize my ideas regarding these two types of thinking. In consulting the literature about them, I found that, in many publications, scientific thinking and critical thinking are presented or perceived as interchangeable or indistinguishable, a conclusion also shared by Hyytinen et al. (2019). Rarely have their differences, relationships, or common features been explicitly studied. I therefore considered this a matter that needed to be addressed because, in science education, the development of scientific thinking is an inherent objective, but when critical thinking is added to the learning objectives, more than reasonable doubts arise about when one, the other, or both at the same time are being used. The present work came about motivated by this, with the intention of making a particular contribution, grounded in the relevant literature, to advancing the question raised. It converges in conceiving scientific thinking and critical thinking as two intellectual processes that overlap and feed into each other in many respects but differ with regard to certain cognitive skills and in terms of their purpose. In the case of scientific thinking, the aim is to choose the best possible explanation of a phenomenon based on the available evidence, which therefore involves rejecting alternative explanatory proposals that prove less coherent or convincing. From the perspective of critical thinking, by contrast, the purpose is to choose the most defensible idea/option among others that are also defensible, using both scientific and extra-scientific (i.e., moral, ethical, political, etc.) arguments. With this in mind, I have described a proposal to guide their development in the classroom, integrating them under a conception that I have called, metaphorically, a symbiotic relationship between two modes of thinking.

Notes

Critical thinking is explicitly mentioned in other subjects of the curricular provisions, such as Education in Civics and Ethical Values or Geography and History (Royal Decree 217/2022).

García-Carmona (2021a) conceives of them as activities that require the comprehensive application of procedural skills, cognitive and metacognitive processes, and both scientific knowledge and knowledge of the nature of scientific practice.

Kuhn (2021) argues that the relationship between scientific reasoning and metacognition is especially fostered by what she calls inhibitory control, which basically consists of breaking down a thought into parts in such a way that attention to some of those parts is inhibited, allowing a focused examination of the intended mental content.

Specifically, Tena-Sánchez and León-Medina (2020) assume that critical thinking is at the basis of rational or scientific skepticism that leads to questioning any claim that does not have empirical support.

As discussed in the introduction, the inquiry-based approach is also considered conducive to addressing critical thinking in science education (Couso et al., 2020 ; NRC, 2012 ).

Epistemic skills should not be confused with epistemological knowledge (García-Carmona, 2021a ). The former refers to skills to construct, evaluate, and use knowledge, and the latter to understanding about the origin, nature, scope, and limits of scientific knowledge.

For this purpose, it can be very useful to address in class, with the help of the history and philosophy of science, that scientists get more wrong than right in their research, and that error is always an opportunity to learn (García-Carmona & Acevedo-Díaz, 2018 ).

References

Acevedo-Díaz, J. A., & García-Carmona, A. (2017). Controversias en la historia de la ciencia y cultura científica [Controversies in the history of science and scientific culture]. Los Libros de la Catarata.

Aragón-Méndez, M. D. M., Acevedo-Díaz, J. A., & García-Carmona, A. (2019). Prospective biology teachers’ understanding of the nature of science through an analysis of the historical case of Semmelweis and childbed fever. Cultural Studies of Science Education , 14 (3), 525–555. https://doi.org/10.1007/s11422-018-9868-y

Bailin, S. (2002). Critical thinking and science education. Science & Education, 11 (4), 361–375. https://doi.org/10.1023/A:1016042608621


BBVA Foundation (2011). El Nobel de Física Sheldon L. Glashow no cree que los neutrinos viajen más rápido que la luz [Physics Nobel laureate Sheldon L. Glashow does not believe neutrinos travel faster than light]. https://www.fbbva.es/noticias/nobel-fisica-sheldon-l-glashow-no-cree-los-neutrinos-viajen-mas-rapido-la-luz/. Accessed 5 February 2023.

Bell, R. L. (2009). Teaching the nature of science: Three critical questions. In Best Practices in Science Education . National Geographic School Publishing.


Blanco-López, A., España-Ramos, E., & Franco-Mariscal, A. J. (2017). Estrategias didácticas para el desarrollo del pensamiento crítico en el aula de ciencias [Teaching strategies for the development of critical thinking in the teaching of science]. Ápice. Revista de Educación Científica, 1 (1), 107–115. https://doi.org/10.17979/arec.2017.1.1.2004

Brigandt, I. (2016). Why the difference between explanation and argument matters to science education. Science & Education, 25 (3-4), 251–275. https://doi.org/10.1007/s11191-016-9826-6

Cáceres, M., Nussbaum, M., & Ortiz, J. (2020). Integrating critical thinking into the classroom: A teacher’s perspective. Thinking Skills and Creativity, 37 , 100674. https://doi.org/10.1016/j.tsc.2020.100674

Campanario, J. M., Moya, A., & Otero, J. (2001). Invocaciones y usos inadecuados de la ciencia en la publicidad [Invocations and misuses of science in advertising]. Enseñanza de las Ciencias, 19 (1), 45–56. https://doi.org/10.5565/rev/ensciencias.4013

Clouse, S. (2017). Scientific thinking is not critical thinking. https://medium.com/extra-extra/scientific-thinking-is-not-critical-thinking-b1ea9ebd8b31

Confederación de Sociedades Científicas de España [COSCE]. (2011). Informe ENCIENDE: Enseñanza de las ciencias en la didáctica escolar para edades tempranas en España [ENCIENDE report: Science education for early ages in Spain]. COSCE.

Costa, S. L. R., Obara, C. E., & Broietti, F. C. D. (2020). Critical thinking in science education publications: the research contexts. International Journal of Development Research, 10 (8), 39438. https://doi.org/10.37118/ijdr.19437.08.2020

Couso, D., Jiménez-Liso, M.R., Refojo, C. & Sacristán, J.A. (coords.) (2020). Enseñando ciencia con ciencia [Teaching science with science]. FECYT & Fundacion Lilly / Penguin Random House

Davidson, S. G., Jaber, L. Z., & Southerland, S. A. (2020). Emotions in the doing of science: Exploring epistemic affect in elementary teachers' science research experiences. Science Education, 104 (6), 1008–1040. https://doi.org/10.1002/sce.21596

Dean, D., & Kuhn, D. (2003). Metacognition and critical thinking. ERIC document. Reproduction No. ED477930 . https://files.eric.ed.gov/fulltext/ED477930.pdf

Díaz, C., & Cabrera, C. (2022). Desinformación científica en España . FECYT/IBERIFIER https://www.fecyt.es/es/publicacion/desinformacion-cientifica-en-espana

Dowd, J. E., Thompson, R. J., Jr., Schiff, L. A., & Reynolds, J. A. (2018). Understanding the complex relationship between critical thinking and science reasoning among undergraduate thesis writers. CBE—Life Sciences Education, 17(1), ar4. https://doi.org/10.1187/cbe.17-03-0052

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity, 12 , 43–52. https://doi.org/10.1016/j.tsc.2013.12.004

Elliott, K. C., & McKaughan, D. J. (2014). Non-epistemic values and the multiple goals of science. Philosophy of Science, 81 (1), 1–21. https://doi.org/10.1086/674345

Ennis, R. H. (2018). Critical thinking across the curriculum: A vision. Topoi, 37 (1), 165–184. https://doi.org/10.1007/s11245-016-9401-4

Erduran, S. (2021). Respect for evidence: Can science education deliver it? Science & Education, 30 (3), 441–444. https://doi.org/10.1007/s11191-021-00245-8

European Commission. (2015). Science education for responsible citizenship . Publications Office https://op.europa.eu/en/publication-detail/-/publication/a1d14fa0-8dbe-11e5-b8b7-01aa75ed71a1

European Commission / Eurydice. (2011). Science education in Europe: National policies, practices and research . Publications Office. https://op.europa.eu/en/publication-detail/-/publication/bae53054-c26c-4c9f-8366-5f95e2187634

European Commission / Eurydice. (2022). Increasing achievement and motivation in mathematics and science learning in schools . Publications Office. https://eurydice.eacea.ec.europa.eu/publications/mathematics-and-science-learning-schools-2022

European Commission/Eurydice. (2006). Science teaching in schools in Europe. Policies and research . Publications Office. https://op.europa.eu/en/publication-detail/-/publication/1dc3df34-acdf-479e-bbbf-c404fa3bee8b

Fackler, A. (2021). When science denial meets epistemic understanding. Science & Education, 30 (3), 445–461. https://doi.org/10.1007/s11191-021-00198-y

García-Carmona, A. (2008). Relaciones CTS en la educación científica básica. II. Investigando los problemas del mundo [STS relationships in basic science education II. Researching the world problems]. Enseñanza de las Ciencias, 26 (3), 389–402. https://doi.org/10.5565/rev/ensciencias.3750

García-Carmona, A. (2014). Naturaleza de la ciencia en noticias científicas de la prensa: Análisis del contenido y potencialidades didácticas [Nature of science in press articles about science: Content analysis and pedagogical potential]. Enseñanza de las Ciencias, 32 (3), 493–509. https://doi.org/10.5565/rev/ensciencias.1307

García-Carmona, A., & Acevedo-Díaz, J. A. (2016a). Learning about the nature of science using newspaper articles with scientific content. Science & Education, 25(5–6), 523–546. https://doi.org/10.1007/s11191-016-9831-9

García-Carmona, A., & Acevedo-Díaz, J. A. (2016b). Concepciones de estudiantes de profesorado de Educación Primaria sobre la naturaleza de la ciencia: Una evaluación diagnóstica a partir de reflexiones en equipo [Preservice elementary teachers' conceptions of the nature of science: a diagnostic evaluation based on team reflections]. Revista Mexicana de Investigación Educativa, 21 (69), 583–610. https://www.redalyc.org/articulo.oa?id=14045395010

García-Carmona, A., & Acevedo-Díaz, J. A. (2017). Understanding the nature of science through a critical and reflective analysis of the controversy between Pasteur and Liebig on fermentation. Science & Education, 26 (1–2), 65–91. https://doi.org/10.1007/s11191-017-9876-4

García-Carmona, A., & Acevedo-Díaz, J. A. (2018). The nature of scientific practice and science education. Science & Education, 27 (5–6), 435–455. https://doi.org/10.1007/s11191-018-9984-9

García-Carmona, A. (2020). From inquiry-based science education to the approach based on scientific practices. Science & Education, 29 (2), 443–463. https://doi.org/10.1007/s11191-020-00108-8

García-Carmona, A. (2021a). Prácticas no-epistémicas: ampliando la mirada en el enfoque didáctico basado en prácticas científicas [Non-epistemic practices: extending the view in the didactic approach based on scientific practices]. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 18 (1), 1108. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2021.v18.i1.1108

García-Carmona, A. (2021b). Learning about the nature of science through the critical and reflective reading of news on the COVID-19 pandemic. Cultural Studies of Science Education, 16 (4), 1015–1028. https://doi.org/10.1007/s11422-021-10092-2

Guerrero-Márquez, I., & García-Carmona, A. (2020). La energía y su impacto socioambiental en la prensa digital: temáticas y potencialidades didácticas para una educación CTS [Energy and its socio-environmental impact in the digital press: issues and didactic potentialities for STS education]. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 17(3), 3301. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2020.v17.i3.3301

Gobert, J. D., Moussavi, R., Li, H., Sao Pedro, M., & Dickler, R. (2018). Real-time scaffolding of students’ online data interpretation during inquiry with Inq-ITS using educational data mining. In M. E. Auer, A. K. M. Azad, A. Edwards, & T. de Jong (Eds.), Cyber-physical laboratories in engineering and science education (pp. 191–217). Springer.


Harlen, W. (2014). Helping children’s development of inquiry skills. Inquiry in Primary Science Education, 1 (1), 5–19. https://ipsejournal.files.wordpress.com/2015/03/3-ipse-volume-1-no-1-wynne-harlen-p-5-19.pdf

Hitchcock, D. (2017). Critical thinking as an educational ideal. In On reasoning and argument (pp. 477–497). Springer.

Hyytinen, H., Toom, A., & Shavelson, R. J. (2019). Enhancing scientific thinking through the development of critical thinking in higher education. In M. Murtonen & K. Balloo (Eds.), Redefining scientific thinking for higher education . Palgrave Macmillan.

Jiménez-Aleixandre, M. P., & Puig, B. (2022). Educating critical citizens to face post-truth: the time is now. In B. Puig & M. P. Jiménez-Aleixandre (Eds.), Critical thinking in biology and environmental education, Contributions from biology education research (pp. 3–19). Springer.

Jirout, J. J. (2020). Supporting early scientific thinking through curiosity. Frontiers in Psychology, 11 , 1717. https://doi.org/10.3389/fpsyg.2020.01717

Kanari, Z., & Millar, R. (2004). Reasoning from data: How students collect and interpret data in science investigations. Journal of Research in Science Teaching, 41 (7), 748–769. https://doi.org/10.1002/tea.20020

Klahr, D., Zimmerman, C., & Matlen, B. J. (2019). Improving students’ scientific thinking. In J. Dunlosky & K. A. Rawson (Eds.), The Cambridge handbook of cognition and education (pp. 67–99). Cambridge University Press.

Krell, M., Vorholzer, A., & Nehring, A. (2022). Scientific reasoning in science education: from global measures to fine-grained descriptions of students’ competencies. Education Sciences, 12 , 97. https://doi.org/10.3390/educsci12020097

Kuhn, D. (1993). Science as argument: Implications for teaching and learning scientific thinking. Science education, 77 (3), 319–337. https://doi.org/10.1002/sce.3730770306

Kuhn, D. (1999). A developmental model of critical thinking. Educational Researcher, 28 (2), 16–46. https://doi.org/10.3102/0013189X028002016

Kuhn, D. (2022). Metacognition matters in many ways. Educational Psychologist, 57 (2), 73–86. https://doi.org/10.1080/00461520.2021.1988603

Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23 (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006

Kuhn, D., & Lerman, D. (2021). Yes but: Developing a critical stance toward evidence. International Journal of Science Education, 43 (7), 1036–1053. https://doi.org/10.1080/09500693.2021.1897897

Kuhn, D., & Modrek, A. S. (2022). Choose your evidence: Scientific thinking where it may most count. Science & Education, 31 (1), 21–31. https://doi.org/10.1007/s11191-021-00209-y

Lederman, J. S., Lederman, N. G., Bartos, S. A., Bartels, S. L., Meyer, A. A., & Schwartz, R. S. (2014). Meaningful assessment of learners' understandings about scientific inquiry—The views about scientific inquiry (VASI) questionnaire. Journal of Research in Science Teaching, 51 (1), 65–83. https://doi.org/10.1002/tea.21125

Lehrer, R., & Schauble, L. (2006). Scientific thinking and science literacy. In K. A. Renninger, I. E. Sigel, W. Damon, & R. M. Lerner (Eds.), Handbook of child psychology: Child psychology in practice (pp. 153–196). John Wiley & Sons, Inc.

López-Fernández, M. D. M., González-García, F., & Franco-Mariscal, A. J. (2022). How can socio-scientific issues help develop critical thinking in chemistry education? A reflection on the problem of plastics. Journal of Chemical Education, 99 (10), 3435–3442. https://doi.org/10.1021/acs.jchemed.2c00223

Magno, C. (2010). The role of metacognitive skills in developing critical thinking. Metacognition and Learning, 5 , 137–156. https://doi.org/10.1007/s11409-010-9054-4

McBain, B., Yardy, A., Martin, F., Phelan, L., van Altena, I., McKeowen, J., Pembertond, C., Tosec, H., Fratuse, L., & Bowyer, M. (2020). Teaching science students how to think. International Journal of Innovation in Science and Mathematics Education, 28 (2), 28–35. https://openjournals.library.sydney.edu.au/CAL/article/view/14809/13480

McIntyre, L. (2021). Talking to science deniers and sceptics is not hopeless. Nature, 596 (7871), 165–165. https://doi.org/10.1038/d41586-021-02152-y

Moore, C. (2019). Teaching science thinking. Using scientific reasoning in the classroom . Routledge.

Moreno-Fontiveros, G., Cebrián-Robles, D., Blanco-López, A., & España-Ramos, E. (2022). Decisiones de estudiantes de 14/15 años en una propuesta didáctica sobre la compra de un coche [Fourteen/fifteen-year-old students’ decisions in a teaching proposal on the buying of a car]. Enseñanza de las Ciencias, 40(1), 199–219. https://doi.org/10.5565/rev/ensciencias.3292

National Research Council [NRC]. (2012). A framework for K-12 science education . National Academies Press.

Network, Inter-American Teacher Education [ITEN]. (2015). Critical thinking toolkit. OAS/ITEN.

Normand, M. P. (2008). Science, skepticism, and applied behavior analysis. Behavior Analysis in Practice, 1 (2), 42–49. https://doi.org/10.1007/BF03391727

Norris, S. P., Phillips, L. M., & Korpan, C. A. (2003). University students’ interpretation of media reports of science and its relationship to background knowledge, interest, and reading difficulty. Public Understanding of Science, 12 (2), 123–145. https://doi.org/10.1177/09636625030122001

Oliveras, B., Márquez, C., & Sanmartí, N. (2013). The use of newspaper articles as a tool to develop critical thinking in science classes. International Journal of Science Education, 35 (6), 885–905. https://doi.org/10.1080/09500693.2011.586736

Organisation for Economic Co-operation and Development [OECD]. (2019). PISA 2018. Assessment and Analytical Framework . OECD Publishing. https://doi.org/10.1787/b25efab8-en


Organisation for Economic Co-operation and Development [OECD]. (2020). PISA 2024: Strategic Vision and Direction for Science. https://www.oecd.org/pisa/publications/PISA-2024-Science-Strategic-Vision-Proposal.pdf

Osborne, J., Pimentel, D., Alberts, B., Allchin, D., Barzilai, S., Bergstrom, C., Coffey, J., Donovan, B., Kivinen, K., Kozyreva, A., & Wineburg, S. (2022). Science Education in an Age of Misinformation . Stanford University.

Osborne, J. F., & Patterson, A. (2011). Scientific argument and explanation: A necessary distinction? Science Education, 95 (4), 627–638. https://doi.org/10.1002/sce.20438

Pols, C. F. J., Dekkers, P. J. J. M., & De Vries, M. J. (2021). What do they know? Investigating students’ ability to analyse experimental data in secondary physics education. International Journal of Science Education, 43 (2), 274–297. https://doi.org/10.1080/09500693.2020.1865588

Royal Decree 217/2022, of 29 March, which establishes the organisation and minimum teaching of Compulsory Secondary Education. Spanish Official State Gazette, 76, 41571–41789. https://www.boe.es/eli/es/rd/2022/03/29/217

Sagan, C. (1987). The burden of skepticism. Skeptical Inquirer, 12 (1), 38–46. https://skepticalinquirer.org/1987/10/the-burden-of-skepticism/

Santos, L. F. (2017). The role of critical thinking in science education. Journal of Education and Practice, 8 (20), 160–173. https://eric.ed.gov/?id=ED575667

Schafersman, S. D. (1991). An introduction to critical thinking. https://facultycenter.ischool.syr.edu/wp-content/uploads/2012/02/Critical-Thinking.pdf . Accessed 10 May 2023.

Sinatra, G. M., & Hofer, B. K. (2021). How do emotions and attitudes influence science understanding? In Science denial: why it happens and what to do about it (pp. 142–180). Oxford Academic.

Solbes, J., Torres, N., & Traver, M. (2018). Use of socio-scientific issues in order to improve critical thinking competences. Asia-Pacific Forum on Science Learning & Teaching, 19 (1), 1–22. https://www.eduhk.hk/apfslt/

Spektor-Levy, O., Eylon, B. S., & Scherz, Z. (2009). Teaching scientific communication skills in science studies: Does it make a difference? International Journal of Science and Mathematics Education, 7 (5), 875–903. https://doi.org/10.1007/s10763-009-9150-6

Taylor, P., Lee, S. H., & Tal, T. (2006). Toward socio-scientific participation: changing culture in the science classroom and much more: Setting the stage. Cultural Studies of Science Education, 1 (4), 645–656. https://doi.org/10.1007/s11422-006-9028-7

Tena-Sánchez, J., & León-Medina, F. J. (2022). Y aún más al fondo del “bullshit”: El papel de la falsificación de preferencias en la difusión del oscurantismo en la teoría social y en la sociedad [And even deeper into “bullshit”: The role of preference falsification in the diffusion of obscurantism in social theory and in society]. Scio, 22, 209–233. https://doi.org/10.46583/scio_2022.22.949

Tytler, R., & Peterson, S. (2004). From “try it and see” to strategic exploration: Characterizing young children's scientific reasoning. Journal of Research in Science Teaching, 41 (1), 94–118. https://doi.org/10.1002/tea.10126

Uskola, A., & Puig, B. (2023). Development of systems and futures thinking skills by primary pre-service teachers for addressing epidemics. Research in Science Education , 1–17. https://doi.org/10.1007/s11165-023-10097-7

Vallverdú, J. (2005). ¿Cómo finalizan las controversias? Un nuevo modelo de análisis: la controvertida historia de la sacarina [How does controversies finish? A new model of analysis: the controversial history of saccharin]. Revista Iberoamericana de Ciencia, Tecnología y Sociedad, 2 (5), 19–50. http://www.revistacts.net/wp-content/uploads/2020/01/vol2-nro5-art01.pdf

Vázquez-Alonso, A., & Manassero-Mas, M. A. (2018). Más allá de la comprensión científica: educación científica para desarrollar el pensamiento [Beyond understanding of science: science education for teaching fair thinking]. Revista Electrónica de Enseñanza de las Ciencias, 17 (2), 309–336. http://reec.uvigo.es/volumenes/volumen17/REEC_17_2_02_ex1065.pdf

Willingham, D. T. (2008). Critical thinking: Why is it so hard to teach? Arts Education Policy Review, 109 (4), 21–32. https://doi.org/10.3200/AEPR.109.4.21-32

Yacoubian, H. A. (2020). Teaching nature of science through a critical thinking approach. In W. F. McComas (Ed.), Nature of Science in Science Instruction (pp. 199–212). Springer.

Yacoubian, H. A., & Khishfe, R. (2018). Argumentation, critical thinking, nature of science and socioscientific issues: a dialogue between two researchers. International Journal of Science Education, 40 (7), 796–807. https://doi.org/10.1080/09500693.2018.1449986

Zeidler, D. L., & Nichols, B. H. (2009). Socioscientific issues: Theory and practice. Journal of elementary science education, 21 (2), 49–58. https://doi.org/10.1007/BF03173684

Zimmerman, C., & Klahr, D. (2018). Development of scientific thinking. In J. T. Wixted (Ed.), Stevens’ handbook of experimental psychology and cognitive neuroscience (Vol. 4, pp. 1–25). John Wiley & Sons, Inc.


Conflict of Interest

The author declares no conflict of interest.

Funding for open access publishing: Universidad de Sevilla/CBUA

Author information

Authors and Affiliations

Departamento de Didáctica de las Ciencias Experimentales y Sociales, Universidad de Sevilla, Seville, Spain

Antonio García-Carmona


Corresponding author

Correspondence to Antonio García-Carmona.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

García-Carmona, A. Scientific Thinking and Critical Thinking in Science Education. Science & Education (2023). https://doi.org/10.1007/s11191-023-00460-5


Accepted: 30 July 2023

Published: 05 September 2023

DOI: https://doi.org/10.1007/s11191-023-00460-5


Keywords: Cognitive skills · Critical thinking · Metacognitive skills · Science education · Scientific thinking


What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment .

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources .

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Frequently asked questions about critical thinking

Why is critical thinking important?

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process . The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing , critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking examples

Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context

You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context

You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

How to think critically

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed ?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion ? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?


Frequently asked questions about critical thinking

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test  and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. It refers to the ability to recollect information best when it amplifies what we already believe. Relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved April 3, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/


The Great Books Foundation

Critical Thinking with Literature: It’s Problem-Solving

  • By Sharon Crowley
  • June 29, 2015

Critical thinking tops the list of skills students need for success in the complex 21st century. When it comes to science and math, most people equate critical thinking with problem solving. In those content areas, students apply their understanding of basic concepts to a task for which the solution is not known in advance. By grappling with a challenging problem, students extend their learning. Critical thinking about literature is not so different. With a written work, the problem or task is often an open-ended, text-based question. Students use their comprehension of the text to develop interpretations—or solutions to the problem.

If you want your students to engage in higher-order thinking as they read and discuss literature, include these key elements of problem-solving activities:

Genuine, intriguing questions. To think critically, there must be something to think critically about. With literature, it’s a text that leaves your students puzzling and asking questions about a character, event, symbol, or structure. Predictable or moralistic texts with flat characters don’t generate intriguing questions. When texts are sufficiently complex, the questions that spring from them present engaging problems.

Divergent answers. Just as genuine problems in math or science allow for multiple strategies and solutions, a discussion-worthy question about a piece of literature should invite multiple interpretations or answers. In Shared Inquiry discussions, considering divergent ideas is what drives students to find deeper meaning in a text.

Ample evidence. As in math or science, for an answer or solution to be sound, there must be relevant reasons behind it. Likewise, ideas about the meaning of literary texts must be supported with the evidence from the work itself. Evidence and reasoning make ideas valid and debatable. Without evidence, ideas are simply guesses.

Opportunities to evaluate evidence. Some pieces of scientific or mathematical data are more compelling than others. The same is true when exploring a question about a rich work of literature. Collaborative discussion is a time for participants to share the evidence that supports their ideas, to weigh that evidence, and to strengthen ideas by debating each other’s assertions or suggesting additional evidence.

Collaboration. A good discussion question, or problem, is one that students want to work on together. Just as students benefit from combining their skills and perspectives when solving a math or science problem, discussing an interpretive question as a group yields more thoughtful and considered answers than if students had worked alone. Follow-up questions that ask students to clarify, elaborate, and explain their ideas help deepen and enliven the conversation.


Open Resources for English Language Teaching (ORELT) Portal


Unit 5: Facilitating Critical Thinking through Literature

Introduction

Literature is an effective tool for engaging students in critical thinking. By teaching children to analyse and evaluate literary texts appropriate to their age and interests, we can help them develop critical thinking skills. This involves seeing relationships between events, drawing inferences, analysing events, synthesising evidence and evaluating both the content of a text and the language used to express the ideas contained within it.

Unit outcomes

Upon completion of this unit you will be able to:

Terminology

Teacher support information.

The literature class gives a teacher the opportunity to engage students in discussions about the ideas expressed in literary texts. This exercise benefits students in two ways: firstly, it gives them an opportunity to express their own ideas about life and relationships, values and beliefs, and interests and dislikes; secondly, it forces them to use a more complex set of structures and a more “advanced” range of vocabulary. As a language teacher in a literature class, you can exploit this situation by engaging students in group and pair activities to read sections of texts and then give their opinions about characters in the text, for example, or the style of writing — whether it is interesting, humorous, tragic, and so on. This will let students practise expressing opinions, drawing inferences, explaining cause-and-effect relationships, comparing facts and applying ideas they have gleaned from literature to new situations. In addition, they will learn how to analyse texts based on logical reasoning and to synthesise and evaluate the information in the texts.

Activity 1: Using literature to develop critical thinking: Drawing inferences from a text

Activity 2: Evaluating a literary text

Activity 3: From critical to creative skills: Participating in creative writing workshops

Activity 4: Collaborative creative writing: Creating a big book

Unit summary

Reflections

Resource 1: Inferring information from a literary text: A sample text

Inferential questions:

Why do you think Trudy’s mother was shouting at her?

Does Trudy understand her responsibilities?

Is Trudy a tidy person?

Look up the meaning of the word “curious” in your dictionary. Is Trudy a curious person?

Did Trudy’s grandfather finally get to spend his life with Betty?

Do you think it was normal for girls and boys to meet freely during Trudy’s grandfather’s time?

Resource 2a: Critically reflecting on and responding to literary texts: Asking evaluative questions

Resource 2b: how to write a journal entry (worksheet), teacher question and answer.



COMMENTS

  1. (PDF) Teaching Critical Thinking Skills: Literature Review

    Critical Thinking (CT) has been recognized as one of the most important thinking skills and one of the most important indicators of student learning quality. In order to develop successful ...

  2. PDF Teaching Critical Thinking Skills: Literature Review

    purposeful, reasoned and goal directed'. Halpern (1997, p. 4) states, 'Critical thinking is purposeful, reasoned, and goal-directed. It is the kind of thinking involved in solving problems, formulating inferences, calculating likelihoods, and making decisions. Critical thinkers use these skills appropriately, without prompting, and

  3. PDF Critical thinking: A literature review

    the definition of critical thinking. The purposes of this literature review are to (a) explore the. ways in which critical thinking has been defined by researchers, (b) investigate how critical. thinking develops (c) learn how teachers can encourage the development of critical thinking.

  4. Bridging critical thinking and transformative learning: The role of

    Most critical thinking researchers agree that open-mindedness is a component of critical thinking (Ennis, 2018; Facione et al., 1994; Perkins et al., 1993). Arguably, William Hare's account of open-mindedness has been the most influential within the critical thinking literature.

  5. [PDF] Critical Thinking : A Literature

    Critical Thinking : A Literature. Critical thinking includes the component skills of analyzing arguments, making inferences using inductive or deductive reasoning, judging or evaluating, and making decisions or solving problems. Background knowledge is a necessary but not a sufficient condition for enabling critical thought within a given subject.

  6. Critical Thinking Skills in Education: A Systematic Literature Review

    The aim of this study is to analyze the existing literature related to critical thinking in educational curricula through a systematic literature review. This paper analyses literature through ...

  7. Critical thinking in the preschool classroom

    This paper is organised in the following way: Section 1 provides an overview of critical thinking including related skills and definitions that reflect on the importance of critical thinking in early childhood; Section 2, ... Literature on critical thinking spans the disciplines of philosophy, psychology, and education (see e.g., Lia, 2011 ...

  8. Teaching Critical Thinking Skills: Literature Review.

    Critical Thinking (CT) has been recognized as one of the most important thinking skills and one of the most important indicators of student learning quality. In order to develop successful critical thinkers, CT must be incorporated into the curriculum content and teaching approaches and sequenced at all grade levels. This research provides a systematic review of the extant literature on ...

  9. Critical Thinking

    General Overviews. The sources highlighted here include textbooks, literature reviews, and meta-analyses related to critical thinking. These contributions come from both psychological (Halpern 2003; Nisbett 1993; Sternberg, et al. 2007) and philosophical (Ennis 1962, Facione 1990) perspectives.Many of these general overviews are textbooks (Facione 2011b; Halpern 2003; Nisbett 1993; Sternberg ...

  10. Bibliometric analysis of the literature on critical thinking: an

    2. Theoretical framework. Two of the most cited studies in the CT literature define CT as 'the use of those cognitive skills or strategies to increase the probability of a desired outcome' (Halpern, Citation 1998, p. 450) and as 'the ability to engage in purposeful, self-regulatory judgement' (Abrami et al., Citation 2008, p. 1102).The absence of a common definition of CT can be ...

  11. Critical thinking: A literature review

    Request PDF | On Jan 1, 2011, E.R. Lai published Critical thinking: A literature review | Find, read and cite all the research you need on ResearchGate

  12. Critical Thinking: A Literature Review

    Critical Thinking: A Literature Review. This paper focuses on the multifaceted nature of critical thinking, encompassing skills such as analyzing arguments, making inferences, and problem-solving. It emphasizes that background knowledge alone is insufficient for fostering critical thinking, which requires both cognitive skills and specific ...

  13. Developing Critical Thinking: A Review of Past Efforts as a ...

    The purpose of this paper is to establish a theoretically grounded and research-based framework to support the development of inquiry and critical thinking skills in children. As a first step in developing the framework, a review of the research literature on critical thinking and inquiry learning was conducted.

  14. PDF Measuring Student Success Skills: a Review of The Literature on

    The critical thinking literature is rooted in three fields: psychology, philosophy, and education (Lewis & ... definitions, and understandings in the research literature related to critical thinking. Key initial questions include: What is critical thinking? How is critical thinking related to other success skill concepts? And to what

  15. Understanding the Complex Relationship between Critical Thinking and

    In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might ...

  16. Writing and Critical Thinking Through Literature (Ringo and Kashyap

    Heather Ringo & Athena Kashyap. City College of San Francisco via ASCCC Open Educational Resources Initiative. This text offers instruction in analytical, critical, and argumentative writing, critical thinking, research strategies, information literacy, and proper documentation through the study of literary works from major genres, while ...

  17. Critically Reviewing Literature: A Tutorial for New Researchers

    Abstract. Critically reviewing the literature is an indispensable skill which is used throughout a research career. This demystifies the processes involved in systematically and critically reviewing the literature to demonstrate knowledge, identify research ideas and questions, position research and develop theory.

  18. Scientific Thinking and Critical Thinking in Science Education

    In consulting technical reports, theoretical frameworks, research, and curricular reforms related to science education, one commonly finds appeals to scientific thinking and critical thinking as essential educational processes or objectives. This is confirmed in some studies that include exhaustive reviews of the literature in this regard such as those of Bailin (), Costa et al. (), and Santos ...

  19. What Is Critical Thinking?

    Critical thinking is the ability to effectively analyze information and form a judgment. To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources. Critical thinking skills help you to: Identify credible sources. Evaluate and respond to arguments.

  20. Critical Thinking with Literature: It's Problem-Solving

    June 29, 2015. Critical thinking tops the list of skills students need for success in the complex 21st century. When it comes to science and math, most people equate critical thinking with problem solving. In those content areas, students apply their understanding of basic concepts to a task for which the solution is not known in advance.

  21. Developing Critical Thinking through Literature Reading

    Developing Critical Thinking through Literature Reading. ... (1956)—knowledge and comprehension—as they fail to reflect and examine their beliefs and actions. To initiate them into higher ...

  22. Unit 5: Facilitating Critical Thinking through Literature

    Activity 1: Using literature to develop critical thinking: Drawing inferences from a text. Activity 1. The term critical thinking suggests the idea of not readily accepting any given viewpoint. ... The students will have to support their answers by quoting related sections from the text. You could note down three of the best questions, and have ...
