OPINION article

Redefining critical thinking: teaching students to think like scientists.

Rodney M. Schmaltz*

  • Department of Psychology, MacEwan University, Edmonton, AB, Canada

From primary to post-secondary school, critical thinking (CT) is an oft-cited focus or key competency (e.g., DeAngelo et al., 2009 ; California Department of Education, 2014 ; Alberta Education, 2015 ; Australian Curriculum Assessment and Reporting Authority, n.d. ). Unfortunately, the definition of CT has become so broad that it can encompass nearly anything and everything (e.g., Hatcher, 2000 ; Johnson and Hamby, 2015 ). From discussions of Foucault, critique, and the self ( Foucault, 1984 ) to Lawson's (1999) definition of CT as the ability to evaluate claims using psychological science, the term critical thinking has come to refer to an ever-widening range of skills and abilities. We propose that educators need to clearly define CT and that, in addition to teaching CT, a strong focus should be placed on teaching students how to think like scientists. Scientific thinking is the ability to generate, test, and evaluate claims, data, and theories (e.g., Bullock et al., 2009 ; Koerber et al., 2015 ). Simply stated, the basic tenets of scientific thinking provide students with the tools to distinguish good information from bad. Students have access to nearly limitless information, and the skills needed to recognize misinformation and questionable scientific claims are crucially important ( Smith, 2011 ); these skills may not necessarily be included in the general teaching of critical thinking ( Wright, 2001 ).

This is an issue of more than semantics. While some definitions of CT include key elements of the scientific method (e.g., Lawson, 1999 ; Lawson et al., 2015 ), this emphasis is not consistent across all interpretations of CT ( Huber and Kuncel, 2016 ). In an attempt to provide a comprehensive, detailed definition of CT, the American Philosophical Association (APA) outlined six CT skills, 16 subskills, and 19 dispositions ( Facione, 1990 ). Skills include interpretation, analysis, and inference; dispositions include inquisitiveness and open-mindedness. 1 From our perspective, definitions of CT such as those provided by the APA or operationally defined by researchers in the context of a scholarly article (e.g., Forawi, 2016 ) are not problematic—the authors clearly define what they are referring to as CT. Potential problems arise when educators use different definitions of CT, or when the banner of CT is applied to nearly any topic or pedagogical activity. Definitions such as the APA's provide a comprehensive framework for understanding the multi-faceted nature of CT; however, the definition is complex and may be difficult to work with at a policy level, especially for educators who work primarily with younger students.

The need to develop scientific thinking skills is evident in studies showing that 55% of undergraduate students believe that a full moon causes people to behave oddly, and an estimated 67% of students believe creatures such as Bigfoot and Chupacabra exist, despite the lack of scientific evidence supporting these claims ( Lobato et al., 2014 ). Additionally, despite overwhelming evidence supporting the existence of anthropogenic climate change, and the dire need to mitigate its effects, many people remain skeptical of climate change and its impact ( Feygina et al., 2010 ; Lewandowsky et al., 2013 ). One of the goals of education is to help students foster the skills necessary to be informed consumers of information ( DeAngelo et al., 2009 ), and providing students with the tools to think scientifically is a crucial component of reaching this goal. By focusing on scientific thinking in conjunction with CT, educators may be better able to design specific policies that aim to facilitate the necessary skills students should have when they enter post-secondary training or the workforce. In other words, students should leave secondary school with the ability to rule out rival hypotheses, an understanding that correlation does not equal causation, an appreciation of the importance of falsifiability and replicability, the ability to recognize extraordinary claims, and the ability to apply the principle of parsimony (e.g., Lett, 1990 ; Bartz, 2002 ).

Teaching scientific thinking is challenging, as people are vulnerable to trusting their intuitions and subjective observations and tend to prioritize them over objective scientific findings (e.g., Lilienfeld et al., 2012 ). Students and the public at large are prone to naïve realism, or the tendency to believe that our experiences and observations constitute objective reality ( Ross and Ward, 1996 ), when in fact our experiences and observations are subjective and prone to error (e.g., Kahneman, 2011 ). Educators at the post-secondary level tend to prioritize scientific thinking ( Lilienfeld, 2010 ); however, many students do not continue on to a post-secondary program after they have completed high school. Further, students who are told they are learning critical thinking may believe they possess the skills to accurately assess the world around them. However, if they are not taught the specific skills needed to be scientifically literate, they may still fall prey to logical fallacies and biases. People tend to underestimate or not understand the fallacies that can prevent them from making sound decisions ( Lilienfeld et al., 2001 ; Pronin et al., 2004 ; Lilienfeld, 2010 ). Thus, it is reasonable to think that a person who has not been adequately trained in scientific thinking might nonetheless consider themselves a strong critical thinker, and would therefore be even less likely to consider their own personal biases. Another concern is that when teaching scientific thinking there is always the risk that students become overly critical or cynical (e.g., Mercier et al., 2017 ); that is, a student may become skeptical of nearly all findings, regardless of the supporting evidence. By incorporating and focusing on cognitive biases, instructors can help students understand their own biases, and demonstrate how the rigor of the scientific method can, at least partially, control for these biases.

Teaching CT remains controversial and confusing for many instructors ( Bensley and Murtagh, 2012 ). This is partly due to the lack of clarity in the definition of CT and the wide range of methods proposed to best teach CT ( Abrami et al., 2008 ; Bensley and Murtagh, 2012 ). For instance, Bensley and Spero (2014) found evidence for the effectiveness of direct approaches to teaching CT, a claim echoed in earlier research ( Abrami et al., 2008 ; Marin and Halpern, 2011 ). Despite these positive findings, some studies have failed to find gains on measures of CT ( Burke et al., 2014 ) and others have found variable, yet positive, support for instructional methods ( Dochy et al., 2003 ). Unfortunately, there is a lack of research demonstrating the best pedagogical approaches to teaching scientific thinking at different grade levels. More research is needed to provide an empirically grounded approach to teaching scientific thinking, and there is also a need to develop evidence-based measures of scientific thinking that are grade and age appropriate. One approach to teaching scientific thinking may be to frame the topic in its simplest terms—the ability to “detect baloney” ( Sagan, 1995 ).

Sagan (1995) has promoted the tools necessary to recognize poor arguments, fallacies to avoid, and how to approach claims using the scientific method. The basic tenets of Sagan's argument apply to most claims, and have the potential to be an effective teaching tool across a range of abilities and ages. Sagan discusses the idea of a baloney detection kit, which contains the “tools” for skeptical thinking. The development of “baloney detection kits” which include age-appropriate scientific thinking skills may be an effective approach to teaching scientific thinking. These kits could include the style of exercises that are typically found under the banner of CT training (e.g., group discussions, evaluations of arguments) with a focus on teaching scientific thinking. An empirically validated kit does not yet exist, though there is much to draw from in the literature on pedagogical approaches to correcting cognitive biases, combatting pseudoscience, and teaching methodology (e.g., Smith, 2011 ). Further research is needed in this area to ensure that the correct, and age-appropriate, tools are part of any baloney detection kit.

Teaching Sagan's idea of baloney detection in conjunction with CT provides educators with a clear focus—to employ a pedagogical approach that helps students create sound and cogent arguments while avoiding falling prey to “baloney”. This is not to say that all of the information taught under the current banner of “critical thinking” is without value. In fact, many of the topics taught under the current approach of CT are important, even though they would not fit within the framework of some definitions of critical thinking. If educators want to ensure that students have the ability to be accurate consumers of information, a focus should be placed on including scientific thinking as a component of the science curriculum, as well as part of the broader teaching of CT.

Educators need to be provided with evidence-based approaches to teach the principles of scientific thinking. These principles should be taught in conjunction with evidence-based methods that mitigate the potential for fallacious reasoning and false beliefs. At a minimum, when students first learn about science, there should also be an introduction to the basic tenets of scientific thinking. Courses dedicated to promoting scientific thinking may also be effective. A course focused on cognitive biases, logical fallacies, and the hallmarks of scientific thinking, adapted for each grade level, may provide students with the foundation of solid scientific thinking skills needed to produce and evaluate arguments, and allow expansion of scientific thinking into other scholastic areas and classes. Evaluations of the efficacy of these courses would be essential, along with research to determine the best approach to incorporating scientific thinking into the curriculum.

If instructors know that students have at least some familiarity with the fundamental tenets of scientific thinking, the ability to expand and build upon these ideas in a variety of subject-specific areas would further foster and promote these skills. For example, when discussing climate change, an instructor could add a brief discussion of why some people reject the science of climate change, relating this back to the information students will be familiar with from their scientific thinking courses. In terms of an issue like climate change, many students may have heard in political debates or popular culture that global warming trends are not real, or a “hoax” ( Lewandowsky et al., 2013 ). In this case, teaching only the data and facts may not be sufficient to change a student's mind about the reality of climate change ( Lewandowsky et al., 2012 ). Instructors would have more success by presenting students with the data on global warming trends as well as information on the biases that could lead some people to reject the data ( Kowalski and Taylor, 2009 ; Lewandowsky et al., 2012 ). This type of instruction helps educators create informed citizens who are better able to guide future decision making, and ensures that students enter the job market with the skills needed to be valuable members of the workforce and society as a whole.

By promoting scientific thinking, educators can ensure that students are at least exposed to the basic tenets of what makes a good argument, how to create their own arguments, how to recognize their own biases and those of others, and how to think like a scientist. There is still work to be done, as there is a need to put in place educational programs built on empirical evidence, as well as research investigating specific techniques to promote scientific thinking for children in earlier grade levels and to develop measures that test whether students have acquired the necessary scientific thinking skills. By using an evidence-based approach to implement strategies that promote scientific thinking, and encouraging researchers to further explore the ideal methods for doing so, educators can better serve their students. When students are provided with the core ideas of how to detect baloney, and with examples of how baloney detection relates to the real world (e.g., Schmaltz and Lilienfeld, 2014 ), we are confident that they will be better able to navigate the oceans of information available and choose the right path when deciding if information is valid.

Author Contribution

RS was the lead author of this paper, and EJ and NW contributed equally.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

1. ^ There is some debate about the role of dispositional factors in a person's ability to engage in critical thinking, specifically whether dispositional factors may undermine attempts to learn CT. The general consensus is that while dispositional traits may play a role in the ability to think critically, the general skills needed to be a critical thinker can be taught ( Niu et al., 2013 ; Abrami et al., 2015 ).

Abrami, P. C., Bernard, R. M., Borokhovski, E., Waddington, D. I., Wade, C. A., and Persson, T. (2015). Strategies for teaching students to think critically: a meta-analysis. Rev. Educ. Res. 85, 275–314. doi: 10.3102/0034654314551063


Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev. Educ. Res. 78, 1102–1134. doi: 10.3102/0034654308326084

Alberta Education (2015). Ministerial Order on Student Learning . Available online at: https://education.alberta.ca/policies-and-standards/student-learning/everyone/ministerial-order-on-student-learning-pdf/

Australian Curriculum Assessment and Reporting Authority (n.d.). Available online at: http://www.australiancurriculum.edu.au

Bartz, W. R. (2002). Teaching skepticism via the CRITIC acronym and the skeptical inquirer. Skeptical Inquirer 17, 42–44.


Bensley, D. A., and Murtagh, M. P. (2012). Guidelines for a scientific approach to critical thinking assessment. Teach. Psychol. 39, 5–16. doi: 10.1177/0098628311430642

Bensley, D. A., and Spero, R. A. (2014). Improving critical thinking skills and metacognitive monitoring through direct infusion. Think. Skills Creativ. 12, 55–68. doi: 10.1016/j.tsc.2014.02.001

Bullock, M., Sodian, B., and Koerber, S. (2009). “Doing experiments and understanding science: development of scientific reasoning from childhood to adulthood,” in Human Development from Early Childhood to Early Adulthood: Findings from a 20 Year Longitudinal Study , eds W. Schneider and M. Bullock (New York, NY: Psychology Press), 173–197.

Burke, B. L., Sears, S. R., Kraus, S., and Roberts-Cady, S. (2014). Critical analysis: a comparison of critical thinking changes in psychology and philosophy classes. Teach. Psychol. 41, 28–36. doi: 10.1177/0098628313514175

California Department of Education (2014). Standard for Career Ready Practice . Available online at: http://www.cde.ca.gov/nr/ne/yr14/yr14rel22.asp

DeAngelo, L., Hurtado, S., Pryor, J. H., Kelly, K. R., Santos, J. L., and Korn, W. S. (2009). The American College Teacher: National Norms for the 2007-2008 HERI Faculty Survey . Los Angeles, CA: Higher Education Research Institute.

Dochy, F., Segers, M., Van den Bossche, P., and Gijbels, D. (2003). Effects of problem-based learning: a meta-analysis. Learn. Instruct. 13, 533–568. doi: 10.1016/S0959-4752(02)00025-7

Facione, P. A. (1990). Critical thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations. Newark, DE: American Philosophical Association.

Feygina, I., Jost, J. T., and Goldsmith, R. E. (2010). System justification, the denial of global warming, and the possibility of ‘system-sanctioned change’. Pers. Soc. Psychol. Bull. 36, 326–338. doi: 10.1177/0146167209351435


Forawi, S. A. (2016). Standard-based science education and critical thinking. Think. Skills Creativ. 20, 52–62. doi: 10.1016/j.tsc.2016.02.005

Foucault, M. (1984). The Foucault Reader . New York, NY: Pantheon.

Hatcher, D. L. (2000). Arguments for another definition of critical thinking. Inquiry 20, 3–8. doi: 10.5840/inquiryctnews20002016

Huber, C. R., and Kuncel, N. R. (2016). Does college teach critical thinking? A meta-analysis. Rev. Educ. Res. 86, 431–468. doi: 10.3102/0034654315605917

Johnson, R. H., and Hamby, B. (2015). A meta-level approach to the problem of defining “Critical Thinking”. Argumentation 29, 417–430. doi: 10.1007/s10503-015-9356-4

Kahneman, D. (2011). Thinking, Fast and Slow . New York, NY: Farrar, Straus and Giroux.

Koerber, S., Mayer, D., Osterhaus, C., Schwippert, K., and Sodian, B. (2015). The development of scientific thinking in elementary school: a comprehensive inventory. Child Dev. 86, 327–336. doi: 10.1111/cdev.12298

Kowalski, P., and Taylor, A. K. (2009). The effect of refuting misconceptions in the introductory psychology class. Teach. Psychol. 36, 153–159. doi: 10.1080/00986280902959986

Lawson, T. J. (1999). Assessing psychological critical thinking as a learning outcome for psychology majors. Teach. Psychol. 26, 207–209. doi: 10.1207/S15328023TOP260311


Lawson, T. J., Jordan-Fleming, M. K., and Bodle, J. H. (2015). Measuring psychological critical thinking: an update. Teach. Psychol. 42, 248–253. doi: 10.1177/0098628315587624

Lett, J. (1990). A field guide to critical thinking. Skeptical Inquirer , 14, 153–160.

Lewandowsky, S., Ecker, U. H., Seifert, C. M., Schwarz, N., and Cook, J. (2012). Misinformation and its correction: continued influence and successful debiasing. Psychol. Sci. Public Interest 13, 106–131. doi: 10.1177/1529100612451018

Lewandowsky, S., Oberauer, K., and Gignac, G. E. (2013). NASA faked the moon landing—therefore, (climate) science is a hoax: an anatomy of the motivated rejection of science. Psychol. Sci. 24, 622–633. doi: 10.1177/0956797612457686

Lilienfeld, S. O. (2010). Can psychology become a science? Pers. Individ. Dif. 49, 281–288. doi: 10.1016/j.paid.2010.01.024

Lilienfeld, S. O., Ammirati, R., and David, M. (2012). Distinguishing science from pseudoscience in school psychology: science and scientific thinking as safeguards against human error. J. Sch. Psychol. 50, 7–36. doi: 10.1016/j.jsp.2011.09.006

Lilienfeld, S. O., Lohr, J. M., and Morier, D. (2001). The teaching of courses in the science and pseudoscience of psychology: useful resources. Teach. Psychol. 28, 182–191. doi: 10.1207/S15328023TOP2803_03

Lobato, E., Mendoza, J., Sims, V., and Chin, M. (2014). Examining the relationship between conspiracy theories, paranormal beliefs, and pseudoscience acceptance among a university population. Appl. Cogn. Psychol. 28, 617–625. doi: 10.1002/acp.3042

Marin, L. M., and Halpern, D. F. (2011). Pedagogy for developing critical thinking in adolescents: explicit instruction produces greatest gains. Think. Skills Creativ. 6, 1–13. doi: 10.1016/j.tsc.2010.08.002

Mercier, H., Boudry, M., Paglieri, F., and Trouche, E. (2017). Natural-born arguers: teaching how to make the best of our reasoning abilities. Educ. Psychol. 52, 1–16. doi: 10.1080/00461520.2016.1207537

Niu, L., Behar-Horenstein, L. S., and Garvan, C. W. (2013). Do instructional interventions influence college students' critical thinking skills? A meta-analysis. Educ. Res. Rev. 9, 114–128. doi: 10.1016/j.edurev.2012.12.002

Pronin, E., Gilovich, T., and Ross, L. (2004). Objectivity in the eye of the beholder: divergent perceptions of bias in self versus others. Psychol. Rev. 111, 781–799. doi: 10.1037/0033-295X.111.3.781

Ross, L., and Ward, A. (1996). “Naive realism in everyday life: implications for social conflict and misunderstanding,” in Values and Knowledge , eds E. S. Reed, E. Turiel, and T. Brown (Hillsdale, NJ: Lawrence Erlbaum Associates Inc.), 103–135.

Sagan, C. (1995). Demon-Haunted World: Science as a Candle in the Dark . New York, NY: Random House.

Schmaltz, R., and Lilienfeld, S. O. (2014). Hauntings, homeopathy, and the Hopkinsville Goblins: using pseudoscience to teach scientific thinking. Front. Psychol. 5:336. doi: 10.3389/fpsyg.2014.00336

Smith, J. C. (2011). Pseudoscience and Extraordinary Claims of the Paranormal: A Critical Thinker's Toolkit . New York, NY: John Wiley and Sons.

Wright, I. (2001). Critical thinking in the schools: why doesn't much happen? Inform. Logic 22, 137–154. doi: 10.22329/il.v22i2.2579

Keywords: scientific thinking, critical thinking, teaching resources, skepticism, education policy

Citation: Schmaltz RM, Jansen E and Wenckowski N (2017) Redefining Critical Thinking: Teaching Students to Think like Scientists. Front. Psychol. 8:459. doi: 10.3389/fpsyg.2017.00459

Received: 13 December 2016; Accepted: 13 March 2017; Published: 29 March 2017.


Copyright © 2017 Schmaltz, Jansen and Wenckowski. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Rodney M. Schmaltz, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.


  • Review Article
  • Open access
  • Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

  • Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169)
  • Wei Wang
  • Qingxia Wang

Humanities and Social Sciences Communications, volume 10, article number 16 (2023)


Subjects: Science, technology and society

Collaborative problem-solving has been widely embraced in the classroom instruction of critical thinking, which is regarded as the core of curriculum reform based on key competencies in the field of education as well as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This research presents the major findings of a meta-analysis of 36 studies published in worldwide educational periodicals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving can result in a rise or decrease in critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach for fostering students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving can significantly enhance students’ attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas it falls short in improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have an impact on critical thinking and can be viewed as important moderating factors that affect how critical thinking develops. On the basis of these results, recommendations are made for further study and instruction to better support students’ critical thinking in the context of collaborative problem-solving.


Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. 
Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020 ). Duch et al. ( 2001 ) noted that problem-based learning in group collaboration is progressive active learning, which can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning, which takes learners as the center of the learning process and uses problems with poor structure in real-world situations as the starting point for the learning process (Liang et al., 2017 ). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social cooperation methods, such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004 ; Liang et al., 2017 ).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted to conduct systematic reviews and meta-analyses of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Examining how to implement critical thinking instruction within collaborative problem-solving is therefore central to developing and enhancing critical thinking; because this issue remains largely unexplored, many teachers are unable to instruct critical thinking effectively (Leng and Lu, 2020 ; Niu et al., 2013 ). For example, Huber and Kuncel ( 2016 ) reported meta-analysis findings from 71 publications on gains in critical thinking over various time frames in college, with the aim of determining whether critical thinking was truly teachable. These authors found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategies, intervention duration, subject area, and teaching type. That study, however, did not determine the usefulness of collaborative problem-solving in fostering students’ critical thinking, nor did it reveal whether significant variations existed among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. ( 2020 ) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in a follow-up study to improve students’ critical thinking.
Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. ( 2008 ) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. These authors’ research revealed that through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation but could not significantly improve students’ critical thinking when compared to traditional classroom teaching, according to research by Naber and Wyatt ( 2014 ) and Sendag and Odabasi ( 2009 ) on undergraduate and high school students, respectively.

The above studies show that the evidence on the effectiveness of collaborative problem-solving in promoting students’ critical thinking is inconsistent. It is therefore essential to conduct a thorough and trustworthy review to determine whether, and to what degree, collaborative problem-solving increases or decreases critical thinking. Meta-analysis is a quantitative approach for synthesizing data from separate studies that address the same research topic. It characterizes the overall effect by averaging the effect sizes of numerous quantitative studies, reducing the uncertainty of individual studies and producing more conclusive findings (Lipsey and Wilson, 2001 ).
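The averaging step described above can be sketched minimally as inverse-variance weighting: each study's effect size is weighted by the inverse of its variance, so larger, more precise studies contribute more to the pooled estimate. The numbers below are hypothetical illustrations only; the paper itself performs this computation with RevMan.

```python
def pooled_effect(effects, variances):
    """Fixed-effect pooled estimate: weight each study by 1/variance."""
    weights = [1.0 / v for v in variances]
    weighted_sum = sum(w * y for w, y in zip(weights, effects))
    return weighted_sum / sum(weights)

# Three hypothetical studies; the middle one is the most precise,
# so the pooled estimate is pulled toward its effect size of 0.9.
print(round(pooled_effect([0.5, 0.9, 0.7], [0.04, 0.02, 0.08]), 4))
```

This fixed-effect form is the simplest case; when studies are heterogeneous, the weights are adjusted to include between-study variance, as discussed later in the paper.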

To contribute to both research and practice, this paper carried out a meta-analysis examining the effectiveness of collaborative problem-solving in promoting students’ critical thinking. The following research questions were addressed:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the effects in the included studies’ experimental designs are heterogeneous, how do the various moderating variables account for the disparities between study conclusions?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility checking, merging, duplicate removal, and analysis of included studies) of the meta-analysis approach proposed by Cooper ( 2010 ) for examining quantitative data from separate studies focused on the same research topic. Relevant empirical research published in international educational journals in the 21st century was analyzed using RevMan 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1 , which reports the number of articles included and excluded at each step of the selection process according to the eligibility criteria.

figure 1

This flowchart shows the number of records identified, included and excluded in the article.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journals, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible sources of scholarly, peer-reviewed literature relevant to our topic and offer advanced search tools. The search string with Boolean operators used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with Boolean operators used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found over the same search period of “January 2000 to December 2021”. All duplicates and retractions were eliminated before the references were exported into EndNote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed the full text of each included article in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search of the references and citations of the included articles was performed to ensure complete coverage. Ultimately, 36 articles were kept.

Two researchers carried out this entire process together, reaching a consensus rate of approximately 94.7% after discussion and negotiation to resolve any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text had to be obtainable. Articles in other languages and articles not published between 2000 and 2021 were excluded.

The research design of the included studies had to be empirical and quantitative, capable of assessing the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies had to feature a randomized controlled experiment, a quasi-experiment, or a natural experiment. These designs have a higher degree of internal validity and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies had to report definite indicators for gauging critical thinking’s impact (e.g., sample size, mean, or standard deviation). Articles that lacked specific measurement indicators for critical thinking, so that the effect size could not be calculated, were excluded.

Data coding design

To perform a meta-analysis, it is necessary to collect the most important information from the articles, codify its properties, and convert descriptive data into quantitative data. This study therefore designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The designed data-coding template consisted of three types of information. The descriptive information comprised basic details about the papers: the publishing year, author, serial number, and title.

The variable information for the experimental design covered three kinds of variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variables (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Given the topic of this study, the intervention strategy, as the independent variable, was coded into collaborative and non-collaborative problem-solving. The dependent variable, critical thinking, was coded as cognitive skills and attitudinal tendency. Seven moderating variables were created by grouping and combining the experimental design variables found in the 36 studies (see Table 1 ): learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported, technique-supported, and resource-supported; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapted measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.

The data information contained three metrics for measuring critical thinking: sample size, mean value, and standard deviation. It is important to note that studies with different experimental designs frequently adopt different formulas to determine the effect size; this paper used the standardized mean difference (SMD) formula proposed by Morris ( 2008 , p. 369; see Supplementary Table S3 ).
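Morris's pretest-posttest-control SMD takes roughly the following form: the difference of the two groups' gain scores, divided by the pooled pretest standard deviation, with a small-sample bias correction. The sketch below uses hypothetical numbers; readers should consult Morris (2008) and Supplementary Table S3 for the exact formula the paper applied.

```python
import math

def morris_smd(m_t_pre, m_t_post, sd_t_pre, n_t,
               m_c_pre, m_c_post, sd_c_pre, n_c):
    """Pretest-posttest-control SMD in the spirit of Morris (2008):
    difference of gain scores over the pooled pretest SD, with a
    small-sample bias correction."""
    df = n_t + n_c - 2
    sd_pooled = math.sqrt(((n_t - 1) * sd_t_pre ** 2 +
                           (n_c - 1) * sd_c_pre ** 2) / df)
    bias_correction = 1 - 3 / (4 * df - 1)
    gain_diff = (m_t_post - m_t_pre) - (m_c_post - m_c_pre)
    return bias_correction * gain_diff / sd_pooled

# Hypothetical study: treatment gains 8 points, control gains 2,
# both groups have a pretest SD of 10 and 30 participants each.
print(round(morris_smd(50, 58, 10, 30, 51, 53, 10, 30), 3))
```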

Procedure for extracting and coding data

According to the data coding template (see Table 1 ), two researchers retrieved the information from the 36 papers and entered it into Excel (see Supplementary Table S1 ). In the data extraction procedure, the results of each study were extracted separately if an article contained numerous studies on critical thinking or if a study assessed different critical thinking dimensions. For instance, Tiwari et al. ( 2010 ) examined critical thinking outcomes at four time points, which were treated as separate studies, and Chen ( 2013 ) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency test coefficient was roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent, dependent, and moderating variables), and data information (e.g., mean values, standard deviations, and sample sizes). Following that, publication bias and heterogeneity were tested on the sample data using the RevMan 5.4 software, and the test results were then used to conduct the meta-analysis.
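The inter-coder consistency check mentioned above and in the methods overview relies on Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch with hypothetical coding labels (not the paper's data):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Inter-coder agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(codes_a)
    p_observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a = Counter(codes_a)
    freq_b = Counter(codes_b)
    # Chance agreement: product of each coder's marginal label frequencies.
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Two hypothetical coders labelling ten studies as include/exclude;
# they disagree on one study.
a = ["inc", "inc", "exc", "inc", "exc", "inc", "inc", "exc", "inc", "inc"]
b = ["inc", "inc", "exc", "inc", "exc", "inc", "exc", "exc", "inc", "inc"]
print(round(cohens_kappa(a, b), 3))
```

Note that kappa is typically lower than the raw percent agreement (here 90%), because chance agreement is subtracted out.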

Publication bias test

Publication bias is exhibited when the sample of studies included in a meta-analysis does not accurately reflect the general state of research on the relevant subject; it may impair the reliability and accuracy of the meta-analysis, so the sample data must be checked for publication bias (Stewart et al., 2006 ). A popular check is the funnel plot: publication bias is unlikely when the data points are evenly dispersed on either side of the average effect size and concentrated within the upper region. The data in this analysis are evenly dispersed within the upper portion of the funnel (see Fig. 2 ), indicating that publication bias is unlikely in this situation.

figure 2

This funnel plot shows the result of publication bias of 79 effect quantities across 36 studies.
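The funnel-plot check is visual. A common numerical complement, shown here only as an illustration and not used in the paper, is Egger's regression: regress each study's standardized effect (effect / SE) on its precision (1 / SE). An intercept far from zero signals funnel asymmetry, i.e., possible publication bias.

```python
def egger_intercept(effects, std_errors):
    """Ordinary least-squares intercept of (effect/SE) on (1/SE)."""
    y = [e / s for e, s in zip(effects, std_errors)]   # standardized effects
    x = [1.0 / s for s in std_errors]                  # precision
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
             / sum((xi - mean_x) ** 2 for xi in x))
    return mean_y - slope * mean_x

# Perfectly symmetric hypothetical data: every study estimates 0.82,
# so the intercept is numerically zero (no asymmetry).
ses = [0.05, 0.1, 0.2, 0.3, 0.4]
print(abs(egger_intercept([0.82] * 5, ses)) < 1e-8)
```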

Heterogeneity test

The results of a heterogeneity test on the effect sizes are used to select the appropriate effect model for a meta-analysis. It is common practice to gauge the degree of heterogeneity using the I² value: I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for the adoption of a random-effects model; otherwise, a fixed-effects model ought to be applied (Lipsey and Wilson, 2001 ). The heterogeneity test in this paper (see Table 2 ) yielded an I² of 86%, displaying significant heterogeneity ( P  < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random-effects model.
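The statistics behind this model choice can be sketched as follows: Cochran's Q measures the weighted spread of effect sizes around the fixed-effect mean, and I² expresses the share of that spread beyond what sampling error alone would produce. The values below are hypothetical; the paper computes these with RevMan.

```python
def q_and_i_squared(effects, variances):
    """Return Cochran's Q and the I^2 heterogeneity index (in percent)."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, effects))
    df = len(effects) - 1
    i_squared = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i_squared

# Three hypothetical studies with equal variances but spread-out effects;
# an I^2 at or above 50% would call for a random-effects model.
q, i2 = q_and_i_squared([0.2, 0.8, 1.4], [0.04, 0.04, 0.04])
print(round(q, 2), round(i2, 1))
```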

The analysis of the overall effect size

Given this heterogeneity, the meta-analysis utilized a random-effects model to examine the 79 effect quantities from the 36 studies. In accordance with Cohen’s criterion (Cohen, 1992 ), the analysis results shown in the forest plot of the overall effect (see Fig. 3 ) make clear that the overall effect size of collaborative problem-solving is 0.82, which is statistically significant ( z  = 12.78, P  < 0.01, 95% CI [0.69, 0.95]) and indicates that it encourages learners to practice critical thinking.

figure 3

This forest plot shows the analysis result of the overall effect size across 36 studies.
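Random-effects pooling of the kind that produces an overall effect with a z statistic and 95% CI can be sketched with the common DerSimonian-Laird estimator. This estimator is an assumption on our part for illustration (the paper states only that RevMan's random-effects model was used), and the inputs are hypothetical.

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird pooling: estimate between-study variance tau^2
    from Cochran's Q, re-weight each study, and return the pooled effect
    with its z statistic and 95% confidence interval."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled / se, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical heterogeneous studies: the pooled effect sits between them,
# with a confidence interval widened by the between-study variance.
es, z, ci = random_effects_pool([0.2, 0.8, 1.4], [0.04, 0.04, 0.04])
print(round(es, 2), round(z, 2), (round(ci[0], 2), round(ci[1], 2)))
```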

In addition, this study examined the two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to its growth. The findings (see Table 3 ) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P  < 0.01). Although collaborative problem-solving improves both dimensions, the improvement in students’ attitudinal tendency is much more pronounced, with a large comprehensive effect (ES = 1.17, z  = 7.62, P  < 0.01, 95% CI [0.87, 1.47]), whereas the gain in learners’ cognitive skills is more modest, just above average (ES = 0.70, z  = 11.55, P  < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

A two-tailed test of the whole forest plot’s 79 effect quantities revealed significant heterogeneity ( I ² = 86%, z  = 12.78, P  < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors beyond sampling error. Subgroup analysis was therefore used to explore the moderating factors that might produce this considerable heterogeneity, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area included in the 36 experimental designs, in order to identify the key factors that influence critical thinking. The findings (see Table 4 ) indicate that the various moderating factors all have advantageous effects on critical thinking. The subject area (χ² = 13.36, P  < 0.05), group size (χ² = 8.77, P  < 0.05), intervention duration (χ² = 12.18, P  < 0.01), learning scaffold (χ² = 9.03, P  < 0.01), and teaching type (χ² = 7.20, P  < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not differ significantly across subgroups (χ² = 3.15, P  = 0.21 > 0.05, and χ² = 0.08, P  = 0.78 > 0.05), no conclusion can be drawn about whether these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The precise outcomes are as follows:

The various learning stages all influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P  = 0.21 > 0.05). High school ranked first in effect size (ES = 1.36, P  < 0.01), followed by higher education (ES = 0.78, P  < 0.01) and middle school (ES = 0.73, P  < 0.01). These results show that, despite the learning stage’s beneficial influence on cultivating learners’ critical thinking, no conclusion can be drawn about which stage matters most for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P  < 0.05). The effect sizes ranked as follows: mixed courses (ES = 1.34, P  < 0.01), integrated courses (ES = 0.81, P  < 0.01), and independent courses (ES = 0.27, P  < 0.01). These results indicate that mixed courses are the most effective teaching type for cultivating critical thinking through collaborative problem-solving.

The various intervention durations all significantly improved critical thinking, with significant intergroup differences (χ² = 12.18, P  < 0.01). The effect sizes tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P  < 0.01) after more than 12 weeks of training. These findings indicate that intervention duration and the impact on critical thinking are positively correlated, with longer interventions having greater effects.

The different learning scaffolds all influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P  < 0.01). The teacher-supported learning scaffold displayed a high level of impact (ES = 0.92, P  < 0.01), while the resource-supported (ES = 0.69, P  < 0.01) and technique-supported (ES = 0.63, P  < 0.01) learning scaffolds attained medium-to-high levels of impact. These results show that the teacher-supported learning scaffold has the greatest impact on cultivating critical thinking.

The various group sizes all influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P  < 0.05). Critical thinking showed a general declining trend with increasing group size. The overall effect size for 2–3 persons was the largest (ES = 0.99, P  < 0.01), and when the group size exceeded 7 persons, the improvement in critical thinking was at the lower-middle level (ES < 0.5, P  < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as the group grows, the overall impact declines.

The various measuring tools all registered positive influences on critical thinking, without significant intergroup differences (χ² = 0.08, P  = 0.78 > 0.05). The self-adapted measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, achieving a significant level of effect (ES = 0.84, P  < 0.01). These results show that, despite the beneficial influence associated with either kind of measuring tool, no conclusion can be drawn about whether the measuring tool matters for fostering the growth of critical thinking through collaborative problem-solving.

The impact on critical thinking also differed across subject areas, with statistically significant intergroup differences (χ² = 13.36, P  < 0.05). Mathematics had the greatest overall impact, achieving a significant level of effect (ES = 1.68, P  < 0.01), followed by science (ES = 1.25, P  < 0.01) and medical science (ES = 0.87, P  < 0.01), both of which also achieved significant levels of effect. Programming technology was the least effective (ES = 0.39, P  < 0.01), with only a medium-low degree of effect compared with education (ES = 0.72, P  < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P  < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.
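The intergroup χ² values reported for these moderators correspond to a between-subgroup heterogeneity statistic: the weighted spread of subgroup pooled effects around the grand mean, referred to a chi-square distribution. A sketch with hypothetical subgroup summaries (not the paper's data):

```python
def q_between(subgroup_effects, subgroup_variances):
    """Between-subgroup heterogeneity statistic, compared against a
    chi-square distribution with (number of subgroups - 1) degrees
    of freedom."""
    w = [1.0 / v for v in subgroup_variances]
    grand = sum(wi * y for wi, y in zip(w, subgroup_effects)) / sum(w)
    return sum(wi * (y - grand) ** 2 for wi, y in zip(w, subgroup_effects))

# Two hypothetical subgroups whose pooled effects differ by 0.5; a large
# statistic relative to the chi-square cutoff indicates a real moderator.
print(round(q_between([0.5, 1.0], [0.01, 0.01]), 2))
```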

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy in critical thinking teaching has a considerable impact on cultivating learners’ critical thinking as a whole and has a favorable promotional effect on both dimensions of critical thinking. According to certain studies, collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students’ critical thinking (e.g., Liang et al., 2017 ; Liu et al., 2020 ; Cindy, 2004 ). This meta-analysis provides convergent data support for those research views. Thus, the findings not only effectively address the first research question regarding the overall effect of collaborative problem-solving on cultivating critical thinking and its impact on the two dimensions (i.e., attitudinal tendency and cognitive skills), but also strengthen our confidence in cultivating critical thinking through collaborative problem-solving interventions in the context of classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, while the corresponding improvements in cognitive skills are only marginally better. According to certain studies, cognitive skills differ from attitudinal tendency in classroom instruction: the cultivation and development of the former, as a key ability, is a process of gradual accumulation, while the latter, as an attitude, is affected by the context of the teaching situation (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001 ; Wei and Hong, 2022 ). Collaborative problem-solving as a teaching approach is exciting and interesting as well as rewarding and challenging: it takes the learners as the focus, examines ill-structured problems in real situations, and can inspire students to fully realize their potential for problem-solving, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020 ). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency impacts cognitive skills when attempting to solve a problem (Liu et al., 2020 ; Zhang et al., 2022 ), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008 ; Zhang et al., 2022 ). It can be seen that collaborative problem-solving affects critical thinking as a whole as well as its two specific dimensions, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies. To fully develop students’ capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

To further explore the key factors that influence critical thinking, subgroup analysis was used to examine the possible moderating effects that might produce the considerable heterogeneity. The findings show that the moderating factors, namely the teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area included in the 36 experimental designs, could all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for the learning stage and the measuring tool are not significant, so no conclusion can be drawn about whether these two factors are crucial in supporting the cultivation of critical thinking through collaborative problem-solving.

In terms of the learning stage, the various learning stages influenced critical thinking positively without significant intergroup differences, so no conclusion can be drawn about which stage matters most.

Although higher education accounts for 70.89% of all the empirical studies performed by researchers, high school may be the most appropriate learning stage for fostering students’ critical thinking through collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students’ cognitive development and needs to be examined further in follow-up research.

With regard to teaching type, mixed course teaching may be the best teaching method for cultivating students’ critical thinking. Relevant studies have shown that, in the actual teaching process, if students are trained in thinking methods alone, the methods they learn are isolated and divorced from subject knowledge, which is not conducive to transfer; conversely, if students’ thinking is trained only in subject teaching without systematic method training, it is challenging to apply it to real-world circumstances (Ruggiero, 2012 ; Hu and Liu, 2015 ). Teaching critical thinking as a mixed course, in parallel with other subject teaching, can achieve the best effect on learners’ critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014 ).

In terms of the intervention duration, the overall effect size trends upward with longer intervention times, so intervention duration and the impact on critical thinking are positively correlated. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention; instead, it develops over a lengthy period through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001 ; Hu and Liu, 2015 ). Future empirical studies ought therefore to plan for longer periods of critical thinking instruction.

With regard to group size, a group of 2–3 persons has the highest effect size, and the comprehensive effect size generally decreases as group size increases. This outcome is in line with some research findings; for example, a group of two to four members is most appropriate for collaborative learning (Schellens and Valcke, 2006 ). However, the meta-analysis results also indicate that small groups do not always outperform large ones: learning scaffolds of technique support, resource support, and teacher support can improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members may increase the diversity of views, which is also helpful for cultivating critical thinking through collaborative problem-solving.

With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with some research findings. For example, a successful strategy is to encourage learners to collaborate, come up with solutions, and develop critical thinking skills by using learning scaffolds (Reiser, 2004 ; Xu et al., 2022 ). Learning scaffolds can lower task complexity and unpleasant feelings while also enticing students to engage in learning activities (Wood et al., 2006 ), and they are designed to assist students in using learning approaches more successfully to adapt to the collaborative problem-solving process. The teacher-supported learning scaffolds have the greatest influence on critical thinking in this process because they are more targeted, informative, and timely (Xu et al., 2022 ).

With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as trustworthy and effective by experts worldwide, only 54.43% of the research included in this meta-analysis adopted them for assessment, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking with standardized tools. As Simpson and Courtney ( 2002 , p. 91) note, “The measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking.” As a result, to gauge more fully and precisely how learners’ critical thinking has evolved, standardized measuring tools must be properly modified for collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size of scientific subjects (e.g., mathematics, science, medical science) is larger than that of language arts and the social sciences. Some recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can prove the rationality of their judgments according to accurate evidence and reasonable standards when they face challenges or ill-structured problems (Kyndt et al., 2013 ), which makes critical thinking crucial for developing scientific understanding and applying it to practical problems related to science, technology, and society (Yore et al., 2007 ).

Suggestions for critical thinking teaching

Other than those stated in the discussion above, the following suggestions are offered for critical thinking instruction utilizing the approach of collaborative problem-solving.

First, teachers should place special emphasis on the two core elements, collaboration and problem-solving, and design real problems based on collaborative situations. This meta-analysis provides evidence that collaborative problem-solving has a strong synergistic effect on promoting students’ critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions on real problems during class instruction are key ways to teach critical thinking, rather than simply reading speculative articles without practice (Mulnix, 2012 ). Furthermore, the improvement of students’ critical thinking is realized through cognitive conflict with other learners in the problem situation (Yang et al., 2008 ). Teachers should therefore design real problems and encourage students to discuss, negotiate, and argue in collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners’ critical thinking through collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011 ; Leng and Lu, 2020 ), with the goal of cultivating critical thinking that transfers flexibly to real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on the cultivation and promotion of learners’ critical thinking. Therefore, teachers should combine real collaborative problem-solving situations with the knowledge content of specific disciplines in conventional teaching, teach methods and strategies of critical thinking through ill-structured problems, and provide practical activities in which students interact with one another to develop knowledge construction and critical thinking.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should be conscious of the ways in which teacher-supported learning scaffolds can promote it. Teacher-supported learning scaffolds had the greatest impact on learners’ critical thinking, being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be taught effectively when teachers recognize its significance for students’ growth and use appropriate approaches when designing instructional activities (Forawi, 2016). To enable teachers to create learning scaffolds that cultivate learners’ critical thinking through collaborative problem-solving, it is therefore essential to concentrate on teacher-supported learning scaffolds and to strengthen instruction in teaching critical thinking for teachers, especially preservice teachers.

Implications and limitations

This meta-analysis has certain limitations that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, leaving an incomplete set of articles for review. Second, some data were missing from the included studies, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, further studies were published while this meta-analysis was being conducted, so its coverage has a time limit. Future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of the effectiveness of collaborative problem-solving in promoting students’ critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be drawn:

Regarding the results obtained, collaborative problem-solving is an effective teaching approach for fostering learners’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving significantly and effectively improves students’ attitudinal tendency (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); it is less effective at improving students’ cognitive skills, where it has only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).
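As a sanity check on these figures, the standard error can be back-derived from the width of the 95% confidence interval (SE = width / (2 × 1.96)) and compared against the reported z statistic; the small gap to z = 12.78 presumably reflects rounding in the reported interval. The helper names below are ours, not the authors’:

```python
def se_from_ci(lo, hi, z_crit=1.96):
    """Back-derive a standard error from a 95% confidence interval."""
    return (hi - lo) / (2 * z_crit)

def z_from_es(es, se):
    """z statistic for a pooled effect size under a normal approximation."""
    return es / se

# Overall effect reported above: ES = 0.82, 95% CI [0.69, 0.95]
se = se_from_ci(0.69, 0.95)
print(round(se, 3))                    # 0.066
print(round(z_from_es(0.82, se), 2))   # 12.36, close to the reported z = 12.78
```

The same arithmetic applies to the two dimension-level effects (e.g., the attitudinal tendency CI [0.87, 1.47] implies SE ≈ 0.153 and z ≈ 7.6, matching the reported 7.62).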

As demonstrated by both the results and the discussion, all seven moderating factors examined across the 36 included studies have varying degrees of beneficial effect on students’ critical thinking. Teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have a positive impact on critical thinking and can be viewed as important moderating factors affecting how it develops. Since learning stage (χ² = 3.15, P = 0.21 > 0.05) and measuring tool (χ² = 0.08, P = 0.78 > 0.05) did not show significant intergroup differences, we cannot conclude that these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.
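These intergroup χ² statistics can be converted to p-values once the degrees of freedom (number of subgroups minus one) are known. The df values below are assumptions inferred from the reported p-values, not stated in the text; a minimal sketch using a power-series expansion of the regularized incomplete gamma function:

```python
import math

def chi2_sf(x, df, terms=200):
    """Upper-tail probability of a chi-square statistic (survival function),
    computed via a power series for the regularized lower incomplete gamma
    function, which converges for the moderate values used here."""
    a, y = df / 2.0, x / 2.0
    if y <= 0:
        return 1.0
    term = total = 1.0 / a
    for n in range(1, terms):
        term *= y / (a + n)
        total += term
    p_lower = total * math.exp(-y + a * math.log(y) - math.lgamma(a))
    return 1.0 - p_lower

# Learning stage: chi2 = 3.15 with (assumed) df = 2, i.e. three subgroups
print(round(chi2_sf(3.15, 2), 2))   # 0.21, matching the reported P = 0.21
# Measuring tool: chi2 = 0.08 with (assumed) df = 1, i.e. two subgroups
print(round(chi2_sf(0.08, 1), 2))   # 0.78, matching the reported P = 0.78
```

That the assumed df values reproduce the two reported non-significant p-values exactly lends some confidence to the reconstruction, but the subgroup counts remain an inference.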

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Hmelo-Silver CE (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 16(3):235–266

Duch BJ, Groh SE, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Publishing, Sterling, VA

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber CR, Kuncel NR (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08


Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang


Corresponding authors

Correspondence to Enwei Xu or Wei Wang .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary tables

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10, 16 (2023). https://doi.org/10.1057/s41599-023-01508-1


Received: 07 August 2022

Accepted: 04 January 2023

Published: 11 January 2023

DOI: https://doi.org/10.1057/s41599-023-01508-1




Critical thinking in nursing clinical practice, education and research: From attitudes to virtue

Affiliations

  • 1 Department of Fundamental Care and Medical-Surgical Nursing, Faculty of Medicine and Health Sciences, School of Nursing, Consolidated Research Group Quantitative Psychology (2017-SGR-269), University of Barcelona, Barcelona, Spain.
  • 2 Department of Fundamental Care and Medical-Surgical Nursing, Faculty of Medicine and Health Sciences, School of Nursing, Consolidated Research Group on Gender, Identity and Diversity (2017-SGR-1091), University of Barcelona, Barcelona, Spain.
  • 3 Department of Fundamental Care and Medical-Surgical Nursing, Faculty of Medicine and Health Sciences, School of Nursing, University of Barcelona, Barcelona, Spain.
  • 4 Multidisciplinary Nursing Research Group, Vall d'Hebron Research Institute (VHIR), Vall d'Hebron Hospital, Barcelona, Spain.
  • PMID: 33029860
  • DOI: 10.1111/nup.12332

Critical thinking is a complex, dynamic process formed by attitudes and strategic skills, with the aim of achieving a specific goal or objective. These attitudes, including critical thinking attitudes, constitute an important part of the idea of good care and of the good professional; it could be said that they become a virtue of the nursing profession. In this context, virtue ethics is an essential theoretical framework for analysing the concept of critical thinking in nursing care and nursing science, because virtue ethics considers how cultivating virtues is necessary to understand and justify decisions and guide actions. Based on a selective analysis of the descriptive and empirical literature addressing conceptual reviews of critical thinking, we analysed this topic in the settings of clinical practice, training and research from a virtue ethics framework. Following the JBI critical appraisal checklist for text and opinion papers, we argue that critical thinking is an essential element for true excellence in care and should be encouraged among professionals. The importance of developing critical thinking skills in education is well substantiated; however, greater efforts are required to implement educational strategies directed at developing critical thinking in students and professionals undergoing training, along with measures that demonstrate their success. Lastly, we show that critical thinking constitutes a fundamental component of the research process and can improve research competencies in nursing. We conclude that future research and actions must go further in the search for new evidence and open new horizons, to ensure a positive effect on clinical practice, patient health, student education and the growth of nursing science.

Keywords: critical thinking; critical thinking attitudes; nurse education; nursing care; nursing research.

© 2020 John Wiley & Sons Ltd.

  • Attitude of Health Personnel*
  • Education, Nursing / methods
  • Nursing Process
  • Nursing Research / methods

Grants and funding

  • PREI-19-007-B/School of Nursing. Faculty of Medicine and Health Sciences. University of Barcelona


To Clarity and Beyond: Situating Higher-Order, Critical, and Critical-Analytic Thinking in the Literature on Learning from Multiple Texts

  • REVIEW ARTICLE
  • Published: 24 March 2023
  • Volume 35, article number 40 (2023)


  • Alexandra List (ORCID: orcid.org/0000-0003-1125-9811)
  • Yuting Sun


For this systematic review, learning from multiple texts served as the specific context for investigating the constructs of higher-order (HOT), critical (CT), and critical-analytic (CAT) thinking. Examining the manifestations of HOT, CT, and CAT within the specific context of learning from multiple texts allowed us to clarify and disentangle these valued modes of thought. We begin by identifying the mental activities underlying the processes and outcomes of learning from multiple texts. We then juxtapose these mental activities with definitions of HOT, CT, and CAT drawn from the literature. Through this juxtaposition, we define HOT as multi-componential, including evaluation; CT as requiring both evaluation and its justification or substantiation; and CAT as considering the extent to which evaluation and justification may be consistently and systematically applied. We further generate a number of insights, described in the final section of this article. These include the frequent manifestations of HOT, CT, and CAT within the context of students learning from multiple texts and the co-occurring demand for these valued modes of thinking. We propose an additional mode of valued thought, which we refer to as devising: learners synthetically and systematically using knowledge and strategies gained within one multiple-text learning situation to produce an original product or solution in another, novel learning situation. We consider such devising to demand HOT, CT, and CAT.



Data availability

Data available upon request.

Although comprehension has been referred to as the result of both bottom-up (or passive) and top-down (i.e., purposeful and active) knowledge activation processes (Kurby et al., 2005 ; Wolfe & Goldman, 2005 ), we refer here to top-down processes or students’ deliberate engagement of prior knowledge, as reflective of HOT (Richter & Maier, 2017 ).

Bloom et al. ( 1956 ), in their original introduction of this taxonomy, did not refer to higher vis-à-vis lower levels.

This study was beyond the scope of our review as it was a dissertation; however, it serves as an illustrative example here.

The multiple text literature can also benefit from considering objectives specified within Marzano and Kendall's (2008) self-system. These include asking students to consider the importance of tasks to them, their efficacy for task completion, emotional responses to tasks or texts, as well as overall motivation. As suggested by a recent review from Anmarkrud et al. (2021), rarely have these self-system components been examined in the literature on learning from multiple texts, with interest constituting the motivational construct most analyzed. Recent work has started to look at the role of epistemic emotions in learning from multiple texts (Danielson et al., 2022; Muis et al., 2015).

This paper was excluded from our review as it did not have a learning outcome.

Kammerer et al. (2013) were unique in asking students to appraise their certainty in a solution to a medical controversy described across multiple texts. We considered students' certainty appraisals to reflect metacognition; however, this was a mental activity not included among the multiple-text outcomes we coded for (it was placed into the Other category), as Kammerer et al. (2013) were unique in including this as an assessment.

*Indicates references included in the review

Adams, N. E. (2015). Bloom’s Taxonomy of cognitive learning objectives. Journal of the Medical Library Association, 103 (3), 152–153. https://doi.org/10.3163/1536-5050.103.3.010


Afflerbach, P., Cho, B.-Y., & Kim, J.-Y. (2011). The assessment of higher order thinking in reading. In G. Schraw & D. R. Robinson (Eds.), Assessment of higher order thinking skills (pp. 185–217). IAP Information Age Publishing.


Afflerbach, P., Cho, B.-Y., & Kim, J.-Y. (2015). Conceptualizing and assessing higher-order thinking in reading. Theory into Practice, 54 (3), 203–212. https://doi.org/10.1080/00405841.2015.1044367

Alexander, P. A. (2014). Thinking critically and analytically about critical-analytic thinking: An introduction. Educational Psychology Review, 26 , 469–476. https://doi.org/10.1007/s10648-014-9283-1

Alexander, P. A., Dinsmore, D. L., Fox, E., Grossnickle, E. M., Loughlin, S. M., Maggioni, L., Parkinson, M. M., & Winters, F. I. (2011). Higher order thinking and knowledge: Domain-general and domain-specific trends and future directions. In G. Schraw & D. R. Robinson (Eds.), Assessment of higher order thinking skills (pp. 47–88). IAP Information Age Publishing.

Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s Taxonomy of Educational Objectives . Longman.

Anmarkrud, Ø., Bråten, I., Florit, E., & Mason, L. (2021). The role of individual differences in sourcing: A systematic review.  Educational Psychology Review , 1–44. https://doi.org/10.1007/s10648-021-09640-7

*Anmarkrud, Ø., Bråten, I., & Strømsø, H. I. (2014). Multiple-documents literacy: Strategic processing, source awareness, and argumentation when reading multiple conflicting documents.  Learning and Individual Differences ,  30 , 64–76. https://doi.org/10.1016/j.lindif.2013.01.007

*Anmarkrud, Ø., McCrudden, M. T., Bråten, I., & Strømsø, H. I. (2013). Task-oriented reading of multiple documents: Online comprehension processes and offline products.  Instructional Science ,  41 (5), 873–894. https://doi.org/10.1007/s11251-013-9263-8

*Barzilai, S., Tal-Savir, D., Abed, F., Mor-Hagani, S., & Zohar, A. R. (2021). Mapping multiple documents: From constructing multiple document models to argumentative writing.  Reading and Writing: An Interdisciplinary Journal . https://doi.org/10.1007/s11145-021-10208-8

*Barzilai, S., Tzadok, E., & Eshet-Alkalai, Y. (2015). Sourcing while reading divergent expert accounts: Pathways from views of knowing to written argumentation.  Instructional Science ,  43 (6), 737–766. https://doi.org/10.1007/s11251-015-9359-4

Barzilai, S., & Zohar, A. (2012). Epistemic thinking in action: Evaluating and integrating online sources. Cognition and Instruction, 30 (1), 39–85. https://doi.org/10.1080/07370008.2011.636495

Barzilai, S., Zohar, A. R., & Mor-Hagani, S. (2018). Promoting integration of multiple texts: A review of instructional approaches and practices. Educational Psychology Review, 30 (3), 973–999. https://doi.org/10.1007/s10648-018-9436-8

Bloom, B. S. (Ed.), Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook 1: Cognitive domain . David McKay.

*Brand‐Gruwel, S., Kammerer, Y., van Meeuwen, L., & van Gog, T. (2017). Source evaluation of domain experts and novices during Web search.  Journal of Computer Assisted Learning ,  33 (3), 234–251. https://doi.org/10.1111/jcal.12162

*Brand-Gruwel, S., Wopereis, I., & Vermetten, Y. (2005). Information problem solving by experts and novices: Analysis of a complex cognitive skill.  Computers in Human Behavior ,  21 (3), 487–508. https://doi.org/10.1016/j.chb.2004.10.005

Brand-Gruwel, S., Wopereis, I., & Walraven, A. (2009). A descriptive model of information problem solving while using internet. Computers & Education, 53 (4), 1207–1217. https://doi.org/10.1016/j.compedu.2009.06.004

Brante, E. W., & Strømsø, H. I. (2018). Sourcing in text comprehension: A review of interventions targeting sourcing skills. Educational Psychology Review, 30 (3), 773–799. https://doi.org/10.1007/s10648-017-9421-7

Bråten, I., Britt, M. A., Strømsø, H. I., & Rouet, J.-F. (2011). The role of epistemic beliefs in the comprehension of multiple expository texts: Toward an integrated model. Educational Psychologist, 46 (1), 48–70. https://doi.org/10.1080/00461520.2011.538647

*Bråten, I., Ferguson, L. E., Anmarkrud, Ø., & Strømsø, H. I. (2013). Prediction of learning and comprehension when adolescents read multiple texts: The roles of word-level processing, strategic approach, and reading motivation.  Reading and Writing: An Interdisciplinary Journal ,  26 (3), 321–348. https://doi.org/10.1007/s11145-012-9371-x

*Bråten, I., Ferguson, L. E., Strømsø, H. I., & Anmarkrud, Ø. (2014). Students working with multiple conflicting documents on a scientific issue: Relations between epistemic cognition while reading and sourcing and argumentation in essays.  British Journal of Educational Psychology ,  84 (1), 58–85. https://doi.org/10.1111/bjep.12005

*Bråten, I., & Strømsø, H. I. (2003). A longitudinal think-aloud study of spontaneous strategic processing during the reading of multiple expository texts.  Reading and Writing ,  16 :195–218. https://doi.org/10.1023/A:1022895207490

Bråten, I., & Strømsø, H. I. (2011). Measuring strategic processing when students read multiple texts. Metacognition and Learning, 6 (2), 111–130. https://doi.org/10.1007/s11409-011-9075-7

*Britt, M. A., & Aglinskas, C. (2002). Improving students’ ability to identify and use source information.  Cognition and Instruction ,  20 (4), 485–522. https://doi.org/10.1207/S1532690XCI2004_2

Britt, M. A., Perfetti, C. A., Sandak, R., & Rouet, J. F. (1999). Content integration and source separation in learning from multiple texts. In S. R. Goldman, A. C. Graesser, & P. van den Broek (Eds.), Narrative comprehension, causality, and coherence: Essays in honor of Tom Trabasso (pp. 209–233). Lawrence Erlbaum Associates.

Brown, N. J., Afflerbach, P. P., & Croninger, R. G. (2014). Assessment of critical-analytic thinking. Educational Psychology Review, 26 (4), 543–560. https://doi.org/10.1007/s10648-014-9280-4

Buehl, M. M., & Alexander, P. A. (2005). Motivation and performance differences in students’ domain-specific epistemological belief profiles. American Educational Research Journal, 42 (4), 697–726. https://doi.org/10.3102/00028312042004697

Butterfuss, R., & Kendeou, P. (2021). KReC-MD: Knowledge revision with multiple documents. Educational Psychology Review, 33 (4), 1475–1497. https://doi.org/10.1007/s10648-021-09603-y

Byrnes, J. P., & Dunbar, K. N. (2014). The nature and development of critical-analytic thinking. Educational Psychology Review, 26 (4), 477–493. https://doi.org/10.1007/s10648-014-9284-0

*Cerdán, R., & Vidal-Abarca, E. (2008). The effects of tasks on integrating information from multiple documents.  Journal of Educational Psychology ,  100 (1), 209–222. https://doi.org/10.1037/0022-0663.100.1.209

*Cho, B.-Y., Han, H., & Kucan, L. L. (2018). An exploratory study of middle-school learners’ historical reading in an Internet environment.  Reading and Writing: An Interdisciplinary Journal ,  31 (7), 1525–1549. https://doi.org/10.1007/s11145-018-9847-4

*Cho, B.-Y., Woodward, L., Li, D., & Barlow, W. (2017). Examining adolescents’ strategic processing during online reading with a question-generating task.  American Educational Research Journal ,  54 (4), 691–724

Cleary, T. J., Callan, G. L., & Zimmerman, B. J. (2012). Assessing self-regulation as a cyclical, context-specific phenomenon: Overview and analysis of SRL microanalytic protocols. Education Research International, 2012 , 428639. https://doi.org/10.1155/2012/428639

*Daher, T. A., & Kiewra, K. A. (2016). An investigation of SOAR study strategies for learning from multiple online resources.  Contemporary Educational Psychology ,  46 , 10–21. https://doi.org/10.1016/j.cedpsych.2015.12.004

Danielson, R. W., Sinatra, G. M., Trevors, G., Muis, K. R., Pekrun, R., & Heddy, B. C. (2022). Can multiple texts prompt causal thinking? The role of epistemic emotions.  The Journal of Experimental Education , 1–15. https://doi.org/10.1080/00220973.2022.2107604

Danvers, E. C. (2016). Criticality’s affective entanglements: Rethinking emotion and critical thinking in higher education. Gender and Education, 28 (2), 282–297. https://doi.org/10.1080/09540253.2015.1115469

Dinsmore, D. L., & Alexander, P. A. (2012). A critical discussion of deep and surface processing: What it means, how it is measured, the role of context, and model specification. Educational Psychology Review, 24 (4), 499–567. https://doi.org/10.1007/s10648-012-9198-7

*Du, H., & List, A. (2020). Researching and writing based on multiple texts. Learning and Instruction, 66 , 101297. https://doi.org/10.1016/j.learninstruc.2019.101297

*Du, H., & List, A. (2021). Evidence use in argument writing based on multiple texts.  Reading Research Quarterly ,  56 (4), 715–735.  https://doi.org/10.1002/rrq.366

Dumas, D., & Dong, Y. (2021). Focusing the relational lens on critical thinking: How can relational reasoning support critical and analytic thinking? In D. Fasko & F. Fair (Eds.), Critical thinking and reasoning: Theory development, instruction, and assessment (pp. 47–63). Brill. https://doi.org/10.1163/9789004444591_004

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity, 12 , 43–52. https://doi.org/10.1016/j.tsc.2013.12.004

Eber, P. A., & Parker, T. S. (2007). Assessing student learning: Applying Bloom's Taxonomy. Human Service Education , 27 (1), 45–53. Retrieved February 20, 2023, from https://go.gale.com/ps/i.do?id=GALE%7CA280993786&sid=googleScholar&v=2.1&it=r&linkaccess=abs&issn=08905428&p=AONE&sw=w&userGroupName=anon%7E9c8f4eb

Elder, L., & Paul, R. W. (2013). Critical thinking: Intellectual standards essential to reasoning well within every domain of thought. Journal of Developmental Education, 36 (3), 34–35. Retrieved February 20, 2023, from  https://files.eric.ed.gov/fulltext/EJ1067273.pdf

Ennis, R. H. (1962). A concept of critical thinking. Harvard Educational Review, 32 (1), 81–111.

Ennis, R. H. (1993). Critical thinking assessment. Theory into Practice, 32 (3), 179–186.

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction . Executive Summary: The Delphi Report. The California Academic Press.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34 (10), 906–911. https://doi.org/10.1037/0003-066X.34.10.906

Frerejean, J., Velthorst, G. J., van Strien, J. L. H., Kirschner, P. A., & Brand-Gruwel, S. (2019). Embedded instruction to learn information problem solving: Effects of a whole task approach. Computers in Human Behavior, 90 , 117–130. https://doi.org/10.1016/j.chb.2018.08.043

*Gerjets, P., Kammerer, Y., & Werner, B. (2011). Measuring spontaneous and instructed evaluation processes during Web search: Integrating concurrent thinking-aloud protocols and eye-tracking data.  Learning and Instruction ,  21 (2), 220–231. https://doi.org/10.1016/j.learninstruc.2010.02.005

Gil, L., Bråten, I., Vidal-Abarca, E., & Strømsø, H. I. (2010a). Summary versus argument tasks when working with multiple documents: Which is better for whom? Contemporary Educational Psychology, 35 (3), 157–173. https://doi.org/10.1016/j.cedpsych.2009.11.002

Gil, L., Bråten, I., Vidal-Abarca, E., & Strømsø, H. I. (2010b). Understanding and integrating multiple science texts: Summary tasks are sometimes better than argument tasks. Reading Psychology, 31 (1), 30–68. https://doi.org/10.1080/02702710902733600

*Goldman, S. R., Braasch, J. L. G., Wiley, J., Graesser, A. C., & Brodowinska, K. (2012). Comprehending and learning from internet sources: Processing patterns of better and poorer learners.  Reading Research Quarterly ,  47 (4), 356–381. https://doi.org/10.1002/RRQ.027

Granello, D. H. (2001). Promoting cognitive complexity in graduate written work: Using Bloom’s taxonomy as a pedagogical tool to improve literature reviews. Counselor Education and Supervision, 40 (4), 292–307. https://doi.org/10.1002/j.1556-6978.2001.tb01261.x

Greene, J. A., Muis, K. R., & Pieschl, S. (2010). The role of epistemic beliefs in students’ self-regulated learning with computer-based learning environments: Conceptual and methodological issues. Educational Psychologist, 45 (4), 245–257. https://doi.org/10.1080/00461520.2010.515932

*Grossnickle Peterson, E., & Alexander, P. A. (2020). Navigating print and digital sources: Students’ selection, use, and integration of multiple sources across mediums.  Journal of Experimental Education ,  88 (1), 27–46. https://doi.org/10.1080/00220973.2018.1496058

*Hagen, Å. M., Braasch, J. L. G., & Bråten, I. (2014). Relationships between spontaneous note-taking, self-reported strategies and comprehension when reading multiple texts in different task conditions.  Journal of Research in Reading ,  37 (S1), S141–S157. https://doi.org/10.1111/j.1467-9817.2012.01536.x

Hofer, B. K., & Bendixen, L. D. (2012). Personal epistemology: Theory, research, and future directions. In K. R. Harris, S. Graham, T. Urdan, C. B. McCormick, G. M. Sinatra, & J. Sweller (Eds.), APA educational psychology handbook, Vol. 1. Theories, constructs, and critical issues (pp. 227–256). American Psychological Association. https://doi.org/10.1037/13273-009

Jiménez-Aleixandre, M. P., & Puig, B. (2012). Argumentation, evidence evaluation and critical thinking. In B. J. Fraser, K. Tobin, & C. J. McRobbie (Eds.), Second international handbook of science education (pp. 1001–1015). Springer. https://doi.org/10.1007/978-1-4020-9041-7_66

Jones, K. O., Harland, J., Reid, J. M. V., & Bartlett, R. (2009). Relationship between examination questions and Bloom's taxonomy. IEEE Frontiers in Education Conference (pp. 1–6). San Antonio, Texas. https://doi.org/10.1109/FIE.2009.5350598

*Kammerer, Y., Bråten, I., Gerjets, P., & Strømsø, H. I. (2013). The role of Internet-specific epistemic beliefs in laypersons’ source evaluations and decisions during Web search on a medical issue.  Computers in Human Behavior ,  29 (3), 1193–1203. https://doi.org/10.1016/j.chb.2012.10.012

*Kammerer, Y., Gottschling, S., & Bråten, I. (2021). The role of internet-specific justification beliefs in source evaluation and corroboration during web search on an unsettled socio-scientific issue.  Journal of Educational Computing Research ,  59 (2), 342–378. https://doi.org/10.1177/0735633120952731

*Kammerer, Y., Kalbfell, E., & Gerjets, P. (2016). Is this information source commercially biased? How contradictions between web pages stimulate the consideration of source information.  Discourse Processes ,  53 (5-6), 430–456. https://doi.org/10.1080/0163853X.2016.1169968

Kiili, C., & Leu, D. J. (2019). Exploring the collaborative synthesis of information during online reading. Computers in Human Behavior, 95 , 146–157. https://doi.org/10.1016/j.chb.2019.01.033

Kintsch, W. (1988). The role of knowledge in discourse comprehension: A construction-integration model. Psychological Review, 95 (2), 163–182. https://doi.org/10.1037/0033-295X.95.2.163

Kintsch, W., & van Dijk, T. A. (1978). Toward a model of text comprehension and production. Psychological Review, 85 (5), 363–394. https://doi.org/10.1037/0033-295X.85.5.363

*Kobayashi, K. (2009a). Comprehension of relations among controversial texts: Effects of external strategy use.  Instructional Science ,  37 (4), 311–324. https://doi.org/10.1007/s11251-007-9041-6

*Kobayashi, K. (2009b). The influence of topic knowledge, external strategy use, and college experience on students’ comprehension of controversial texts.  Learning and Individual Differences ,  19 (1), 130–134. https://doi.org/10.1016/j.lindif.2008.06.001

*Kobayashi, K. (2014). Students’ consideration of source information during the reading of multiple texts and its effect on intertextual conflict resolution.  Instructional Science ,  42 (2), 183–205. https://doi.org/10.1007/s11251-013-9276-3

Krathwohl, D. R., Bloom, B. S., & Masia, B. B. (1964). Taxonomy of educational objectives: The classification of educational goals. Handbook II: The affective domain . David McKay.

Kuhn, D. (2019). Critical thinking as discourse. Human Development, 62 (3), 146–164. https://doi.org/10.1159/000500171

Kurby, C. A., Britt, M. A., & Magliano, J. P. (2005). The role of top-down and bottom-up processes in between-text integration. Reading Psychology, 26 (4–5), 335–362. https://doi.org/10.1080/02702710500285870

Lai, E. R. (2011). Critical thinking: A literature review. Pearson’s Research Reports, 6 (1), 40–41.

Lee, Y. (2022). Examining students’ help-seeking when learning from multiple texts . Pennsylvania State University.

Lewis, A., & Smith, D. (1993). Defining higher order thinking. Theory into Practice, 32 (3), 131–137. https://doi.org/10.1080/00405849309543588

*Linderholm, T., Therriault, D. J., & Kwon, H. (2014). Multiple science text processing: Building comprehension skills for college student readers.  Reading Psychology ,  35 (4), 332–356. https://doi.org/10.1080/02702711.2012.726696

List, A., & Alexander, P. A. (2015). Examining response confidence in multiple text tasks. Metacognition and Learning, 10 , 407–436. https://doi.org/10.1007/s11409-015-9138-2

List, A., & Alexander, P. A. (2017). Analyzing and integrating models of multiple text comprehension. Educational Psychologist, 52 (3), 143–147. https://doi.org/10.1080/00461520.2017.1328309

List, A., & Alexander, P. A. (2018). Corroborating students’ self-reports of source evaluation. Behaviour & Information Technology, 37 (3), 198–216. https://doi.org/10.1080/0144929X.2018.1430849

List, A., & Alexander, P. A. (2019). Toward an integrated framework of multiple text use. Educational Psychologist, 54 (1), 20–39. https://doi.org/10.1080/00461520.2018.1505514

*List, A., Alexander, P. A., & Stephens, L. A. (2017). Trust but verify: Examining the association between students’ sourcing behaviors and ratings of text trustworthiness.  Discourse Processes ,  54 (2), 83–104. https://doi.org/10.1080/0163853X.2016.1174654

*List, A., Campos Oaxaca, G. S., Lee, E., Du, H., & Lee, H. Y. (2021). Examining perceptions, selections, and products in undergraduates’ learning from multiple resources.  British Journal of Educational Psychology ,  91 (4), 1555–1584. https://doi.org/10.1111/bjep.12435

*List, A., & Du, H. (2021). Reasoning beyond history: Examining students’ strategy use when completing a multiple text task addressing a controversial topic in education. Reading and Writing: An Interdisciplinary Journal . https://doi.org/10.1007/s11145-020-10095-5

List, A., Du, H., & Wang, Y. (2019a). Understanding students’ conceptions of task assignments. Contemporary Educational Psychology, 59 , 101801. https://doi.org/10.1016/j.cedpsych.2019.101801

*List, A., Du, H., Wang, Y., & Lee, H. Y. (2019b). Toward a typology of integration: Examining the documents model framework. Contemporary Educational Psychology , 58 , 228–242. https://doi.org/10.1016/j.cedpsych.2019.03.003

*List, A., Grossnickle, E. M., & Alexander, P. A. (2016a). Profiling students’ multiple source use by question type.  Reading Psychology ,  37 (5), 753–797. https://doi.org/10.1080/02702711.2015.1111962

*List, A., Grossnickle, E. M., & Alexander, P. A. (2016b). Undergraduate students’ justifications for source selection in a digital academic context.  Journal of Educational Computing Research ,  54 (1), 22–61. https://doi.org/10.1177/0735633115606659

Marzano, R. J., & Kendall, J. S. (2008). Designing and assessing educational objectives: Applying the new taxonomy . Corwin Press.

Mason, L., Boldrin, A., & Ariasi, N. (2010). Searching the Web to learn about a controversial topic: Are students epistemically active? Instructional Science, 38 , 607–633. https://doi.org/10.1007/s11251-008-9089-y

*Mason, L., Junyent, A. A., & Tornatora, M. C. (2014). Epistemic evaluation and comprehension of web-source information on controversial science-related topics: Effects of a short-term instructional intervention.  Computers & Education ,  76 , 143–157. https://doi.org/10.1016/j.compedu.2014.03.016

*Mason, L., Zaccoletti, S., Scrimin, S., Tornatora, M. C., Florit, E., & Goetz, T. (2020). Reading with the eyes and under the skin: Comprehending conflicting digital texts.  Journal of Computer Assisted Learning ,  36 (1), 89–101. https://doi.org/10.1111/jcal.12399

*Mateos, M., & Solé, I. (2009). Synthesising information from various texts: A study of procedures and products at different educational levels.  European Journal of Psychology of Education ,  24 (4), 435–451. https://doi.org/10.1007/BF03178760

Mayer, R. E. (2002). A taxonomy for computer-based assessment of problem solving. Computers in Human Behavior, 18 (6), 623–632. https://doi.org/10.1016/S0747-5632(02)00020-1

*McCrudden, M. T., Kulikowich, J. M., Lyu, B., & Huynh, L. (2022). Promoting integration and learning from multiple complementary texts. Journal of Educational Psychology. Advance online publication. https://doi.org/10.1037/edu0000746

Miri, B., David, B. C., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37 (4), 353–369. https://doi.org/10.1007/s11165-006-9029-2

Muis, K. R., Chevrier, M., Denton, C. A., & Losenno, K. M. (2021). Epistemic emotions and epistemic cognition predict critical thinking about socio-scientific issues. Frontiers in Education, 6 , Article 669908. https://doi.org/10.3389/feduc.2021.669908

*Muis, K. R., Pekrun, R., Sinatra, G. M., Azevedo, R., Trevors, G., Meier, E., & Heddy, B. C. (2015). The curious case of climate change: Testing a theoretical model of epistemic beliefs, epistemic emotions, and complex learning.  Learning and Instruction ,  39 , 168–183. https://doi.org/10.1016/j.learninstruc.2015.06.003

Murphy, P. K., Rowe, M. L., Ramani, G., & Silverman, R. (2014). Promoting critical-analytic thinking in children and adolescents at home and in school. Educational Psychology Review, 26 (4), 561–578. https://doi.org/10.1007/s10648-014-9281-3

Newmann, F. M. (1991). Promoting higher order thinking in social studies: Overview of a study of 16 high school departments. Theory & Research in Social Education, 19 (4), 324–340. https://doi.org/10.1080/00933104.1991.10505645

Paul, R., & Elder, L. (2006). The miniature guide to critical thinking concepts and tools (4th ed.) . The Foundation for Critical Thinking. Retrieved February 20, 2023, from https://www.criticalthinking.org/files/Concepts_Tools.pdf

Paul, R. W., & Nosich, G. M. (1991). A proposal for the national assessment of higher-order thinking at the community college, college, and university levels . National Center for Education Statistics, Office of Educational Research and Improvement, the United States Department of Education. Retrieved February 20, 2023, from https://files.eric.ed.gov/fulltext/ED340762.pdf

Perfetti, C. A., Rouet, J.-F., & Britt, M. A. (1999). Towards a theory of documents representation. In H. van Oostendorp & S. R. Goldman (Eds.), The Construction of mental representations during reading (pp. 99–122). Lawrence Erlbaum Associates.

Petty, R. E., & Briñol, P. (2015). Emotion and persuasion: Cognitive and meta-cognitive processes impact attitudes. Cognition and Emotion, 29 (1), 1–26. https://doi.org/10.1080/02699931.2014.967183

Rapp, D. N., & Mensink, M. C. (2011). Focusing effects from online and offline reading tasks. In M. T. McCrudden, J. P. Magliano, & G. Schraw (Eds.), Text relevance and learning from text (pp. 141–164). IAP Information Age Publishing.

Resnick, L. B. (1987). Education and learning to think . National Academies Press. https://doi.org/10.17226/1032


Reznitskaya, A., Anderson, R. C., Dong, T., Li, Y., Kim, I.-H., & Kim, S.-Y. (2008). Learning to think well: Application of argument schema theory to literacy instruction. In C. C. Block & S. R. Parris (Eds.), Comprehension instruction: Research-based best practices (pp. 196–213). The Guilford Press.

Richland, L. E., & Simms, N. (2015). Analogy, higher order thinking, and education. Wiley Interdisciplinary Reviews: Cognitive Science, 6 (2), 177–192. https://doi.org/10.1002/wcs.1336

Richter, T., & Maier, J. (2017). Comprehension of multiple documents with conflicting information: A two-step model of validation. Educational Psychologist, 52 (3), 148–166. https://doi.org/10.1080/00461520.2017.1322968

*Rodicio, H. G. (2015). Students’ evaluation strategies in a Web research task: Are they sensitive to relevance and reliability?  Journal of Computing in Higher Education ,  27 (2), 134–157. https://doi.org/10.1007/s12528-015-9098-1

Roeser, S., & Todd, C. (2015). Emotion and value: Introduction. In S. Roeser & C. Todd (Eds.), Emotion and Value (pp. 1–4). Oxford University Press.

Rouet, J.-F. (2006). The skills of document use: From text comprehension to web-based learning . Lawrence Erlbaum Associates.

Rouet, J.-F., & Britt, M. A. (2011). Relevance processes in multiple document comprehension. In M. T. McCrudden, J. P. Magliano, & G. Schraw (Eds.), Relevance instructions and goal-focusing in text learning (pp. 19–52). Information Age.

*Rouet, J.-F., Britt, M. A., Mason, R. A., & Perfetti, C. A. (1996). Using multiple sources of evidence to reason about history.  Journal of Educational Psychology ,  88 (3), 478–493. https://doi.org/10.1037/0022-0663.88.3.478

*Rouet, J.-F., Favart, M., Britt, M. A., & Perfetti, C. A. (1997). Studying and using multiple documents in history: Effects of discipline expertise.  Cognition and Instruction ,  15 (1), 85–106. https://doi.org/10.1207/s1532690xci1501_3

Rudd, R., Baker, M., & Hoover, T. (2000). Undergraduate agriculture student learning styles and critical thinking abilities: Is there a relationship? Journal of Agricultural Education, 41 (3), 2–12.

Sadler, T. D., & Zeidler, D. L. (2004). The morality of socioscientific issues: Construal and resolution of genetic engineering dilemmas. Science Education, 88 (1), 4–27. https://doi.org/10.1002/sce.10101

*Salmerón, L., Gil, L., Bråten, I., & Strømsø, H. (2010). Comprehension effects of signalling relationships between documents in search engines.  Computers in Human Behavior ,  26 (3), 419–426. https://doi.org/10.1016/j.chb.2009.11.013

Samuelstuen, M. S., & Bråten, I. (2007). Examining the validity of self-reports on scales measuring students’ strategic processing. British Journal of Educational Psychology, 77 (2), 351–378. https://doi.org/10.1348/000709906X106147

Schoor, C., Rouet, J. F., Artelt, C., Mahlow, N., Hahnel, C., Kroehne, U., & Goldhammer, F. (2021). Readers’ perceived task demands and their relation to multiple document comprehension strategies and outcome. Learning and Individual Differences, 88 , 102018. https://doi.org/10.1016/j.lindif.2021.102018

Schraw, G., & Robinson, D. R. (2011). Conceptualizing and assessing higher order thinking skills. In G. Schraw & D. R. Robinson (Eds.), Assessment of higher order thinking skills (pp. 47–88). IAP Information Age Publishing.

Scriven, M., & Paul, R. (1987, August). Critical thinking as defined by the National Council for Excellence in Critical Thinking. In 8th Annual International Conference on Critical Thinking and Education Reform, Rohnert Park, CA (pp. 25–30).

Seaman, M. (2011). Bloom’s Taxonomy: Its evolution, revision, and use in the field of education. In D. J. Flinders & P. B. Uhrmacher (Eds.), Curriculum & Teaching Dialogue (pp. 29–45). Information Age Publishing Inc.

Sockett, H. (1971). Bloom’s Taxonomy: A philosophical critique (I). Cambridge Journal of Education, 1 (1), 16–25. https://doi.org/10.1080/0305764710010103

*Solé, I., Miras, M., Castells, N., Espino, S., & Minguela, M. (2013). Integrating information: An analysis of the processes involved and the products generated in a written synthesis task.  Written Communication ,  30 (1), 63–90. https://doi.org/10.1177/0741088312466532

Stromer-Galley, J., & Muhlberger, P. (2009). Agreement and disagreement in group deliberation: Effects on deliberation satisfaction, future engagement, and decision legitimacy. Political Communication, 26 (2), 173–192. https://doi.org/10.1080/10584600902850775

*Strømsø, H. I., Bråten, I., Britt, M. A., & Ferguson, L. E. (2013). Spontaneous sourcing among students reading multiple documents.  Cognition and Instruction ,  31 (2), 176–203. https://doi.org/10.1080/07370008.2013.769994

Tarchi, C., & Mason, L. (2020). Effects of critical thinking on multiple-document comprehension. European Journal of Psychology of Education, 35 (2), 289–313. https://doi.org/10.1007/s10212-019-00426-8

Tsai, C.-C. (2004). Beyond cognitive and metacognitive tools: The use of the Internet as an “epistemological” tool for instruction. British Journal of Educational Technology, 35 (5), 525–536. https://doi.org/10.1111/j.0007-1013.2004.00411.x

*Tsai, M.-J., & Wu, A.-H. (2021). Visual search patterns, information selection strategies, and information anxiety for online information problem solving.  Computers & Education ,  172 , 104236. https://doi.org/10.1016/j.compedu.2021.104236

*van Strien, J. L. H., Kammerer, Y., Brand-Gruwel, S., & Boshuizen, H. P. A. (2016). How attitude strength biases information processing and evaluation on the web.  Computers in Human Behavior ,  60 , 245–252. https://doi.org/10.1016/j.chb.2016.02.057

*Vandermeulen, N., van den Broek, B., van Steendam, E., & Rijlaarsdam, G. (2020). In search of an effective source use pattern for writing argumentative and informative synthesis texts.  Reading and Writing ,  33 (2), 239–266. https://doi.org/10.1007/s11145-019-09958-3

Vijayaratnam, P. (2012). Developing higher order thinking skills and team commitment via group problem solving: A bridge to the real world. Procedia-Social and Behavioral Sciences, 66 , 53–63. https://doi.org/10.1016/j.sbspro.2012.11.247

*Walraven, A., Brand-Gruwel, S., & Boshuizen, H. P. A. (2009). How students evaluate information and sources when searching the World Wide Web for information.  Computers & Education ,  52 (1), 234–246. https://doi.org/10.1016/j.compedu.2008.08.003

*Walraven, A., Brand-Gruwel, S., & Boshuizen, H. P. A. (2010). Fostering transfer of websearchers’ evaluation skills: A field test of two transfer theories.  Computers in Human Behavior ,  26 (4), 716–728. https://doi.org/10.1016/j.chb.2010.01.008

Wang, Y., & List, A. (2019). Calibration in multiple text use. Metacognition and Learning, 14 (2), 131–166. https://doi.org/10.1007/s11409-019-09201-y

Weiss, R. E. (2003). Designing problems to promote higher-order thinking. New Directions for Teaching and Learning, 2003 (95), 25–31. https://doi.org/10.1002/tl.109

Wentzel, K. R. (2014). Commentary: The role of goals and values in critical-analytic thinking. Educational Psychology Review, 26 (4), 579–582. https://doi.org/10.1007/s10648-014-9285-z

*Wiley, J., Goldman, S. R., Graesser, A. C., Sanchez, C. A., Ash, I. K., & Hemmerich, J. A. (2009). Source evaluation, comprehension, and learning in Internet science inquiry tasks.  American Educational Research Journal ,  46 (4), 1060–1106. https://doi.org/10.3102/0002831209333183

Wiley, J., Griffin, T. D., Steffens, B., & Britt, M. A. (2020). Epistemic beliefs about the value of integrating information across multiple documents in history. Learning and Instruction, 65 , 101266. https://doi.org/10.1016/j.learninstruc.2019.101266

*Wiley, J., & Voss, J. F. (1999). Constructing arguments from multiple sources: Tasks that promote understanding and not just memory for text.  Journal of Educational Psychology ,  91 (2), 301–311. https://doi.org/10.1037/0022-0663.91.2.301

Willingham, D. T. (2007). Critical thinking: Why is it so hard to teach? American Educator, 31 (2), 8–19. https://www.aft.org/sites/default/files/media/2014/Crit_Thinking.pdf

Wolf, A. B. (2017). “Tell me how that makes you feel”: Philosophy’s reason/emotion divide and epistemic pushback in philosophy classrooms. Hypatia, 32 (4), 893–910. https://doi.org/10.1111/hypa.12378

*Wolfe, M. B. W., & Goldman, S. R. (2005). Relations between adolescents’ text processing and reasoning.  Cognition and Instruction ,  23 (4), 467–502. https://doi.org/10.1207/s1532690xci2304_2

*Yang, F. (2017). Examining the reasoning of conflicting science information from the information processing perspective—An eye movement analysis.  Journal of Research in Science Teaching ,  54 (10), 1347–1372. https://doi.org/10.1002/tea.21408

Zeidler, D. L., & Lewis, J. (2003). Unifying themes in moral reasoning on socioscientific issues and discourse. In D. L. Zeidler (Ed.), The Role of Moral Reasoning on Socioscientific Issues and Discourse in Science Education (pp. 289–306). Springer. https://doi.org/10.1007/1-4020-4996-X_15

Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning. Journal of Educational Psychology, 81 (3), 329–339. https://doi.org/10.1037/0022-0663.81.3.329

Zimmerman, B. J. (2013). From cognitive modeling to self-regulation: A social cognitive career path. Educational Psychologist, 48 (3), 135–147. https://doi.org/10.1080/00461520.2013.794676

Zohar, A., & Dori, Y. J. (2003). Higher-order thinking and low-achieving students: Are they mutually exclusive? Journal of the Learning Sciences, 12 (2), 145–181. https://doi.org/10.1207/S15327809JLS1202_1

Braasch, J. L., Rouet, J. F., Vibert, N., & Britt, M. A. (2012). Readers’ use of source information in text comprehension. Memory & Cognition, 40 , 450–465. https://doi.org/10.3758/s13421-011-0160-6

Muis, K. R. (2007). The role of epistemic beliefs in self-regulated learning. Educational Psychologist, 42 (3), 173–190. https://doi.org/10.1080/00461520701416306

Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20 (1), 61–84. https://doi.org/10.22329/il.v20i1.2254


Author information

Authors and Affiliations

Department of Educational Psychology, Counseling, and Special Education, The Pennsylvania State University, 227 Cedar Building, University Park, PA, 16820, USA

Alexandra List

Department of Human Development and Quantitative Methodology, University of Maryland, 3242 Benjamin Building, College Park, MD, 20742, USA


Corresponding author

Correspondence to Alexandra List .


Appendix 2. Literature search and screening process

Literature Search

We conducted keyword searches in the PsycINFO and Educational Resources Information Center (ERIC) databases. We restricted the searches to empirical, quantitative studies published in peer-reviewed, English-language journals. To capture examinations of HOT, CT, and CAT within the multiple text literature, we combined two sets of search terms. The first set included keywords or phrases for multiple text research, such as “multiple source*”, “multiple text*”, “multiple document*”, “intertext*”, “conflicting texts”, “complementary texts”, and “information problem solving”. The second set comprised keywords and phrases pertaining to HOT, CT, and CAT. These included broader terms such as “higher order thinking”, “critical thinking”, “critical analytic*”, “reasoning”, and “metacognit*”, as well as more specific cognitive processes relevant to multiple text learning that potentially manifest HOT, CT, or CAT, such as “integrat*”, “synthes*”, “analys*”, “corroborat*”, “validat*”, “evaluat*”, “justif*”, “sourcing”, “argument*”, and “refutation*”. The full list of search terms is as follows:

(multiple source* OR multiple text* OR multiple document* OR intertext* OR conflicting information OR conflicting views OR conflicting texts OR conflicting sources OR complementary text* OR information problem solving) AND (higher order thinking OR critical thinking OR critical analytic OR metacogni* OR analys* OR analytical OR reasoning OR analog* OR refutation* OR integrat* OR synthes* OR corroborat* OR argument* OR validat* OR justif* OR evaluat* OR sourcing)
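The two term sets above combine into a single Boolean query: a record must match at least one multiple-text term AND at least one HOT/CT/CAT term. A minimal Python sketch of this construction (term lists abbreviated for illustration, not the full sets used in the actual search):

```python
# Assemble a Boolean database query from two sets of search terms.
# Within each set, terms are joined with OR; the two sets are joined
# with AND, so matching records must hit at least one term from each.
# Term lists here are abbreviated examples, not the full search sets.
multiple_text_terms = [
    "multiple source*", "multiple text*", "multiple document*", "intertext*",
]
hot_ct_cat_terms = [
    "higher order thinking", "critical thinking", "reasoning",
    "integrat*", "sourcing",
]

def build_query(set_a, set_b):
    clause_a = " OR ".join(set_a)
    clause_b = " OR ".join(set_b)
    return f"({clause_a}) AND ({clause_b})"

query = build_query(multiple_text_terms, hot_ct_cat_terms)
print(query)
```

With the full term lists substituted in, this reproduces the query shown above; database interfaces differ in wildcard and field syntax, so the exact operators may need adjusting per database.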

These initial searches yielded 1814 non-duplicate records. We narrowed this initial pool by applying research-area classification filters to constrain the literature to areas most relevant for educational and academic contexts (e.g., curriculum & programs & teaching methods, academic learning & achievement, cognitive processes, educational psychology, human experimental psychology, learning and memory). The 482 records remaining after this narrowing were then subjected to title and abstract screening.

Title and Abstract Screening

We screened the titles and abstracts against the following inclusion criteria: the study (a) involved a main task that required reading two or more texts and (b) was conducted in an educational or academically relevant context, and the participants (c) were proficient in the language in which the task was conducted and (d) had no psychological, physiological, or neurological disorders or disabilities that could affect their text processing. This screening left us with 161 records.

Additional Searches

From these 161 records, we identified authors with five or more included articles on multiple text learning, as well as journals that had published five or more included articles. We then conducted a manual search for additional studies by going through these authors’ Google Scholar pages and the tables of contents of the selected journals for the last five years. We also identified additional relevant studies from the works referenced in the remaining studies, following a backward snowballing procedure. These additional searches yielded another 30 non-duplicate records, for a total of 191 records assessed in full-text form for eligibility.

Full-Text Assessment

As we read the full texts of these remaining 191 articles, we ensured that the studies met the following eligibility criteria: (a) the multiple-text task was completed by participants independently rather than as a collaborative group activity, (b) the study included at least one process measure that was not a self-report strategy question, and (c) the study included at least one (non-self-report) outcome measure that assessed some form of higher-order cognition potentially reflective of HOT, CT, or CAT. Studies employing self-report questionnaires to capture process data were included only if they also used other forms of process or outcome measures (e.g., think-alouds, eye-tracking). This left us with a final set of 54 records, comprising 57 studies eligible for the systematic review.
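The search and screening stages described in this appendix form a funnel of record counts. A small sketch tallying those counts as reported in the text (stage labels are ours):

```python
# Record counts at each stage of the literature search and screening,
# taken from the appendix text. The full-text pool is the 161 screened
# records plus 30 records from the additional searches; the final set
# comprises 54 records reporting 57 eligible studies.
funnel = [
    ("Initial keyword searches (non-duplicate records)", 1814),
    ("After research-area classification filters", 482),
    ("After title and abstract screening", 161),
    ("Added via author/journal/backward-snowballing searches", 30),
    ("Assessed in full text for eligibility", 161 + 30),
    ("Final records included in the review", 54),
]
for stage, count in funnel:
    print(f"{stage}: {count}")
```

Laying the counts out this way makes it easy to verify that the stage totals are internally consistent (e.g., 161 + 30 = 191 records assessed in full text).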

Appendix 3. Coding for processes and outcomes of multiple text learning


About this article

List, A., Sun, Y. To Clarity and Beyond: Situating Higher-Order, Critical, and Critical-Analytic Thinking in the Literature on Learning from Multiple Texts. Educ Psychol Rev 35 , 40 (2023). https://doi.org/10.1007/s10648-023-09756-y


Accepted : 23 February 2023

Published : 24 March 2023

DOI : https://doi.org/10.1007/s10648-023-09756-y


  • Critical thinking
  • Critical-analytic thinking
  • Higher-order thinking
  • Multiple text comprehension
  • Bloom et al.’s Taxonomy


Open Access

Peer-reviewed

Research Article

Fostering Critical Thinking, Reasoning, and Argumentation Skills through Bioethics Education

* E-mail: [email protected]

Affiliation Northwest Association for Biomedical Research, Seattle, Washington, United States of America

Affiliation Center for Research and Learning, Snohomish, Washington, United States of America

  • Jeanne Ting Chowning, 
  • Joan Carlton Griswold, 
  • Dina N. Kovarik, 
  • Laura J. Collins

PLOS

  • Published: May 11, 2012
  • https://doi.org/10.1371/journal.pone.0036791


Developing a position on a socio-scientific issue and defending it using a well-reasoned justification involves complex cognitive skills that are challenging to both teach and assess. Our work centers on instructional strategies for fostering critical thinking skills in high school students using bioethical case studies, decision-making frameworks, and structured analysis tools to scaffold student argumentation. In this study, we examined the effects of our teacher professional development and curricular materials on the ability of high school students to analyze a bioethical case study and develop a strong position. We focused on student ability to identify an ethical question, consider stakeholders and their values, incorporate relevant scientific facts and content, address ethical principles, and consider the strengths and weaknesses of alternate solutions. A total of 431 students and 12 teachers participated in a research study using teacher cohorts for comparison purposes. The first cohort received professional development and used the curriculum with their students; the second did not receive professional development until after their participation in the study and did not use the curriculum. In order to assess the acquisition of higher-order justification skills, students were asked to analyze a case study and develop a well-reasoned written position. We evaluated statements using a scoring rubric and found highly significant differences (p<0.001) between students exposed to the curriculum strategies and those who were not. Students also showed highly significant gains (p<0.001) in self-reported interest in science content, ability to analyze socio-scientific issues, awareness of ethical issues, ability to listen to and discuss viewpoints different from their own, and understanding of the relationship between science and society. Our results demonstrate that incorporating ethical dilemmas into the classroom is one strategy for increasing student motivation and engagement with science content, while promoting reasoning and justification skills that help prepare an informed citizenry.

Citation: Chowning JT, Griswold JC, Kovarik DN, Collins LJ (2012) Fostering Critical Thinking, Reasoning, and Argumentation Skills through Bioethics Education. PLoS ONE 7(5): e36791. https://doi.org/10.1371/journal.pone.0036791

Editor: Julio Francisco Turrens, University of South Alabama, United States of America

Received: February 7, 2012; Accepted: April 13, 2012; Published: May 11, 2012

Copyright: © 2012 Chowning et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: The “Collaborations to Understand Research and Ethics” (CURE) program was supported by a Science Education Partnership Award grant ( http://ncrrsepa.org ) from the National Center for Research Resources and the Division of Program Coordination, Planning, and Strategic Initiatives of the National Institutes of Health through Grant Number R25OD011138. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

Introduction

While the practice of argumentation is a cornerstone of the scientific process, students at the secondary level have few opportunities to engage in it [1] . Recent research suggests that collaborative discourse and critical dialogue focused on student claims and justifications can increase student reasoning abilities and conceptual understanding, and that strategies are needed to promote such practices in secondary science classrooms [2] . In particular, students need structured opportunities to develop arguments and discuss them with their peers. In scientific argument, the data, claims and warrants (that relate claims to data) are strictly concerned with scientific data; in a socio-scientific argument, students must consider stakeholder perspectives and ethical principles and ideas, in addition to relevant scientific background. Regardless of whether the arguments that students employ point towards scientific or socio-scientific issues, the overall processes students use in order to develop justifications rely on a model that conceptualizes arguments as claims to knowledge [3] .

Prior research in informal student reasoning and socio-scientific issues also indicates that most learners are not able to formulate high-quality arguments (as defined by the ability to articulate justifications for claims and to rebut contrary positions), and highlights the challenges related to promoting argumentation skills. Research suggests that students need experience and practice justifying their claims, recognizing and addressing counter-arguments, and learning about elements that contribute to a strong justification [4] , [5] .

Proponents of Socio-scientific Issues (SSI) education stress that the intellectual development of students in ethical reasoning is necessary to promote understanding of the relationship between science and society [4] , [6] . The SSI approach emphasizes three important principles: (a) because science literacy should be a goal for all students, science education should be broad-based and geared beyond imparting relevant content knowledge to future scientists; (b) science learning should involve students in thinking about the kinds of real-world experiences that they might encounter in their lives; and (c) when teaching about real-world issues, science teachers should aim to include contextual elements that are beyond traditional science content. Sadler and Zeidler, who advocate a SSI perspective, note that “people do not live their lives according to disciplinary boundaries, and students approach socio-scientific issues with diverse perspectives that integrate science and other considerations” [7] .

Standards for science literacy emphasize not only the importance of scientific content and processes, but also the need for students to learn about science that is contextualized in real-world situations that involve personal and community decision-making [7] – [10] . The National Board for Professional Teaching Standards stresses that students need “regular exposure to the human contexts of science [and] examples of ethical dilemmas, both current and past, that surround particular scientific activities, discoveries, and technologies” [11] . Teachers are mandated by national science standards and professional teaching standards to address the social dimensions of science, and are encouraged to provide students with the tools necessary to engage in analyzing bioethical issues; yet they rarely receive training in methods to foster such discussions with students.

The Northwest Association for Biomedical Research (NWABR), a non-profit organization that advances the understanding and support of biomedical research, has been engaging students and teachers in bringing the discussion of ethical issues in science into the classroom since 2000 [12] . The mission of NWABR is to promote an understanding of biomedical research and its ethical conduct through dialogue and education. The sixty research institutions that constitute our members include academia, industry, non-profit research organizations, research hospitals, professional societies, and volunteer health organizations. NWABR connects the scientific and education communities across the Northwestern United States and helps the public understand the vital role of research in promoting better health outcomes. We have focused on providing teachers with both resources to foster student reasoning skills (such as activities in which students practice evaluating arguments using criteria for strong justifications), as well as pedagogical strategies for fostering collaborative discussion [13] – [15] . Our work draws upon socio-scientific elements of functional scientific literacy identified by Zeidler et al. [6] . We include support for teachers in discourse issues, nature of science issues, case-based issues, and cultural issues – which all contribute to cognitive and moral development and promote functional scientific literacy. Our Collaborations to Understand Research and Ethics (CURE) program, funded by a Science Education Partnership Award from the National Institutes of Health (NIH), promotes understanding of translational biomedical research as well as the ethical considerations such research raises.

Many teachers find a principles-based approach most manageable for introducing ethical considerations. The principles include respect for persons (respecting the inherent worth of an individual and his or her autonomy), beneficence/nonmaleficence (maximizing benefits/minimizing harms), and justice (distributing benefits/burdens equitably across a group of individuals). These principles, which are articulated in the Belmont Report [16] in relation to research with human participants (and which are clarified and defended by Beauchamp and Childress [17] ), represent familiar concepts and are widely used. In our professional development workshops and in our support resources, we also introduce teachers to care, feminist, virtue, deontological and consequentialist ethics. Once teachers become familiar with principles, they often augment their teaching by incorporating these additional ethical approaches.

The Bioethics 101 materials that were the focus of our study were developed in conjunction with teachers, ethicists, and scientists. The curriculum contains a series of five classroom lessons and a culminating assessment [18] and is described in more detail in the Program Description below. For many years, teachers have shared with us the dramatic impacts that the teaching of bioethics can have on their students; this research study was designed to investigate the relationship between explicit instruction in bioethical reasoning and resulting student outcomes. In this study, teacher cohorts and student pre/post tests were used to investigate whether CURE professional development and the Bioethics 101 curriculum materials made a significant difference in high school students’ abilities to analyze a case study and justify their positions. Our research strongly indicates that such reasoning approaches can be taught to high school students and can significantly improve their ability to develop well-reasoned justifications to bioethical dilemmas. In addition, student self-reports provide additional evidence of the extent to which bioethics instruction impacted their attitudes and perceptions and increased student motivation and engagement with science content.

Program Description

Our professional development program, Ethics in the Science Classroom, spanned two weeks. The first week, a residential program at the University of Washington (UW) Pack Forest Conference Center, focused on our Bioethics 101 curriculum, which is summarized in Table S1 and is freely available at http://www.nwabr.org . The curriculum, a series of five classroom lessons and a culminating assessment, was implemented by all teachers who were part of our CURE treatment group. The lessons explore the following topics: (a) characteristics of an ethical question; (b) bioethical principles; (c) the relationship between science and ethics and the roles of objectivity/subjectivity and evidence in each; (d) analysis of a case study (including identifying an ethical question, determining relevant facts, identifying stakeholders and their concerns and values, and evaluating options); and (e) development of a well-reasoned justification for a position.

Additionally, the first week focused on effective teaching methods for incorporating ethical issues into science classrooms. We shared specific pedagogical strategies for helping teachers manage classroom discussion, such as asking students to consider the concerns and values of individuals involved in the case while in small single and mixed stakeholder groups. We also provided participants with background knowledge in biomedical research and ethics. Presentations from colleagues affiliated with the NIH Clinical and Translational Science Award program, from the Department of Bioethics and Humanities at the UW, and from NWABR member institutions helped participants develop a broad appreciation for the process of biomedical research and the ethical issues that arise as a consequence of that research. Topics included clinical trials, animal models of disease, regulation of research, and ethical foundations of research. Participants also developed materials directly relevant and applicable to their own classrooms, and shared them with other educators. Teachers wrote case studies and then used ethical frameworks to analyze the main arguments surrounding the case, thereby gaining experience in bioethical analysis. Teachers also developed Action Plans to outline their plans for implementation.

The second week provided teachers with first-hand experiences in NWABR research institutions. Teachers visited research centers such as the Tumor Vaccine Group and Clinical Research Center at the UW. They also had the opportunity to visit several of the following institutions: Amgen, Benaroya Research Institute, Fred Hutchinson Cancer Research Center, Infectious Disease Research Institute, Institute for Stem Cells and Regenerative Medicine at the UW, Pacific Northwest Diabetes Research Institute, Puget Sound Blood Center, HIV Vaccine Trials Network, and Washington National Primate Research Center. Teachers found these experiences in research facilities extremely valuable in helping make concrete the concepts and processes detailed in the first week of the program.

We held two follow-up sessions during the school year to deepen our relationship with the teachers, promote a vibrant ethics in science education community, provide additional resources and support, and reflect on challenges in implementation of our materials. We also provided the opportunity for teachers to share their experiences with one another and to report on the most meaningful longer-term impacts from the program. Another feature of our CURE program was the school-year Institutional Review Board (IRB) and Institutional Animal Care and Use Committee (IACUC) follow-up sessions. Teachers chose to attend one of NWABR’s IRB or IACUC conferences, attend a meeting of a review board, or complete NIH online ethics training. Some teachers also visited the UW Embryonic Stem Cell Research Oversight Committee. CURE funding provided substitutes in order for teachers to be released during the workday. These opportunities further engaged teachers in understanding and appreciating the actual process of oversight for federally funded research.

Participants

Most of the educators who have been through our intensive summer workshops teach secondary level science, but we have welcomed teachers at the college, community college, and even elementary levels. Our participants are primarily biology teachers; however, chemistry and physical science educators, health and career specialists, and social studies teachers have also used our strategies and materials with success.

The research design used teacher cohorts for comparison purposes and recruited teachers who expressed interest in participating in a CURE workshop in either the summer of 2009 or the summer of 2010. We assumed that all teachers who applied to the CURE workshop for either year would be similarly interested in ethics topics. Thus, Cohort 1 included teachers participating in CURE during the summer of 2009 (the treatment group). Their students received CURE instruction during the following 2009–2010 academic year. Cohort 2 (the comparison group) included teachers who were selected to participate in CURE during the summer of 2010. Their students received a semester of traditional classroom instruction in science during the 2009–2010 academic year. In order to track participation of different demographic groups, questions pertaining to race, ethnicity, and gender were also included in the post-tests.

Using an online sample size calculator ( http://www.surveysystem.com/sscalc.htm ) with a 95% confidence level and a confidence interval (margin of error) of 5, we calculated that a sample size of 278 students would be needed for the research study. For that reason, six Cohort 1 teachers were impartially chosen to be in the study. For the comparison group, the study design also required six teachers from Cohort 2. The external evaluator contacted all Cohort 2 teachers to explain the research study and obtain their consent, and successfully recruited six to participate.
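The reported figure of 278 is consistent with the standard sample-size formula for a proportion combined with a finite-population correction. The sketch below is illustrative only: the assumed source population of roughly 1,000 students is our assumption, not a figure stated in the article.

```python
import math

def sample_size(z=1.96, margin=0.05, p=0.5, population=None):
    """Required sample size for estimating a proportion.

    z: z-score for the confidence level (1.96 for 95%)
    margin: margin of error as a fraction (0.05 for a confidence interval of 5)
    p: assumed proportion (0.5 is the most conservative choice)
    population: if given, apply the finite-population correction
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # infinite-population estimate (~384)
    if population is None:
        return math.ceil(n0)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# An assumed source population of ~1,000 students (hypothetical -- the article
# does not state this figure) reproduces the reported requirement of 278:
print(sample_size(population=1000))
```

With no finite-population correction, the same settings give the familiar ~385 respondents; the correction shrinks the requirement as the population approaches the sample.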

Ethics Statement

This study was conducted according to the principles expressed in the Declaration of Helsinki. Prior to the study, research processes and materials were reviewed and approved by the Western Institutional Review Board (WIRB Study #1103180). CURE staff and evaluators received written permission from parents to have their minor children participate in the Bioethics 101 curriculum, for the collection and subsequent analysis of students’ written responses to the assessment, and for permission to collect and analyze student interview responses. Teachers also provided written informed consent prior to study participation. All study participants and/or their legal guardians provided written informed consent for the collection and subsequent analysis of verbal and written responses.

Research Study

Analyzing a case study: CURE and comparison students.

Teacher cohorts and pre/post tests were used to investigate whether CURE professional development and curriculum materials made a significant difference in high school students’ abilities to analyze a case study and justify their positions. Cohort 1 teachers (N = 6) received CURE professional development and used the Bioethics 101 curriculum with their students (N = 323); Cohort 2 teachers (N = 6) did not receive professional development until after their participation in the study and did not use the curriculum with their students (N = 108). Cohort 2 students were given the test case study and questions, but with only traditional science instruction during the semester. Each Cohort was further divided into two groups (A and B). Students in Group A were asked to complete a pre-test prior to the case study, while students in Group B did not. All four student groups completed a post-test after analysis of the case study. This four-group model ( Table 1 ) allowed us to assess: 1) the effect of CURE treatment relative to conventional education practices, 2) the effect of the pre-test relative to no pre-test, and 3) the interaction between the pre-test and CURE treatment condition. Random assignment of students to treatment and comparison groups was not possible; consequently we used existing intact classes. In all, 431 students and 12 teachers participated in the research study ( Table 2 ).


https://doi.org/10.1371/journal.pone.0036791.t001


https://doi.org/10.1371/journal.pone.0036791.t002

In order to assess the acquisition of higher-order justification skills, students used the summative assessment provided in our curriculum as the pre- and post-test. We designed the curriculum to scaffold students’ ability to write a persuasive bioethical position; by the time they participated in the assessment, Cohort 1 students had opportunities to discuss the elements of a strong justification as well as practice in analyzing case studies. For our research, both Cohort 1 and 2 students were asked to analyze the case study of “Ashley X” ( Table S2 ), a young girl with a severe neurological impairment whose parents wished to limit her growth through a combination of interventions so that they could better care for her. Students were asked to respond to the ethical question: “Should one or more medical interventions be used to limit Ashley’s growth and physical maturation? If so, which interventions should be used and why?” In their answer, students were encouraged to develop a well-reasoned written position by responding to five questions that reflected elements of a strong justification. One difficulty in evaluating a multifaceted science-related learning task (analyzing a bioethical case study and justifying a position) is that a traditional multiple-choice assessment may not adequately reflect the subtlety and depth of student understanding. We used a rubric to assess student responses to each of the following questions (Q) on a scale of 1 to 4; these questions represent key elements of a strong justification for a bioethical argument:

  • Q1: Student Position: What is your decision?
  • Q2: Factual Support: What facts support your decision? Is there missing information that could be used to make a better decision?
  • Q3: Interests and Views of Others: Who will be impacted by the decision and how will they be impacted?
  • Q4: Ethical Considerations: What are the main ethical considerations?
  • Q5: Evaluating Alternative Options: What are some strengths and weaknesses of alternate solutions?

In keeping with our focus on the process of reasoning rather than on having students draw any particular conclusion, we did not assess students on which position they took, but on how well they stated and justified the position they chose.

We used a rubric scoring guide to assess student learning, which aligned with the complex cognitive challenges posed by the task ( Table S3 ). Assessing complex aspects of student learning is often difficult, especially evaluating how students represent their knowledge and competence in the domain of bioethical reasoning. Using a scoring rubric helped us more authentically score dimensions of students’ learning and their depth of thinking. An outside scorer, who had previously participated in CURE workshops, had secondary science teaching experience, and held a Master’s degree in Bioethics, blindly scored all student pre- and post-tests. Development of the rubric was an iterative process, refined after analyzing a subset of surveys. Once finalized, we confirmed the consistency and reliability of the rubric and grading process by re-scoring a subset of student surveys randomly selected from all participating classes. The Cronbach’s alpha reliability was 0.80 [19] .
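A reliability figure like the reported alpha of 0.80 can be computed from any respondent-by-item score matrix. A minimal sketch, using hypothetical 1-4 rubric ratings rather than the study's data:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                          # number of items (rubric questions)
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the composite scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings for five rubric questions (Q1-Q5), four students:
ratings = [[3, 2, 3, 2, 3],
           [4, 3, 4, 3, 4],
           [2, 2, 2, 1, 2],
           [1, 1, 2, 1, 1]]
alpha = cronbach_alpha(ratings)
```

Values near 1 indicate that the rubric items rank students consistently; perfectly correlated items yield exactly 1.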

The rubric closely followed the framework introduced through the curricular materials and reinforced through other case study analyses. For example, under Q2, Factual Support , a student rated 4 out of 4 if their response demonstrated the following:

  • The justification uses the relevant scientific reasons to support student’s answer to the ethical question.
  • The student demonstrates a solid understanding of the context in which the case occurs, including a thoughtful description of important missing information.
  • The student shows logical, organized thinking. Both facts supporting the decision and missing information are presented at levels exceeding standard (as described above).

An example of a student response that received the highest rating for Q2 asking for factual support is: “Her family has a history of breast cancer and fibrocystic breast disease. She is bed-bound and completely dependent on her parents. Since she is bed-bound, she has a higher risk of blood clots. She has the mentality of an infant. Her parents’ requests offer minimal side effects. With this disease, how long is she expected to live? If not very long then her parents don’t have to worry about growth. Are there alternative measures?”

In contrast, a student rated a 1 for responses that had the following characteristics:

  • Factual information relevant to the case is incompletely described or is missing.
  • Irrelevant information may be included and the student demonstrates some confusion.

An example of a student response that rated a 1 for Q2 is: “She is unconscious and doesn’t care what happens.”

All data were entered into SPSS (Statistical Package for the Social Sciences) and analyzed for means, standard deviations, and statistically significant differences. An Analysis of Variance (ANOVA) was used to test for significant overall differences between the two cohort groups. Pre-test and post-test composite scores were calculated for each student by adding individual scores for each item on the pre- and post-tests. The composite score on the post-test was identical in form and scoring to the composite score on the pre-test. The effect of the CURE treatment on post-test composite scores is referred to as the Main Effect, and was determined by comparing the post-test composite scores of the Cohort 1 (CURE) and Cohort 2 (Comparison) groups. In addition, Cohort 1 and Cohort 2 mean scores for each test question (Questions 1–5) were compared within and between cohorts using t-tests.
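The article reports running this analysis in SPSS; the underlying computation for a two-group, one-way design can be sketched in plain Python. The scores below are simulated only to illustrate the group sizes and degrees of freedom, and are not the study's data.

```python
import random
from statistics import mean

def one_way_anova(groups):
    """One-way ANOVA F statistic and (between, within) degrees of freedom."""
    all_scores = [x for g in groups for x in g]
    grand_mean = mean(all_scores)
    # Between-groups sum of squares: group sizes times squared mean deviations
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    # Within-groups sum of squares: deviations from each group's own mean
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, (df_between, df_within)

random.seed(1)
# Hypothetical composite scores (five 1-4 rubric items summed per student);
# the means and SD here are illustrative, not the study's raw data.
cure = [random.gauss(10.7, 2.5) for _ in range(323)]        # Cohort 1, N = 323
comparison = [random.gauss(9.2, 2.5) for _ in range(108)]   # Cohort 2, N = 108
f_stat, df = one_way_anova([cure, comparison])  # df matches the reported F(1, 429)
```

With two groups of 323 and 108 students, the degrees of freedom are (1, 429), matching the F(1, 429) reported below.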

CURE student perceptions of curriculum effect.

During prior program evaluations, we asked teachers to identify what they believed to be the main impacts of bioethics instruction on students. From this earlier work, we identified several themes. These themes, listed below, were further tested in our current study by asking students in the treatment group to assess themselves in these five areas after participation in the lesson, using a retrospective pre-test design to measure self-reported changes in perceptions and abilities [20] .

  • Interest in the science content of class (before/after) participating in the Ethics unit.
  • Ability to analyze issues related to science and society and make well-justified decisions (before/after) participating in the Ethics unit.
  • Awareness of ethics and ethical issues (before/after) participating in the Ethics unit.
  • Understanding of the connection between science and society (before/after) participating in the Ethics unit.
  • Ability to listen to and discuss different viewpoints (before/after) participating in the Ethics unit.

After Cohort 1 (CURE) students participated in the Bioethics 101 curriculum, we asked them to indicate the extent to which they had changed in each of the theme areas we had identified using Likert-scale items on a retrospective pre-test design [21] , with 1 = None and 5 = A lot!. We used paired t-tests to examine self-reported changes in their perceptions and abilities. The retrospective design avoids response-shift bias that results from overestimation or underestimation of change since both before and after information is collected at the same time [20] .
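The paired t statistic underlying these comparisons is straightforward to compute from the retrospective before/after ratings. A minimal sketch with hypothetical 1-5 Likert ratings (not the study's data):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for matched before/after ratings."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    # t = mean difference divided by the standard error of the differences
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

# Hypothetical retrospective ratings from six students (1 = None, 5 = A lot!):
before = [3, 2, 3, 3, 2, 4]
after = [4, 4, 5, 4, 4, 5]
t, df = paired_t(before, after)
```

Because each student supplies both ratings at the same sitting, the test operates on the within-student differences rather than on two independent groups.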

Student Demographics

Demographic information is provided in Table 3 . Of those students who reported their gender, a larger number were female (N = 258) than male (N = 169), 60% and 40%, respectively, though female students represented a larger proportion of Cohort 1 than Cohort 2. Students ranged in age from 14 to 18 years old; the average age of the students in both cohorts was 15. Students were enrolled in a variety of science classes (mostly Biology or Honors Biology). Because NIH recognizes a difference between race and ethnicity, students were asked to respond to both demographic questions. Students in both cohorts were from a variety of ethnic and racial backgrounds.


https://doi.org/10.1371/journal.pone.0036791.t003

Pre- and Post-Test Results for CURE and Comparison Students

Post-test composite means for each cohort (1 and 2) and group (A and B) are shown in Table 4 . Students receiving CURE instruction earned significantly higher (p<0.001) composite mean scores than students in comparison classrooms. The post-test composite mean for Cohort 1 (CURE) students (N = 323) was 10.73, while that for Cohort 2 (Comparison) students (N = 108) was 9.16. The ANOVA results ( Table 5 ) showed significant differences in the ability to craft strong justifications between Cohort 1 (CURE) and Cohort 2 (Comparison) students, F(1, 429) = 26.64, p<0.001.


https://doi.org/10.1371/journal.pone.0036791.t004


https://doi.org/10.1371/journal.pone.0036791.t005

We also examined whether the pre-test had a priming effect on the students’ scores, since it provides an opportunity to practice or think about the content. The pre-test would not have this effect on the comparison group because they were not exposed to CURE teaching or materials. If the pre-test provided a practice or priming effect, CURE students who received the pre-test would show higher post-test performance than CURE students who did not. For this comparison, F(1, 321) = 0.10, p = 0.92. This result suggests that the differences between the CURE and comparison groups are attributable to the treatment condition and not to a priming effect of the pre-test.

After differences in main effects were investigated, we analyzed differences between and within cohorts on individual items (Questions 1–5) using t-tests. The mean scores of individual questions for each cohort are shown in Figure 1 . There were no significant differences between Cohort 1 (CURE) and Cohort 2 (Comparison) on pre-test scores. In fact, for Q5, the mean pre-test scores for the Cohort 2 (Comparison) group were slightly higher (1.8) than the Cohort 1 (CURE) group (1.6). On the post-test, the Cohort 1 (CURE) students significantly outscored the Cohort 2 (Comparison) students on all questions; Q1, Q3, and Q4 were significant at p<0.001, Q2 was significant at p<0.01, and Q5 was significant at p<0.05. The largest post-test difference between Cohort 1 (CURE) students and Cohort 2 (Comparison) students was for Q3, at 0.6; all the other questions showed differences of 0.3 or less. Comparing Cohort 1 (CURE) post-test performance on individual questions yields the following results: scores were highest for Q1 (mean = 2.8), followed by Q3 (mean = 2.2), Q2 (mean = 2.1), and Q5 (mean = 1.9). The lowest Cohort 1 (CURE) post-test scores were associated with Q4 (mean = 1.8).


Mean scores for individual items of the pre-test for each cohort revealed no differences between groups for any of the items (Cohort 1, CURE, N = 323; Cohort 2, Comparison, N = 108). Post-test gains of Cohort 1 (CURE) relative to Cohort 2 (Comparison) were statistically significant for all questions. (Question (Q) 1) What is your decision? (Q2) What facts support your decision? Is there missing information that could be used to make a better decision? (Q3) Who will be impacted by the decision and how will they be impacted? (Q4) What are the main ethical considerations? and (Q5) What are some strengths and weaknesses of alternate solutions? Specifically: (Q1), (Q3), and (Q4) were significant at p<0.001 (***); (Q2) was significant at p<0.01 (**); and (Q5) was significant at p<0.05 (*). Lines represent standard deviations.

https://doi.org/10.1371/journal.pone.0036791.g001

Overall, across all four groups, mean scores for Q1 were highest (2.6), while scores for Q4 were lowest (1.6). When comparing within-Cohort scores on the pre-test versus post-test, Cohort 2 (Comparison Group) showed little to no change, while CURE students improved on all test questions.

CURE Student Perceptions of Curriculum Effect

After using our resources, Cohort 1 (CURE) students showed highly significant gains (p<0.001) in all areas examined: interest in science content, ability to analyze socio-scientific issues and make well-justified decisions, awareness of ethical issues, understanding of the connection between science and society, and the ability to listen to and discuss viewpoints different from their own ( Figure 2 ). Overall, students gave the highest score to their ability to listen to and discuss viewpoints different from their own after participating in the CURE unit (mean = 4.2). Also highly rated were the changes in understanding of the connection between science and society (mean = 4.1) and the awareness of ethical issues (mean = 4.1); these two perceptions also showed the largest pre-post change (from 2.8 to 4.1 and 2.7 to 4.1, respectively).


Mean scores for individual items of the retrospective items on the post-test for Cohort 1 students revealed significant gains (p<0.001) in all self-reported items: Interest in science (N = 308), ability to Analyze issues related to science and society and make well-justified decisions (N = 306), Awareness of ethics and ethical issues (N = 309), Understanding of the connection between science and society (N = 308), and the ability to Listen and discuss different viewpoints (N = 308). Lines represent standard deviations.

https://doi.org/10.1371/journal.pone.0036791.g002

NWABR’s teaching materials support both general ethics and bioethics education and specific topics such as embryonic stem cell research. These resources were developed to provide teachers with classroom strategies, ethics background, and decision-making frameworks. Teachers are then prepared to share their understanding with their students, and to support their students in using analysis tools and participating in effective classroom discussions. Our current research grew out of a desire to measure the effectiveness of our professional development and teaching resources in fostering student ability to analyze a complex bioethical case study and to justify their positions.

Consistent with the findings of SSI researchers and our own prior anecdotal observations of teacher classrooms and student work, we found that students improve in their analytical skills when provided with reasoning frameworks and background in concepts such as beneficence, respect, and justice. Our research demonstrates that structured reasoning approaches can be effectively taught at the secondary level and that they can improve student thinking skills. After teachers participated in a two-week professional development workshop and used our Bioethics 101 curriculum, students grew significantly, within a relatively short time period (five lessons spanning approximately one to two weeks), in their ability to analyze a complex case and justify their position compared to students not exposed to the program. Biology texts often present a controversial issue and ask students to “justify their position,” but teachers have shared with us that students frequently do not understand what makes a position or argument well-justified. By providing students with opportunities to evaluate sample justifications, and by explicitly introducing a set of elements that students should include in their justifications, we have facilitated the development of this important cognitive skill.

The first part of our research examined the impact of CURE instruction on students’ ability to analyze a case study. Although students grew significantly in all areas, the highest scores for the Cohort 1 (CURE) students were found in response to Q1 of the case analysis, which asked them to clearly state their own position, and represented a relatively easy cognitive task. This question also received the highest score in the comparison group. Not surprisingly, students struggled most with Q4 and Q5, which asked for the ethical considerations and the strengths and weaknesses of different solutions, respectively, and which tested specialized knowledge and sophisticated analytical skills. The area in which we saw the most growth in Cohort 1 (CURE) (both in comparison to the pre-test and in relation to the comparison group) was in students’ ability to identify stakeholders in a case and state how they might be impacted by a decision (Q3). Teachers have shared with us that secondary students are often focused on their own needs and perspectives; stepping into the perspectives of others helps enlarge their understanding of the many views that can be brought to bear upon a socio-scientific issue.

Many of our teachers go far beyond these introductory lessons, revisiting key concepts throughout the year as new topics are presented in the media or as new curricular connections arise. Although we have observed this phenomenon for many years, it has been difficult to evaluate these types of interventions, as so many teachers implement the concepts and ideas differently in response to their unique needs. Some teachers have used the Bioethics 101 curriculum as a means for setting the tone and norms for the entire year in their classes and fostering an atmosphere of respectful discussion. These teachers note that the “opportunity cost” of investing time in teaching basic bioethical concepts, decision-making strategies, and justification frameworks pays off over the long run. Students’ understanding of many different science topics is enhanced by their ability to analyze issues related to science and society and make well-justified decisions. Throughout their courses, teachers are able to refer back to the core ideas introduced in Bioethics 101, reinforcing the wide utility of the curriculum.

The second part of our research focused on changes in students’ self-reported attitudes and perceptions as a result of CURE instruction. Obtaining accurate and meaningful data to assess student self-reported perceptions can be difficult, especially when a program is distributed across multiple schools. The traditional pretest-posttest design assumes that students use the same internal standard to judge attitudes or perceptions at both time points. Considerable empirical evidence suggests that program effects based on pre-posttest self-reports are masked because people either overestimate or underestimate their pre-program perceptions [20], [22]–[26]. Moore and Tananis [27] report that this response shift can occur in educational programs, especially those designed to increase students’ awareness of the specific construct being measured. The retrospective pre-test design (RPT), which was used in this study, has gained increasing prominence as a convenient and valid method for measuring self-reported change. Because both “before” and “after” information is collected at the same time, RPT reduces the response-shift bias that results from overestimation or underestimation of change, providing a more accurate assessment of actual effect [20]. It is also convenient to implement, provides comparison data, and may be more appropriate in some situations [26]. Using student self-reported measures concerning perceptions and attitudes is also a meta-cognitive strategy that allows students to think about their learning and justify where they believe they are at the end of a project or curriculum compared to where they were at the beginning.
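
As a concrete sketch of the retrospective pre-test logic described above: both ratings are collected on the same form at post-test, so they share one internal standard, and change is assessed pairwise within student. The 1–5 scale, the shift from roughly 2.8 toward 4, and the use of a paired t-test are illustrative assumptions, not the study's documented procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 308  # approximate N for the self-report items

# Synthetic RPT data: each student rates "then" (retrospective pre) and
# "now" (post) on the same 1-5 form at the same sitting -- the design
# feature that reduces response-shift bias. Values are invented.
then_ratings = rng.normal(2.8, 0.8, n).clip(1, 5)
now_ratings = (then_ratings + rng.normal(1.3, 0.6, n)).clip(1, 5)

# Ratings are paired within student, so a paired t-test is one natural choice.
t, p = stats.ttest_rel(now_ratings, then_ratings)
print(f"then = {then_ratings.mean():.1f}, now = {now_ratings.mean():.1f}, p = {p:.2g}")
```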

Our approach resulted in a significant increase in students’ own perceived growth in several areas related to awareness, understanding, and interest in science. Our finding that student interest in science can be significantly increased through a case-study based bioethics curriculum has implications for instruction. Incorporating ethical dilemmas into the classroom is one strategy for increasing student motivation and engagement with science content. Students noted the greatest changes in their own awareness of ethical issues and in understanding the connection between science and society. Students gave the highest overall rating to their ability to listen to and discuss viewpoints different from their own after participation in the bioethics unit. This finding also has implications for our future citizenry; in an increasingly diverse and globalized society, students need to be able to engage in civil and rational dialogue with others who may not share their views.

Conducting research studies about ethical learning in secondary schools is challenging; recruiting teachers for Cohort 2 and obtaining consent from students, parents, and teachers for participation was particularly difficult, and many teachers faced constraints from district regulations about curriculum content. Additional studies are needed to clarify the extent to which our curricular materials alone, without accompanying teacher professional development, can improve student reasoning skills.

Teacher pre-service training programs rarely incorporate discussion of how to address ethical issues in science with prospective educators. Likewise, with some notable exceptions, such as the work of the University of Pennsylvania High School Bioethics Project, the Genetic Science Learning Center at the University of Utah, and the Kennedy Institute of Ethics at Georgetown University, relatively few high school curricular resources exist in this area. Teachers have shared with us that they know such issues are important and engaging for students, but they do not have the experience in either ethical theory or in managing classroom discussion to feel comfortable teaching bioethics topics. After participating in our workshops or using our teaching materials, teachers shared that they are better prepared to address such issues with their students, and that students are more engaged in science topics and are better able to see the real-world context of what they are learning.

Preparing students for a future in which they have access to personalized genetic information, or need to vote on proposals for stem cell research funding, necessitates providing them with the tools required to reason through a complex decision containing both scientific and ethical components. Students begin to realize that, although there may not be an absolute “right” or “wrong” decision to be made on an ethical issue, neither is ethics purely relative (“my opinion versus yours”). They come to realize that all arguments are not equal; there are stronger and weaker justifications for positions. Strong justifications are built upon accurate scientific information and solid analysis of ethical and contextual considerations. An informed citizenry that can engage in reasoned dialogue about the role science should play in society is critical to ensure the continued vitality of the scientific enterprise.

“I now bring up ethical issues regularly with my students, and use them to help students see how the concepts they are learning apply to their lives…I am seeing positive results from my students, who are more clearly able to see how abstract science concepts apply to them.” – CURE Teacher

“In ethics, I’ve learned to start thinking about the bigger picture. Before, I based my decisions on how they would affect me. Also, I made decisions depending on my personal opinions, sometimes ignoring the facts and just going with what I thought was best. Now, I know that to make an important choice, you have to consider the other people involved, not just yourself, and take all information and facts into account.” – CURE Student

Supporting Information

Bioethics 101 Lesson Overview.

https://doi.org/10.1371/journal.pone.0036791.s001

Case Study for Assessment.

https://doi.org/10.1371/journal.pone.0036791.s002

Grading Rubric for Pre- and Post-Test: Ashley’s Case.

https://doi.org/10.1371/journal.pone.0036791.s003

Acknowledgments

We thank Susan Adler, Jennifer M. Pang, Ph.D., Leena Pranikay, and Reitha Weeks, Ph.D., for their review of the manuscript, and Nichole Beddes for her assistance scoring student work. We also thank Carolyn Cohen of Cohen Research and Evaluation, former CURE Evaluation Consultant, who laid some of the groundwork for this study through her prior work with us. We also wish to thank the reviewers of our manuscript for their thoughtful feedback and suggestions.

Author Contributions

Conceived and designed the experiments: JTC LJC. Performed the experiments: LJC. Analyzed the data: LJC JTC DNK. Contributed reagents/materials/analysis tools: JCG. Wrote the paper: JTC LJC DNK JCG. Served as Principal Investigator on the CURE project: JTC. Provided overall program leadership: JTC. Led the curriculum and professional development efforts: JTC JCG. Raised funds for the CURE program: JTC.

  • 1. Bell P (2004) Promoting students’ argument construction and collaborative debate in the science classroom. Mahwah, NJ: Erlbaum.
  • 3. Toulmin S (1958) The Uses of Argument. Cambridge: Cambridge University Press.
  • 6. Zeidler DL, Sadler TD, Simmons ML, Howes EV (2005) Beyond STS: A research-based framework for socioscientific issues education. Science Education 89: 357–377.
  • 8. AAAS (1990) Science for All Americans. New York: Oxford University Press.
  • 9. National Research Council (1996) National Science Education Standards. Washington, DC: National Academies Press.
  • 10. National Research Council (2011) A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Washington, DC: National Academies Press.
  • 11. National Board for Professional Teaching Standards (2007) Adolescence and Young Adulthood Science Standards. Arlington, VA.
  • 17. Beauchamp T, Childress JF (2001) Principles of biomedical ethics. New York: Oxford University Press.
  • 18. Chowning JT, Griswold JC (2010) Bioethics 101. Seattle, WA: NWABR.
  • 26. Klatt J, Taylor-Powell E (2005) Synthesis of literature relative to the retrospective pretest design. Presentation to the 2005 Joint CES/AEA Conference, Toronto.

Critical Thinking and Evaluating Information


Peer-Review




Peer-Review is the process by which scholars critically evaluate each other's research articles prior to publication in an academic journal.

This page will help you better understand the Peer-Review process, as well as help you identify articles that are peer-reviewed.


CPTC Library Services' guide to scholarly vs. popular sources (infographic)

  • What is Peer Review? This LibGuide, created by librarians at Oregon State University, will help you learn about the peer review process, including: what peer review is; identifying if a journal is peer reviewed; and using a database to identify a peer-reviewed journal.
  • How Can I Determine if a Journal is Peer Reviewed? This LibAnswers Guide, created by Librarian Jennifer Harris at Southern New Hampshire University, provides a number of steps to follow to help you figure out if a journal is peer reviewed.
  • How to Find Scholarly, Peer-Reviewed Journal Articles This Bow Valley College Library LibGuide provides questions to ask yourself to help you determine whether or not an article is scholarly and peer-reviewed.
  • Open access
  • Published: 02 May 2024

Understanding the value of a doctorate for allied health professionals in practice in the UK: a survey

  • Jo Watson 1 ,
  • Steven Robertson 2 ,
  • Tony Ryan 2 ,
  • Emily Wood 3 ,
  • Jo Cooke 2 ,
  • Susan Hampshaw 4 &
  • Hazel Roddam 5  

BMC Health Services Research, volume 24, Article number: 566 (2024)


The need to transform the United Kingdom’s (UK) delivery of health and care services to better meet population needs and expectations is well-established, as is the critical importance of research and innovation to drive those transformations. Allied health professionals (AHPs) represent a significant proportion of the healthcare workforce. Developing and expanding their skills and capabilities is fundamental to delivering new ways of working. However, career opportunities combining research and practice remain limited. This study explored the perceived utility and value of a doctorate to post-doctoral AHPs and how they experience bringing their research-related capabilities into practice environments.

With a broadly interpretivist design, a qualitatively oriented cross-sectional survey, with closed and open questions, was developed to enable frequency reporting while focusing on the significance and meaning participants attributed to the topic. Participants were recruited via professional networks and communities of practice. Descriptive statistics were used to analyse closed question responses, while combined framework and thematic analysis was applied to open question responses.

Responses were received from 71 post-doctoral AHPs located across all four UK nations. Findings are discussed under four primary themes: utilisation of the doctorate; value of the doctorate; impact on career; and impact on self and support. Reference is also made at appropriate points to descriptive statistics summarising closed question responses.

The findings clearly articulate the variability of experiences amongst post-doctoral AHPs. Some were able to influence team and organisational research cultures, support the development of others and drive service improvement. The challenges, barriers and obstacles encountered by others reflect those that have been acknowledged for many years. Acknowledging them is important, but the conversation must move forward and generate positive action to ensure greater consistency in harnessing the benefits and added value these practitioners bring. If system-wide transformation is the aim, it is inefficient to leave navigating these challenges to individual creativity and tenacity, or to forward-thinking leaders and organisations. There is an urgent need for system-wide responses to more effectively, consistently and equitably enable career pathways combining research and practice for what is a substantial proportion of the UK healthcare workforce.


The imperative to transform the delivery of health and social care in the United Kingdom (UK) to better meet the changing needs and expectations of the population has been acknowledged for some time. Equally well recognised is the critical importance of research and innovation to drive those transformations, including advances in treatments and interventions [ 1 , 2 ]. As the third largest clinical workforce in the UK’s National Health Service (NHS), the allied health professions (AHPs) are acknowledged as having an essential role in helping meet demand. In addition, their impact reaches beyond the NHS with significant contributions made across the health and care sector in roles within social care, housing, early years, schools, public health, the criminal justice system and in private, voluntary, community and social enterprise organisations [ 3 ].

With a fundamental need to identify new ways of working and delivering care, developing the skills and expanding the capabilities of the workforce, and creating meaningful career pathways to support retention of experienced staff, are paramount [ 1 , 2 , 4 , 5 , 6 ]. Building research capability and capacity to complement practice expertise is key to optimising the workforce and strengthening the evidence-base informing safe, clinically effective, cost efficient services [ 7 , 8 , 9 , 10 ]. A complex interplay between developing strong internal organisational infrastructure and supporting individual career planning and skills development has been identified [ 11 ]. Where this is successfully navigated, healthcare organisations that are research-active are noted to have improved performance and patient experience, and better staff recruitment and retention, compared to those with lower levels of research engagement [ 12 , 13 , 14 , 15 ].

Clinical academics are an important strand of the workforce who work concurrently in practice and academic environments and are research-active [ 9 ]. Newington et al.’s [ 16 ] systematic review identified a wide range of positive impacts from non-medical clinical academics in the UK, including benefits to patients, service provision and the workforce: recruitment and retention; the research profile, culture and capacity of organisations; knowledge exchange; and the economy. Clinical academics themselves were also noted to benefit through, for example, increased job satisfaction, growing their networks and influence, developing leadership as well as research skills, and unlocking new career opportunities [ 16 ].

Acknowledging these potential benefits, Comer et al. [ 17 ] explored the perceived level of research capacity and culture within AHPs working in the NHS, using the Research Capacity and Culture (RCC) tool. Only 34% of respondents reported research-related activities being part of their roles, and of these, 79% had less than 25% of their time allocated for research-related activity. Further, only 18% reported that research engagement was routinely discussed at annual appraisals. Similarly, and using the same RCC tool, Cordrey et al. [ 18 ] found that 31% of responding AHPs from a single NHS department reported research-related activities as a component of their role, and of these 21% had dedicated time for research. Lack of time and opportunity are noted to curtail research engagement to a greater extent than limits in capability or ambition [ 17 , 18 ].

Despite faring better than their nursing and midwifery colleagues in securing funding via National Institute for Health and Care Research (NIHR) developmental pathways [ 19 ], opportunities for AHPs (amongst others) to develop clinical academic careers remain limited [ 20 , 21 ]. Further, key barriers to research engagement persist, despite having been highlighted over a number of years (see, for example, [ 7 , 22 , 23 ]). Even where funding has been secured, backfill to enable release from practice duties is a particular challenge [ 16 , 18 ]. A related obstacle is a lack of time for research [ 17 , 18 ]. This in turn is linked to the need to accommodate both practice-facing and research components of roles [ 16 ], with practice roles frequently taking priority [ 17 , 18 ]. It has also been noted that feelings of personal guilt can direct AHPs’ own prioritisation towards clinical workload management at the expense of engaging in research activities and their own career development [ 18 ]. The scarcity of research-engaged organisational cultures and of clinical academic roles to aspire to [ 16 ] remains a foundational issue that must be addressed.

In this context, the publication of the Allied Health Professions’ Research and Innovation Strategy for England [ 24 ] was driven by recognition of the urgent need for transformational change in the pace of growth, stability and sustainability of research engagement by AHPs. Understanding more about how AHPs who have undertaken doctoral studies experience bringing their research-related capabilities into practice environments provides helpful insights to inform the actions required to progress this agenda and realise the visions outlined in the Health Education England (HEE) strategy.

Aim

To understand the perceived utility and value of a doctorate to post-doctoral Allied Health Professionals in practice in the UK.

Study design

The overall design of this study was broadly interpretivist in nature, an approach concerned with discovering the meaning people attach to experiences and how this influences their actions [ 25 ]. A cross-sectional survey, with closed and open questions, was developed to report the frequency of participant responses and to facilitate a focus on the significance participants attributed to the research topic. In this sense, the research project and survey tool were qualitative in their orientation [ 26 ]. Mixed surveys with a qualitative emphasis (and even fully qualitative questionnaires) are increasingly being used in health and social care research, as they reduce the constraints of fixed-response formats and allow participants to provide as much information as they choose in their own terms [ 26 ].

Ethics approval was obtained from the University of Sheffield ethics committee (Ref: 023667). The Standards for Reporting Qualitative Research (SRQR) checklist was used to guide this study’s conduct and reporting [ 27 ].

Participants and recruitment

Nurses, midwives, healthcare scientists and AHPs currently undertaking or having completed doctorates were recruited via professional networks and via the Yorkshire and Humber Collaboration for Leadership in Applied Health Research and Care (CLAHRC) infrastructure. A Twitter account was established for the purpose of the study and a link to an online survey was disseminated via this feed. Within Twitter, relevant communities of practice were targeted and encouraged to actively retweet the survey. Respondents were also asked to retweet and share the survey link within their networks to generate a diverse sample. Only respondents based within the UK were included in the study.

Data collection

A bespoke 23-item questionnaire, containing closed and open questions, was developed to collect information about demographics and the self-reported benefits and impact of doctoral study. Closed questions collected information about: motivation; mode, length and funding sources of doctoral study; prior research experience; perceived benefits, utility and value of the doctorate; and impact on career and self. The survey also included open questions for respondents to provide greater detail about their experiences and views (see Table  1 ). These closed and open questions were developed by drawing on issues raised within the literature [ 28 , 29 , 30 ]. As is recommended in questionnaire development [ 31 ], these questions were piloted with people sharing similar characteristics to the intended survey recipients, in this case members of the Addressing Organisational Capacity to do Research Network (ACORN) community of practice. ACORN was developed as part of a capacity building programme within the Yorkshire and Humber CLAHRC. The online survey, with associated information sheet, was open from 5th Feb 2019 to 15th March 2019. Participation was optional and anonymous.

For the purpose of this paper, data from AHPs was separated from the nurses/midwives and healthcare scientists. The intention is to focus on AHP experiences to complement previously published work from this dataset that focused on the reported experiences of nurses and midwives [ 21 ].

Analysis of AHP responses to closed questions, supported by IBM SPSS software, was undertaken, with descriptive statistics reported here. As noted by others [ 26 ], open question data in mixed surveys can be analysed in a thoroughly qualitative way. Open question responses from the AHPs were therefore analysed using a combined framework and thematic analysis. The first stage involved placing open response data into a three-theme framework developed following the earlier analysis of the data for the nurses and midwives [ 21 ]. The second stage involved categorising and merging data within these themes into sub-themes. Both stages were conducted by SR and cross-checked for accuracy by TR. The final stage involved clustering the sub-themes into existing or new themes; this was conducted by SR, cross-checked for accuracy by TR and agreed by all research team members. The themes developed through the analysis differed little from the previous analysis of the data from nurses and midwives, although the sub-themes altered slightly and one previous theme was split into two, resulting in four themes with 13 sub-themes (see Table  2 ).
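
The staged rollup of coded excerpts into final themes can be pictured mechanically. In this sketch only the four final theme labels come from the paper; the sub-theme tags, the framework mapping, and the coded excerpts are invented placeholders, not the study's actual codebook:

```python
from collections import Counter

# Illustrative codebook: sub-theme tag -> final theme. The four theme
# labels are the paper's; the sub-theme names are hypothetical.
framework = {
    "skills in clinical care": "Utilisation of the doctorate",
    "transferable soft skills": "Utilisation of the doctorate",
    "credibility with peers": "Value of the doctorate",
    "move into academia": "Impact on career",
    "personal toll": "Impact on self and support",
}

# Stages 1-2: open-response excerpts already tagged with sub-themes.
coded_excerpts = [
    "skills in clinical care",
    "transferable soft skills",
    "personal toll",
    "skills in clinical care",
    "move into academia",
]

# Final stage: cluster sub-theme tags into the final themes and tally them.
theme_counts = Counter(framework[tag] for tag in coded_excerpts)
print(theme_counts.most_common())
```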

Apart from the context section below on ‘route to doctoral completion’, the closed and open question data analysis is integrated and presented under the four theme headings that were derived from the analysis of the open question data.

There were 214 respondents from across the UK. They included 47 nurses, 96 healthcare scientists and 71 AHPs. Representing the AHP disciplines, respondents were: physiotherapists (PT, n  = 26), speech and language therapists (SLT, n  = 15), occupational therapists (OT, n  = 9), radiographers (RAD, n  = 8), podiatrists (POD, n  = 4), dieticians (DIET, n  = 3), art therapists ( n  = 2), paramedics ( n  = 2), a music therapist ( n  = 1), and an orthoptist ( n  = 1). To aid anonymity, art therapists, paramedics, music therapists and orthoptists were all coded as MISC. Table  3 summarises the geographical spread of this self-selecting sample. As noted earlier, only data from the AHP survey respondents are reported here.

Route to doctoral completion

All AHP respondents had completed study at doctoral level; none were still undertaking their studies. Just under one-third had studied full-time (30%, n  = 21). Well over half of AHP respondents (56%, n  = 40) had taken five years or fewer to complete, with 29 (41%) taking more than five years (2 missing). Just under half of AHP respondents had some form of research experience prior to commencing their doctorates (46%, n  = 33). These experiences were typically gained through internships (including NIHR opportunities), Master’s dissertation study, secondments and co-investigator roles.
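
As a quick consistency check, the percentages quoted in this paragraph can be reproduced from the stated counts and the AHP sample size of 71:

```python
# Reproduce the route-to-completion percentages from the counts given in
# the text, against the AHP sample size of N = 71.
n_total = 71
counts = {
    "studied full-time": 21,
    "completed in five years or fewer": 40,
    "took more than five years": 29,
    "prior research experience": 33,
}
pct = {label: round(100 * count / n_total) for label, count in counts.items()}
for label, p in pct.items():
    print(f"{label}: {p}% (n = {counts[label]})")
```

Each rounded percentage matches the figure reported in the text.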

Post-doctoral experiences

The following section outlines the findings from the analysis of the open question data which allowed respondents to reflect upon their current professional experiences in a post-doctoral context. However, at appropriate points references are also made to some of the descriptive statistics summarising responses to closed question items included within the questionnaire. These data are discussed under four theme headings (see Table  2 ) with illustrative data quotations.

Utilisation of the doctorate

Most participants who commented mentioned utilising the skills and knowledge gained through their doctoral studies. For some, this related directly to aspects of clinical care including shifting team culture and changing existing practices:

“I now influence my team’s way of thinking about what we do with patients. We are all more analytical and confident to question practices that have been historically used for many years.” [PT4]
“My research and critical thinking contributes to redesigned pathways and patient outcome improvement.” [RAD6]

Important here was the recognition that developing, enhancing or extending the skill of critical thinking was a key aspect that helped drive these service improvements and change:

“I think my critical analysis skills and research skills have helped significantly in developing our service and providing effective evidence-based interventions for the patients seen by our team […] My doctorate has given me many transferable skills and has enabled me to deliver much better evidence-based care for my clients and our service.” [SLT14]

As well as directly linking to service improvement and change, the doctoral journey was noted to enhance a wider skill-base that is transferrable and applicable across roles and teams, particularly, but not solely, in relation to the research aspects of their practice:

“The doctorate allowed me to deepen my research and clinical expertise somewhat but the benefits in terms of ‘soft-skills’; e.g. time management, resilience, negotiation have been enormous.” [RAD3]
“These skills have filtered through to my clinical work, helped me to facilitate service changes within the clinical team and helped to foster an ethos of research as core business within my immediate team, but also more widely in the hospital Trust and wider professional networks.” [SLT13]

A further skill developed by some through the doctoral journey was an improved ability (and confidence) to train, educate, supervise and encourage others:

“I utilise these benefits on a daily basis […] I teach and train colleagues where I work to be able to critically appraise research and consider the applications for clinical practice. I also lecture and believe that my experience clinically and with research is an important combination.” [SLT12]

While the majority of comments were positive, there was also suggestion that the ability to utilise the skills learnt through doctoral experiences was not always present. Almost one-third agreed or strongly agreed with the statement: “there are limited opportunities to use skills gained during my doctorate” (31%, n = 22). One-quarter of participants felt over-qualified for their role (24%, n = 18). There were some general comments in response to the question “To what extent are you able to utilise these benefits in your current role?” , such as: “Limited” [POD3] , “Limited ability” [DIET2] , and “To some degree” [DIET1] . Others, expanding on this, demonstrated some frustration with the opportunities available to utilise the skills and knowledge developed through their doctoral studies, particularly in clinical contexts:

“Research sits in an uncertain position in my organisation so doctorate level skills are difficult to show the benefit of.” [MISC7] .
“Now I’m in an academic role, completely […] In a university department I could not do my job without a PhD. In the NHS they didn’t quite know how to best exploit my new skills and knowledge (or what they were).” [SLT9] .
“The doctorate seems to open up more opportunities outside of the NHS as opposed to within it…” [MISC7] .

Importantly, some have worked hard to create opportunities to put the skills and knowledge learnt into practice, but this has taken a personal toll (see also later section “Impact on self”):

“My clinical caseload is heavier now than prior to my Doctorate and I have no protected research time. Implementing and sharing my skills has taken a monumental amount of work and perseverance.” [OT8] .

While almost all participants recognised the importance of the skills and knowledge gained from their doctoral studies, it was not always easy to put these into practice, particularly in clinical contexts. Some suggested this difficulty links to the extent to which doctorates (and research skills and knowledge generally) are valued within their organisations or profession.

Value of the doctorate

On completion of their doctorate, almost 60% ( n  = 43) of AHPs were confident that the qualification was valued by employers; 11% ( n  = 8) were certain that their doctorate was not valued, and one-quarter remained unsure (26%, n  = 20). For many of those whose employers recognised the value that doctoral learning and experience brought, this was often linked either to the importance of improved clinical expertise or to increased credibility and prestige for the individual, their team or their organisation:

“Adds to credibility in a national role and multi-professional arena. Clinical expertise and insight adds value to strategic NHS planning associated research and development activity.” [RAD1] .
“My PhD brought/brings credibility not only to me but to this specialist centre.” [SLT12] .

For those who did not think their employers valued doctoral learning and experience, or who were unsure whether it was valued, comments indicated a lack of interest in doctoral study, its associated skills and what they might bring, among colleagues and managers, particularly in clinical settings:

“My NHS role respect my clinical leadership but not my research leadership as much. It’s not considered core business.” [POD1] .
“Having a PhD was not valued in my previous job in the NHS because I was seen as developing skills in the wrong area - extremely disappointing for me.” [OT7] .
“My employer is oblivious to them [doctoral skills gained].[…] My NHS employer has no interest in my academic skills, experience or knowledge. No-one at work acknowledges my doctorate, uses Dr when addressing me or writing to me, or recognises in a positive sense the study I have undertaken.” [PT24] .

A more frequent response to the question about whether employers valued doctoral learning and experience was to present a nuanced view of how, and by whom, the doctoral skills and knowledge were valued. For those with combined academic and clinical roles, it was often stated that the doctorate was valued in the academic context but not in their clinical role:

“I have two roles - these skills are valued in my academic job. Maybe less so in my clinical job.” [MISC1] .

For those employed fully within a clinical setting, important differences were noted regarding who valued what doctoral study could bring and which elements were valued, although this was not consistent:

“Locally (departmental) I think they are valued. On a hospital basis, I am not sure.” [RAD2] .
“I think they are respected by senior colleagues but I find my own departmental managers find little value in either higher education achievement or research, which they often consider to be a burden.” [PT3] .

Finally, participants noted whether their doctoral studies were valued in financial terms. Around one-quarter of AHPs regarded their current earnings as misaligned with their post-doctoral expectations, with 27% ( n  = 19) agreeing or strongly agreeing with the statement: “My post-doctoral earnings are lower than I envisaged” . The following comments were a little contradictory but suggest a negative, or at best static, influence on post-doctoral earning capacity:

“I feel it’s enhanced my career clinically and my national leadership and teaching profile, but not my earning potential.” [POD1] .
“PhD has been very rewarding intellectually and clinically. However, it’s offered less job security and absolutely no financial rewards, my grading remaining static since pre PhD.” [SLT7] .
“I wouldn’t have my lecturer job without it. Although ironically I now work at a lower clinical band so that I can maintain a clinical foothold.” [PT17] .

This issue of progression and financial value links to larger issues of the impact on careers that participants experienced following completion of their doctoral study.

Impact on career

A large proportion of respondents reported being in a clinical academic post at the time of the survey (41%, n  = 29), whilst just under one-third (28%, n  = 20) were in academic positions. A small proportion remained in clinical positions (11%, n  = 8), with the remainder being in what were described as managerial and leadership roles (20%, n  = 14).

Many participants noted the positive impact that completing doctoral studies had on their career. For several, completing a doctorate facilitated or cemented an academic career path:

“I started my PhD whilst in clinical practice and during my studies I took a role in academia. It was pivotal in being offered a position at a university where a doctorate, or working towards one, is required.” [OT2] .

Such positive comments were also made in relation to clinical and clinical academic career development:

“As a reporting radiographer, my role was a blend of image reporting and acquisition. My PhD facilitated growth and progression to a consultant clinical academic position.” [RAD6] .
“Career, research and collaborative opportunities arise much more for me post-PhD compared with pre-PhD. I seem to have greater credibility. I have freedom to choose what I do with my career now.” [PT11] .

However, not all participants experienced this positive career impact. Some felt that undertaking doctoral study had limited impact, or even represented a backward step, particularly in relation to clinical career components:

“After my PhD I had a phase of feeling it had derailed my career. I enjoyed my doctoral studies but never wanted an academic career. I felt as though I had stepped off the career ladder and struggled to get back on it.” [PT1] .

Others experienced frustration with the limited opportunities for on-going research and concomitant post-doctoral career development:

“I hoped that it would have enabled me to actively pursue further clinical research. However, opportunities were limited when I moved to [geographical area] […] I ended up falling into a management post and find it difficult now to downgrade / get opportunity to be involved directly in research.” [DIET2] .
“I had high hopes that this role would provide networking and research opportunities as it’s within a large teaching hospital. Despite trying to develop an AHP research culture, there is no dedicated time or support for this to happen. […] I am desperate to progress but can’t seem to navigate into that split clinical academic world that seems to be made for medics only.” [SLT11] .

For many participants it was not a clear-cut case of whether completing doctoral level study had or had not impacted their career. Rather, most described a somewhat crooked path of post-doctoral career development; a mixture of opportunities and barriers:

“I had diverse skills that didn’t necessarily follow a recognised path. Pleased to say that things are falling into place and after a couple of stepping stones I am finding roles that value diverse experience and are an appropriate grade/salary. The PhD helped me get here but it wasn’t a straightforward path.” [PT1] .
“I loved doing my PhD. However there is no career pathway for me to follow. I was lucky to be employed in a research management role and have been lucky to gain funding to continue with my research career.” [RAD2] .
“I feel my doctorate has given me a platform to carry out more research; however I wasn’t expecting on having to leave my senior leadership position in the NHS to do this.” [OT7] .

What becomes clear from the above is that it often took a significant amount of personal effort, resilience and flexibility to generate a positive post-doctoral career path. This can take its toll on individual AHPs and those around them.

Impact on self and support

There was overwhelming recognition that the doctoral experience led to changes in relation to skills and resourcefulness. Evidence amongst respondents suggested that the doctoral experience facilitated positive changes in relation to: critical thinking (100%, n  = 71), research skills (100%, n  = 71), specialist knowledge (97%, n  = 69), fresh perspective (90%, n  = 64), resilience and confidence (83%, n  = 59) and problem solving (93%, n  = 66).

Some participants provided positive personal accounts of undertaking doctoral study and the impact it had for them in terms of satisfaction, confidence and resilience:

“Deciding to undertake my doctorate part-time and remain part-time in clinical practice was the best decision I made.” [POD2] .
“I feel that my doctoral experience has changed the way I think about everything and I continue to be thirsty for further research. I love this feeling! […] I loved it, and would always recommend others to do so for their own benefit, even if it won’t benefit their career.” [RAD5] .

For a few, however, it represented a difficult journey having a negative impact through increased stress, the exertion of considerable effort for little gain, and disruptions to career and family:

“It’s [PhD] hard, requires perseverance and tenacity and no guarantees of anything at the end!” [MISC3] .
“Being a clinical academic is problematic when it comes to stability in posts, equality in promotion, etc. Pursuing this career has resulted in many challenges in gaining recognition, promotion, work-life balance, etc.” [PT7] .
“I would have liked to have had a clinical-research career, but there is no support for this, it’s something I would have to carve out myself, and due to other pressures (family, financial, etc.), I just haven’t felt able to do this.” [SLT1] .

The majority presented a mixed picture of the personal cost and impact, describing both the difficulties and the benefits doctoral study brought and the personal change it produced:

“I have enjoyed the journey immensely and feel it was the right pathway for me. That said, it is tough and maintaining a career in research as an AHP requires not only resilience and perseverance but a willingness and ability to take risks. Job security is still uncertain and as the main breadwinner that is a big concern.” [SLT2] .
“It was the hardest challenge of my life. I’m still recovering and re-orientating as I changed so much during my registration period. Colleagues in clinical settings often don’t appreciate the internal changes a PhD brings which can be frustrating. It’s also not great for work/life balance at all… tough on mental health at times too. I’d absolutely do it again though because of the value it has brought me personally.” [POD4] .

Support, including that from family, was clearly important in facilitating positive personal experiences of doctoral study and positive post-doctoral experiences. Sources of financial support to undertake a doctorate were varied. The most cited form of support was self-funding (25%, n  = 18), typically alongside the use of smaller funds (such as regional HEE, charity and continuing professional development funds) used during study programmes. Employer support (17%, n  = 12) and charitable trusts (23%, n  = 17) were also frequently cited. NIHR funding (including that from Fellowships and CLAHRCs) supported 18% ( n  = 13) of respondents, and higher education institutions were also cited as a significant source of financial support (12%, n  = 9) (missing n  = 2). This reliance upon self-funding may have contributed to almost two-fifths of respondents agreeing or strongly agreeing with the statement: “My doctoral study was a financial risk” .

Beyond family and financial support, employer and colleague support in terms of allowing space and time for study, and in facilitating appropriate research and personal development opportunities, was key:

“My employer supports my development as a clinical academic by allowing me to build research into my new role, supported by ongoing application for research funding to pay for the research portion of my post.” [SLT13] .
“My manager has also initiated discussions about optimising the research (and training) skills I have in terms of a new role.” [MISC8] .

However, such support (as noted in earlier sections) was not always forthcoming in relation to on-going post-doctoral research opportunities, which could be very disheartening:

“I felt well-supported to complete the doctorate itself but I had zero post-PhD career support, including during my first post-doc position. I think this is a real gap for AHPs doing a PhD.” [MISC6] .
“There is a lot of help for clinicians who wants to get into research but there is not much for researchers who need support to return to the clinical practice.” [PT15] .
“I would like to be a clinical academic but this is not a role valued by my Trust or managers. I have had some support from previous managers to use my research skills within my current post, but research is to some degree viewed as a luxury and clinical risk and managerial issues always take priority.” [SLT14] .

These personal accounts of the impact of doctoral experiences on individual participants, and the potential ripple effects of that for departments and organisations, have rarely been explored in previous research. Findings here therefore comprise an original contribution to understanding the lived experience of AHP doctoral study and the pursuit of career pathways combining research and practice.

Organisational benefits

Some participants in this study clearly identified the organisational benefits derived from their completion of doctoral level study. Noteworthy is their articulation of ‘value added’ across all four pillars of practice (namely: professional practice; facilitation of learning; leadership; and evidence, research and development), not solely the research pillar. Echoing the findings of Newington et al. [ 16 ] and the reflections of Cooper et al. [ 20 ], the findings of this study indicate the strong potential for post-doctoral practitioners to actively contribute to, and lead, service improvements, delivery of evidence-based interventions, local workforce development and the building of team and organisational cultures of research engagement. The findings also clearly illustrate the variable nature of departmental and organisational cultures related to research. It is evident that the extent to which research is embraced and embedded as fundamental to the core business of health and care providers has a strong bearing on the extent to which organisations are able to realise the benefits of, and value added by, post-doctoral practitioners.

Where research is valued, and where organisation, service leaders and managers are willing to make the sometimes initially challenging decisions to create time and space to enable research-active practitioners, there is evidence of value to people accessing services, as well as to services, departments and organisations themselves [ 7 , 12 , 13 , 14 , 15 , 16 , 32 ]. The findings from this study highlight that some organisations/departments do very well when it comes to supporting research capacity building and engagement amongst practitioners, and reaping the associated benefits. Some are on a positive journey towards developing and embedding research within practice. While other organisations may appear ambivalent, such apparent inaction is more likely associated with a determined focus on meeting the demands placed on pressurised services. This, coupled with prioritising non-research-related key performance indicators linked to service commissioning, creates a challenging backdrop against which to find the time, or a way, to embed research engagement into service delivery.

National policy imperatives

Notwithstanding the genuine pressures felt by services and organisations, delays in building cultures of research engagement slow and hamper the collective progress required to respond to national policy imperatives. The CQC standards for Well Led Research in NHS Trusts, introduced in 2018, specifically require evidence that research is supported across the breadth of all services [ 33 ]. The NHS Long Term Plan [ 1 ] is a recent illustration, but is by no means the first to emphasise the role of, and need for, practice-based research engagement. Further, as our findings illustrate, organisational failure to enable practice-based research engagement becomes a contributing factor in the attrition of experienced and sometimes senior practitioners from service delivery. The findings from this sample exemplify decisions to move fully into academia, as it presents an environment where post-doctoral knowledge and skills are overtly valued. The apparent reluctance or regret expressed by some who have decided to do so is particularly telling.

The NHS People Plan [ 4 ] emphasises the need to make effective use of the full range of staff skills and experience to deliver the best possible care. It also contains a significant theme related to staff retention, identifying that ‘systems and employers must make greater efforts to design and offer more varied roles to retain our people’ (p46). Employers, line managers and supervisors are called on to ‘create the time and space for training and development … with a renewed emphasis on the importance of flexible skills and building capabilities rather than staying within traditionally-defined roles’ (p36). The findings presented here suggest that there is still some way to go to consistently implementing these approaches for those practitioners with post-doctoral careers.

This study’s findings also identify that it can be difficult for post-doctoral AHPs to find a viable pathway to return to practice, whether entirely or in clinical academic roles. Those who remained in practice often experienced relentless barriers and obstacles to deploying their hard-won knowledge and skills, and many ended up settling for the status quo. This reflects a waste of resource for individuals and for the organisations that backed them financially or with initial protected time, often resulting in disheartened practitioners and missed opportunities for organisations and the people and communities they serve. The wasted resource is amplified by the lack of retention of those who do not accept the status quo: such practitioners and their skills become lost to the organisation that initially supported them. Systems and structures are not consistently working in favour of enabling practitioners to become and remain research active. In some respects, and in some, but certainly not all, instances, systems and cultures appear to be resistant to change.

The ongoing need for enabling infrastructure and systems

The findings of this study echo those of Comer et al. [ 17 ] and Newington et al. [ 16 ] by providing personal insights into the lived experience of research activity being de-prioritised in favour of attending to service delivery pressures. As these pressures show no sign of abating in the near future, any thought of postponing or deferring action to enable research in practice until demands ease seems ill-advised. Post-doctoral practitioners are essential to help identify and implement the changes required to reshape and reorient health and social care services to more effectively meet the changing needs of the population. With an aim of system-wide transformation, it is inefficient to leave the creation of viable roles and clinical academic career pathways to individual creativity and tenacity, or to the efforts of forward-thinking leaders and organisations.

Local, context-specific research capacity building programmes and strategies help to ensure congruence with local research priorities [ 18 ]. Close alignment of these strategies to wider organisational strategic objectives, business planning, quality strategies and audit activities effectively ‘hard wires’ research, and its supporting infrastructure, as core business [ 11 ]. However, the organisational and geographical variability in experience identified by the findings suggests that something more than broad national policy is required to drive consistent and comparable progress in local implementation.

As the findings exemplify, the absence of credible, sustainable, financially viable and equitably accessible career pathways combining research with practice is an ongoing issue for AHPs. It is a long-standing matter that requires urgent attention, not only for AHPs but for all health and care professions beyond medicine [ 4 , 5 , 6 , 20 , 21 ]. A greater level of direction; new systems, structures and infrastructure; and more effective coordination and sharing of good practice may help to accelerate and smooth out progress across the UK. Normalising access to clinical academic career pathways, and normalising an appropriate degree of research engagement for all practitioners, is fundamental to this. What is certain is that repeatedly spotlighting barriers and obstacles, yet failing to take action, will not resolve the issues.

Harnessing the value added by post-doctoral AHPs

The findings of this study illustrate the personal and professional development accruing from doctoral study for individual AHPs. Beyond the more obvious research-related skills, the value it brings includes the development of analytical and critical thinking skills, practice expertise, time management, resilience, negotiating skills, educational skills, job satisfaction, career development/progression, and enhanced professional standing/credibility. As previously indicated, these areas of growth span all four pillars of practice.

As the findings highlight, in a receptive environment, the value of this personal and professional development has the potential to ripple outwards and positively influence colleagues, services, departments, organisations and even professions – all for the ultimate benefit of the people who access services. Post-doctoral AHPs bring enormous value to organisations but, as we have heard from this study’s participants, they are frequently unrecognised and under-utilised. That in itself generates significant ripple effects, this time in the form of missed opportunities and the associated adverse consequences across the system.

Limitations

There are, of course, limitations to this study. The relatively small number of responses, the uneven geographical spread across the UK, and the fact that not all AHP disciplines are represented restrict the opportunity to generalise from the study. Similarly, the convenience and self-selecting nature of the sampling process raises questions about how representative the participants are of AHPs in the UK. However, given the qualitative orientation of this study and its analysis, the aim was to gain a deeper understanding of the significance of participants’ experiences rather than to produce representative and generalisable data. It is for others to assess whether the data presented here, and its interpretation, resonate and are applicable and useful within their own clinical context.

Despite being informed by previous research, the bespoke nature of the questionnaire and its lack of formal validation could mean that questions lacked sensitivity to the complex issues involved in understanding the value of a doctorate for AHPs. However, as noted earlier, the questionnaire was sense-checked and adjusted before use in order to minimise any lack of sensitivity. While the qualitative open-question approach did not permit clarification or probing of responses, this is true of any qualitative survey. Indeed, Braun et al. [ 34 ] demonstrate that online qualitative surveys can deliver rich and nuanced data by offering a higher level of anonymity than other qualitative approaches and by allowing participants to generate considered (rather than immediate) responses at a time convenient to them.

Conclusions

This study offers findings that clearly articulate the variability of experiences of post-doctoral AHPs. There are powerful exemplars that model how benefits can be optimised for the individual practitioner, the service, the organisation and the community it serves. These provide valuable insights to inspire and inform organisations, service leaders and managers with less experience, helping them to move the research-in-practice agenda [ 1 , 2 , 3 , 4 , 24 ] forward in their own contexts.

The challenges, barriers and obstacles to post-doctoral research engagement described by participants reflect those that have been acknowledged for many years across a range of health systems and countries (see, for example, [ 7 , 12 , 17 , 18 , 22 , 23 ]). It is important to acknowledge them, but more important is the need to desist from circling around and revisiting them. Instead, the conversation must move forward and generate positive action.

The need to navigate and mitigate the challenges in order to realise the wide-reaching benefits is fundamental. Reframing perspectives to centre what is to be gained, how it will contribute to enhancing the experiences and outcomes of people accessing services, and what is possible, will help to focus attention on how it can be achieved, one incremental step at a time. There is existing evidence identifying approaches that are productive in this regard, once again including some that are well-established (see, for example, [ 7 , 8 , 11 , 18 , 22 , 23 , 35 ]).

The findings based on the AHP data reported on here demonstrate significant commonalities with our previous findings from nursing and midwifery data [ 21 ]. Given the commonality of the broad systems within which these health and care professionals work, this is unsurprising. Notwithstanding the need to address nuanced differences on a more specific basis, it reinforces the need for urgent, system-wide responses to more effectively, consistently and equitably enable career pathways that combine research and practice for what is a very substantial proportion of the health and care workforce in the UK.

Data availability

The datasets generated and analysed during the current study are not publicly available due to the readily identifiable nature of some participants, given the demographic detail and relative scarcity of some roles mentioned (i.e. there are very few art therapists or music therapists or paramedics with a PhD). Combining this detail with geographical information and job titles could easily identify individuals. However, limited, sufficiently anonymised, data can be made available from the corresponding author on reasonable request.

The umbrella term ‘allied health professions’ encompasses 14 profession groups: art therapy, dietetics, dramatherapy, music therapy, occupational therapy, operating department practice, orthoptics, osteopathy, paramedicine, physiotherapy, podiatry, prosthetics and orthotics, diagnostic and therapeutic radiography and speech and language therapy.

Abbreviations

ACORN: Addressing Organisational Capacity to do Research Network

AHP: Allied Health Profession/al

CLAHRC: Collaboration for Leadership in Applied Health and Care Research

DIET: Dietician/s

MISC: Art therapist/s, Paramedic/s, Music Therapist/s and Orthoptist/s

NHS: National Health Service

NIHR: National Institute for Health and Care Research

OT: Occupational Therapist/s

POD: Podiatrist/s

PT: Physiotherapist/s

RAD: Radiographer/s

SLT: Speech and Language Therapist/s

UK: United Kingdom

Department of Health and Social Care. The NHS Long Term Plan. 2019. Available online. Accessed 21/01/2023.

Department of Health and Social Care. The Future of Clinical Research Delivery: 2022 to 2025 implementation plan. 2022. Available online. Accessed 01/02/2023.

National Health Service England. The Allied Health Professions (AHPs) strategy for England – AHPs Deliver. 2022. Available online. Accessed 21/01/2023.

National Health Service England. We are the NHS: People Plan for 2020/21 – action for us all. 2020. Available online. Accessed 21/01/2023.

Jones D, Keenan A-M. The rise and rise of NMAHPs in UK clinical research. Future Healthc J. 2021;8(2):e195–7. https://doi.org/10.7861/fhj.2021-0098 .


Manley K, Crouch R, Ward R, Clift E, Jackson C, Christie J, Williams H, Harden B. The role of the multi- professional consultant practitioner in supporting workforce transformation in the UK. Adv J Prof Pract. 2022;3(2):1–26. https://doi.org/10.22024/UniKent/03/ajpp.1057 .

Matus J, Walker A, Mickan S. Research capacity building frameworks for allied health professionals: a systematic review. BMC Health Serv Res. 2018;18:716. https://doi.org/10.1186/s12913-018-3518-7 .


Slade S, Philip K, Morris M. Frameworks for embedding a research culture in allied health practice: a rapid review. Health Res Policy Syst. 2018;16:29. https://doi.org/10.1186/s12961-018-0304-2 .

Carrick-Sen D, Moore A, Davidson P, Gendong H, Jackson D. International perspectives of nurses, midwives and allied health professionals clinical academic roles: are we at tipping point? Int J Practice-based Learn Health Social Care. 2019;7(2):1–15. https://doi.org/10.18552/ijpblhsc.v7i2.639 .

Harris J, Grafton K, Cooke J. Developing a consolidated research framework for clinical allied health professionals practicing in the UK. BMC Health Serv Res. 2020;20:852. https://doi.org/10.1186/s12913-020-05650-3 .

Gee M, Cooke J. How do NHS organisations plan research capacity development? Strategies, strengths and opportunities for improvement. BMC Health Serv Res. 2018;18:198. https://doi.org/10.1186/s12913-018-2992-2 .

Boaz A, Hanney S, Jones T, Soper B. Does the engagement of clinicians and organisations in research improve healthcare performance: a three-stage review. BMJ Open. 2015;5:e009415. https://doi.org/10.1136/bmjopen-2015-009415 .

Ozdemir B, Karthikesalingam A, Sinha S, Poloniecki J, Hinchliffe J, Thompson M, Gower J, Boaz A, Holt P. Research activity and the association with mortality. PLoS ONE. 2015;10(2):e0118253. https://doi.org/10.1371/journal.pone.0118253 .

Jonker L, Fisher S. The correlation between National Health Service trusts’ clinical trial activity and both mortality rates and care quality commission ratings: a retrospective cross-sectional study. Public Health. 2018;157:1–6. https://doi.org/10.1016/j.puhe.2017.12.022.

Jonker L, Fisher S, Dagnan D. Patients admitted to more research-active hospitals have more confidence in staff and are better informed about their condition and medication: results from a retrospective cross-sectional study. J Eval Clin Pract. 2019;26:203–8. https://doi.org/10.1111/jep.13118 .

Newington L, Wells M, Adonis A, Bolton L, Bolton Saghdaoui L, Coffery M, Crow J, Fadeeva Costa O, Hughes C, Savage M, Shahabi L, Alexander C. A qualitative systematic review and thematic synthesis exploring the impacts of clinical academic activity by healthcare professionals outside medicine. BMC Health Serv Res. 2021;21:400. https://doi.org/10.1186/s12913-021-06354-y .

Comer C, Collings R, McCracken A, Payne C, Moore A. AHPs’ perceptions of research in the UK NHS: a survey of research capacity and culture. BMC Health Serv Res. 2022;22:1094. https://doi.org/10.1186/s12913-022-08465-6.

Cordrey T, King E, Pilkington E, Gore K, Gustafson O. Exploring research capacity and culture of allied health professionals: a mixed methods evaluation. BMC Health Serv Res. 2022;22:85. https://doi.org/10.1186/s12913-022-07480-x.

Baltruks D, Callaghan P. Nursing, midwifery and allied health clinical academic research careers in the UK. London: Council of Deans of Health; 2018.

Cooper J, Mitchell K, Richardson A, Bramley L. Developing the role of the clinical academic nurse, midwife and allied health professional in healthcare organisations. Int J Practice-Based Learn Health Social Care. 2019;7(2):16–24. https://doi.org/10.18552/ijpblhsc.v7i2.637 .

Hampshaw S, Cooke J, Robertson S, Wood E, Tod A, King R. Understanding the value of a PhD for post-doctoral registered UK nurses: a cross-sectional survey. J Nurs Manag. 2022;30(4):1011–7. https://doi.org/10.1111/jonm.13581.

Borkowski D, McKinstry C, Cotchett M, Williams C, Haines T. Research culture in allied health: a systematic review. Aust J Prim Health. 2016;22(4):294–303. https://doi.org/10.1071/PY15122.

Marjanovic S, Ball S, Harshfield A, Dimova S, Prideaux R, Carpenter A, Punch D, Simmons R. Involving NHS staff in research. Cambridge: The Healthcare Improvement Studies Institute; 2019.

Health Education England. Allied Health Professions’ Research and Innovation Strategy for England. London: Health Education England; 2022.

Seale C. Philosophy, politics and values. In: Seale C, editor. Researching society and culture. 4th ed. London: Sage; 2018. pp. 9–25.

Braun V, Clarke V, Gray D, editors. Collecting qualitative data: a practical guide to textual, media and virtual techniques. Cambridge, UK: Cambridge University Press; 2017.

O’Brien B, Harris I, Beckman T, Reed D, Cook D. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014;89(9):1245–51.

Diamond A, Ball C, Vorley T, Hughes T, Howe P, Nathwani T. The impact of doctoral careers: final report. 2014.

Wilkes L, Cummings J, Ratanapongleka M, Carter B. Doctoral theses in nursing and midwifery: challenging their contribution to nursing scholarship and the profession. Australian J Adv Nurs. 2015;32(4):6–14.

Bryan B, Guccione K. Was it worth it? A qualitative exploration into graduate perceptions of doctoral value. High Educ Res Dev. 2018;37(6):1124–40. https://doi.org/10.1080/07294360.2018.1479378 .

Shakir M, ur Rahman A. Conducting pilot study in a qualitative inquiry: learning some useful lessons. J Posit Sch Psychol. 2022;6(10):1620–4.

Chalmers S, Hill J, Connell L, Ackerley S, Kulkarni A, Roddam H. The value of allied health professional research engagement on healthcare performance: a systematic review. BMC Health Serv Res. 2023;23:766. https://doi.org/10.1186/s12913-023-09555-9 .

National Institute for Health and Care Research. CQC inspections to give more exposure to clinical research taking place in NHS trusts. National Institute for Health and Care Research. 2019. https://www.nihr.ac.uk/news/cqc-inspections-to-give-more-exposure-to-clinical-research-taking-place-in-nhs-trusts/20352 [Accessed 29.03.2023].

Braun V, Clarke V, Boulton E, Davey L, McEvoy C. The online survey as a qualitative research tool. Int J Soc Res Methodol. 2021;24(6):641–54. https://doi.org/10.1080/13645579.2020.1805550.

Westwood G, Richardson A, Latter S, Macleod Clark J, Fader M. Building clinical academic leadership capacity: sustainability through partnership. J Res Nurs. 2018;23(4):346–57. https://doi.org/10.1177/1744987117748348 .

Acknowledgements

Not applicable.

Funding

This study was funded through the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) Yorkshire and Humber; the AHP-specific data analysis and writing were funded by Health Education England.

Author information

Authors and Affiliations

Dr Jo Watson Consulting Ltd., Hampshire, UK

Division of Nursing and Midwifery, Health Sciences School, University of Sheffield, Sheffield, UK

Steven Robertson, Tony Ryan & Jo Cooke

School of Health and Related Research, University of Sheffield, Sheffield, UK

NIHR Health Determinants Research Collaboration, Doncaster, UK

Susan Hampshaw

Health Education England, Manchester, UK

Hazel Roddam

Contributions

JW – Contributed to data interpretation, led the preparation and revisions of the manuscript, specifically leading on background, discussion and conclusion. SR – Led on the qualitative data analysis, co-led on the interpretation, and contributed to the preparation and revision of the manuscript, specifically leading on method, results and limitations. TR – Led on the quantitative analysis, contributed to the qualitative analysis and interpretation, and contributed to revisions of the manuscript. EW – Completed initial data management and early analysis of the quantitative data and contributed to revisions of the manuscript. JC – Co-led the conception and design of the study, acquisition of data, interpretation of data, and contributed to revisions of the manuscript. SH – Co-led the conception and design of the study, acquisition of data, interpretation of data, and contributed to revisions of the manuscript. HR – Contributed to data interpretation, brought together the writing team and contributed to planning and revisions of the manuscript.

Corresponding author

Correspondence to Jo Watson.

Ethics declarations

Ethics approval and consent to participate

Ethics approval was obtained from the University of Sheffield ethics committee (Ref: 023667). Informed consent was obtained from all participants in the study.

Consent for publication

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Watson, J., Robertson, S., Ryan, T. et al. Understanding the value of a doctorate for allied health professionals in practice in the UK: a survey. BMC Health Serv Res 24, 566 (2024). https://doi.org/10.1186/s12913-024-11035-7

Received: 09 May 2023

Accepted: 23 April 2024

Published: 02 May 2024

DOI: https://doi.org/10.1186/s12913-024-11035-7

Keywords

  • Allied health professions
  • Research capacity
  • Research capability building
  • Research culture
  • Service improvement
  • Workforce development

BMC Health Services Research

ISSN: 1472-6963
