

Original Research Article

Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation


  • 1 Lynch School of Education and Human Development, Boston College, Chestnut Hill, MA, United States
  • 2 Graduate School of Education, Stanford University, Stanford, CA, United States
  • 3 Department of Business and Economics Education, Johannes Gutenberg University, Mainz, Germany

Enhancing students’ critical thinking (CT) skills is an essential goal of higher education. This article presents a systematic approach to conceptualizing and measuring CT. CT generally comprises the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion. We further posit that CT also involves dealing with dilemmas involving ambiguity or conflicts among principles and contradictory information. We argue that performance assessment provides the most realistic—and most credible—approach to measuring CT. From this conceptualization and construct definition, we describe one possible framework for building performance assessments of CT, with attention to extended performance tasks within the assessment system. The framework is a product of an ongoing, collaborative effort, the International Performance Assessment of Learning (iPAL). The framework comprises four main aspects: (1) The storyline describes a carefully curated version of a complex, real-world situation. (2) The challenge frames the task to be accomplished. (3) A portfolio of documents in a range of formats is drawn from multiple sources chosen to have specific characteristics. (4) The scoring rubric comprises a set of scales, each linked to a facet of the construct. We discuss a number of use cases, as well as the challenges that arise with the use and valid interpretation of performance assessments. The final section presents elements of the iPAL research program, which involve various refinements and extensions of the assessment framework, a number of empirical studies, and linkages to current work in online reading and information processing.

Introduction

In their mission statements, most colleges declare that a principal goal is to develop students’ higher-order cognitive skills such as critical thinking (CT) and reasoning (e.g., Shavelson, 2010 ; Hyytinen et al., 2019 ). The importance of CT is echoed by business leaders ( Association of American Colleges and Universities [AACU], 2018 ), as well as by college faculty (for curricular analyses in Germany, see e.g., Zlatkin-Troitschanskaia et al., 2018 ). Indeed, in the 2019 administration of the Faculty Survey of Student Engagement (FSSE), 93% of faculty reported that they “very much” or “quite a bit” structure their courses to support student development with respect to thinking critically and analytically. In a listing of 21st century skills, CT was the most highly ranked among FSSE respondents ( Indiana University, 2019 ). Nevertheless, there is considerable evidence that many college students do not develop these skills to a satisfactory standard ( Arum and Roksa, 2011 ; Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al., 2019 ). This state of affairs represents a serious challenge to higher education – and to society at large.

In view of the importance of CT, as well as evidence of substantial variation in its development during college, its proper measurement is essential to tracking progress in skill development and to providing useful feedback to both teachers and learners. Feedback can help focus students’ attention on key skill areas in need of improvement, and provide insight to teachers on choices of pedagogical strategies and time allocation. Moreover, comparative studies at the program and institutional level can inform higher education leaders and policy makers.

The conceptualization and definition of CT presented here is closely related to models of information processing and online reasoning, the skills that are the focus of this special issue. These two skills are especially germane to the learning environments that college students experience today when much of their academic work is done online. Ideally, students should be capable of more than naïve Internet search, followed by copy-and-paste (e.g., McGrew et al., 2017 ); rather, for example, they should be able to critically evaluate both sources of evidence and the quality of the evidence itself in light of a given purpose ( Leu et al., 2020 ).

In this paper, we present a systematic approach to conceptualizing CT. From that conceptualization and construct definition, we present one possible framework for building performance assessments of CT with particular attention to extended performance tasks within the test environment. The penultimate section discusses some of the challenges that arise with the use and valid interpretation of performance assessment scores. We conclude the paper with a section on future perspectives in an emerging field of research – the iPAL program.

Conceptual Foundations, Definition and Measurement of Critical Thinking

In this section, we briefly review the concept of CT and its definition. In accordance with the principles of evidence-centered design (ECD; Mislevy et al., 2003), the conceptualization drives the measurement of the construct; that is, implementation of ECD directly links aspects of the assessment framework to specific facets of the construct. We then argue that performance assessments designed in accordance with such an assessment framework provide the most realistic—and most credible—approach to measuring CT. The section concludes with a sketch of an approach to CT measurement grounded in performance assessment.

Concept and Definition of Critical Thinking

Taxonomies of 21st century skills ( Pellegrino and Hilton, 2012 ) abound, and it is neither surprising that CT appears in most taxonomies of learning, nor that there are many different approaches to defining and operationalizing the construct of CT. There is, however, general agreement that CT is a multifaceted construct ( Liu et al., 2014 ). Liu et al. (2014) identified five key facets of CT: (i) evaluating evidence and the use of evidence; (ii) analyzing arguments; (iii) understanding implications and consequences; (iv) developing sound arguments; and (v) understanding causation and explanation.

There is empirical support for these facets from college faculty. A 2016–2017 survey conducted by the Higher Education Research Institute (HERI) at the University of California, Los Angeles found that a substantial majority of faculty respondents “frequently” encouraged students to: (i) evaluate the quality or reliability of the information they receive; (ii) recognize biases that affect their thinking; (iii) analyze multiple sources of information before coming to a conclusion; and (iv) support their opinions with a logical argument ( Stolzenberg et al., 2019 ).

There is general agreement that CT involves the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion (e.g., Erwin and Sebrell, 2003 ; Kosslyn and Nelson, 2017 ; Shavelson et al., 2018 ). We further suggest that CT includes dealing with dilemmas of ambiguity or conflict among principles and contradictory information ( Oser and Biedermann, 2020 ).

Importantly, Oser and Biedermann (2020) posit that CT can be manifested at three levels. The first level, Critical Analysis , is the most complex of the three levels. Critical Analysis requires both knowledge in a specific discipline (conceptual) and procedural analytical (deduction, inclusion, etc.) knowledge. The second level is Critical Reflection , which involves more generic skills “… necessary for every responsible member of a society” (p. 90). It is “a basic attitude that must be taken into consideration if (new) information is questioned to be true or false, reliable or not reliable, moral or immoral etc.” (p. 90). To engage in Critical Reflection, one needs not only apply analytic reasoning, but also adopt a reflective stance toward the political, social, and other consequences of choosing a course of action. It also involves analyzing the potential motives of various actors involved in the dilemma of interest. The third level, Critical Alertness , involves questioning one’s own or others’ thinking from a skeptical point of view.

Wheeler and Haertel (1993) categorized higher-order skills, such as CT, into two types: (i) those employed when solving problems and making decisions in professional and everyday life, for instance, in relation to civic affairs and the environment; and (ii) those developed through formal instruction, usually in a discipline, involving various mental processes (e.g., comparing, evaluating, and justifying). Hence, in both settings, individuals must confront situations that typically involve a problematic event, contradictory information, and possibly conflicting principles. Indeed, there is an ongoing debate concerning whether CT should be evaluated using generic or discipline-based assessments (Nagel et al., 2020). Whether CT skills are conceptualized as generic or discipline-specific has implications for how they are assessed and how they are incorporated into the classroom.

In the iPAL project, CT is characterized as a multifaceted construct that comprises conceptualizing, analyzing, drawing inferences or synthesizing information, evaluating claims, and applying the results of these reasoning processes to various purposes (e.g., solve a problem, decide on a course of action, find an answer to a given question or reach a conclusion) ( Shavelson et al., 2019 ). In the course of carrying out a CT task, an individual typically engages in activities such as specifying or clarifying a problem; deciding what information is relevant to the problem; evaluating the trustworthiness of information; avoiding judgmental errors based on “fast thinking”; avoiding biases and stereotypes; recognizing different perspectives and how they can reframe a situation; considering the consequences of alternative courses of actions; and communicating clearly and concisely decisions and actions. The order in which activities are carried out can vary among individuals and the processes can be non-linear and reciprocal.

In this article, we focus on generic CT skills. The importance of these skills derives not only from their utility in academic and professional settings, but also from the many situations involving challenging moral and ethical issues – often framed in terms of conflicting principles and/or interests – to which individuals have to apply these skills (Kegan, 1994; Tessier-Lavigne, 2020). Conflicts and dilemmas are ubiquitous in the contexts in which adults find themselves: work, family, and civil society. Moreover, to remain viable in the global economic environment – one characterized by increased competition and advances in second-generation artificial intelligence (AI) – today’s college students will need to continually develop and leverage their CT skills. Ideally, colleges offer a supportive environment in which students can develop and practice effective approaches to reasoning about, and acting in, learning, professional, and everyday situations.

Measurement of Critical Thinking

Critical thinking is a multifaceted construct that poses many challenges to those who would develop relevant and valid assessments. For those interested in current approaches to the measurement of CT that are not the focus of this paper, consult Zlatkin-Troitschanskaia et al. (2018) .

In this paper, we have singled out performance assessment because it offers important advantages for measuring CT. Extant tests of CT typically employ response formats such as forced-choice or short-answer and scenario-based tasks (for an overview, see Liu et al., 2014). They all suffer from moderate to severe construct underrepresentation; that is, they fail to capture important facets of the CT construct, such as perspective taking and communication. High-fidelity performance tasks are viewed as more authentic in that they provide a problem context and require responses that are more similar to what individuals confront in the real world than what is offered by traditional multiple-choice items (Messick, 1994; Braun, 2019). This greater verisimilitude promises higher levels of construct representation and lower levels of construct-irrelevant variance. Such performance tasks have the capacity to measure facets of CT that are imperfectly assessed, if at all, using traditional assessments (Lane and Stone, 2006; Braun, 2019; Shavelson et al., 2019). However, these assertions must be empirically validated, and the measures must be subjected to psychometric analyses. Evidence bearing on the reliability, validity, and interpretative challenges of performance assessment (PA) is extensively detailed in Davey et al. (2015).

We adopt the following definition of performance assessment:

A performance assessment (sometimes called a work sample when assessing job performance) … is an activity or set of activities that requires test takers, either individually or in groups, to generate products or performances in response to a complex, most often real-world task. These products and performances provide observable evidence bearing on test takers’ knowledge, skills, and abilities—their competencies—in completing the assessment ( Davey et al., 2015 , p. 10).

A performance assessment typically includes an extended performance task and short constructed-response and selected-response (i.e., multiple-choice) tasks (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019). In this paper, we refer to both individual performance- and constructed-response tasks as performance tasks (PTs; for an example, see Table 1 in the section “iPAL Assessment Framework”).


Table 1. The iPAL assessment framework.

An Approach to Performance Assessment of Critical Thinking: The iPAL Program

The approach to CT presented here is the result of ongoing work undertaken by the International Performance Assessment of Learning collaborative (iPAL). iPAL is an international consortium of volunteers, primarily from academia, who have come together to address the dearth of research and practice in measuring CT with performance tasks in higher education (Shavelson et al., 2018). In this section, we present iPAL’s assessment framework as the basis for measuring CT, with examples along the way.

iPAL Background

The iPAL assessment framework builds on the Council for Aid to Education’s Collegiate Learning Assessment (CLA). The CLA was designed to measure cross-disciplinary, generic competencies, such as CT, analytic reasoning, problem solving, and written communication (Klein et al., 2007; Shavelson, 2010). Ideally, each PA contained an extended PT (e.g., examining a range of evidential materials related to the crash of an aircraft) and two short PTs: one in which students critique an argument and one in which they propose a solution to a real-world societal issue.

Motivated by considerations of adequate reliability, the CLA was modified in 2012 to create the CLA+. The CLA+ includes two subtests: a PT and a 25-item Selected Response Question (SRQ) section. The PT presents a document or problem statement and an assignment based on that document which elicits an open-ended response. The CLA+ added the SRQ section (which is not linked substantively to the PT scenario) to increase the number of student responses and thereby obtain more reliable estimates of performance at the student level than could be achieved with a single PT (Zahner, 2013; Davey et al., 2015).

iPAL Assessment Framework

Methodological Foundations

The iPAL framework evolved from the Collegiate Learning Assessment developed by Klein et al. (2007) . It was also informed by the results from the AHELO pilot study ( Organisation for Economic Co-operation and Development [OECD], 2012 , 2013 ), as well as the KoKoHs research program in Germany (for an overview see, Zlatkin-Troitschanskaia et al., 2017 , 2020 ). The ongoing refinement of the iPAL framework has been guided in part by the principles of Evidence Centered Design (ECD) ( Mislevy et al., 2003 ; Mislevy and Haertel, 2006 ; Haertel and Fujii, 2017 ).

In educational measurement, an assessment framework plays a critical intermediary role between the theoretical formulation of the construct and the development of the assessment instrument containing tasks (or items) intended to elicit evidence with respect to that construct (Mislevy et al., 2003). Builders of the assessment framework draw on the construct theory and operationalize it in a way that provides explicit guidance to PT developers. Thus, the framework should reflect the relevant facets of the construct, where relevance is determined by substantive theory or an appropriate alternative, such as behavioral samples from real-world situations of interest (criterion-sampling; McClelland, 1973), as well as the intended use(s) (for an example, see Shavelson et al., 2019). By following the requirements and guidelines embodied in the framework, instrument developers strengthen the claim of construct validity for the instrument (Messick, 1994).

An assessment framework can be specified at different levels of granularity: an assessment battery (“omnibus” assessment, for an example see below), a single performance task, or a specific component of an assessment ( Shavelson, 2010 ; Davey et al., 2015 ). In the iPAL program, a performance assessment comprises one or more extended performance tasks and additional selected-response and short constructed-response items. The focus of the framework specified below is on a single PT intended to elicit evidence with respect to some facets of CT, such as the evaluation of the trustworthiness of the documents provided and the capacity to address conflicts of principles.

From the ECD perspective, an assessment is an instrument for generating information to support an evidentiary argument and, therefore, the intended inferences (claims) must guide each stage of the design process. The construct of interest is operationalized through the Student Model , which represents the target knowledge, skills, and abilities, as well as the relationships among them. The student model should also make explicit the assumptions regarding student competencies in foundational skills or content knowledge. The Task Model specifies the features of the problems or items posed to the respondent, with the goal of eliciting the evidence desired. The assessment framework also describes the collection of task models comprising the instrument, with considerations of construct validity, various psychometric characteristics (e.g., reliability) and practical constraints (e.g., testing time and cost). The student model provides grounds for evidence of validity, especially cognitive validity; namely, that the students are thinking critically in responding to the task(s).

In the present context, the target construct (CT) is the competence of individuals to think critically, which entails solving complex, real-world problems, and clearly communicating their conclusions or recommendations for action based on trustworthy, relevant and unbiased information. The situations, drawn from actual events, are challenging and may arise in many possible settings. In contrast to more reductionist approaches to assessment development, the iPAL approach and framework rests on the assumption that properly addressing these situational demands requires the application of a constellation of CT skills appropriate to the particular task presented (e.g., Shavelson, 2010 , 2013 ). For a PT, the assessment framework must also specify the rubric by which the responses will be evaluated. The rubric must be properly linked to the target construct so that the resulting score profile constitutes evidence that is both relevant and interpretable in terms of the student model (for an example, see Zlatkin-Troitschanskaia et al., 2019 ).

iPAL Task Framework

The iPAL ‘omnibus’ framework comprises four main aspects: a storyline, a challenge, a document library, and a scoring rubric. Table 1 displays these aspects, brief descriptions of each, and corresponding examples drawn from an iPAL performance assessment (version adapted from the original in Hyytinen and Toom, 2019). Storylines are drawn from various domains; for example, the worlds of business, public policy, civics, medicine, and family. They often involve moral and/or ethical considerations. Deriving an appropriate storyline from a real-world situation requires careful consideration of which features are to be kept in toto, which adapted for purposes of the assessment, and which discarded. Framing the challenge demands care in wording so that there is minimal ambiguity in what is required of the respondent. The difficulty of the challenge depends, in large part, on the nature and extent of the information provided in the document library, the amount of scaffolding included, and the scope of the required response. The amount of information and the scope of the challenge should be commensurate with the amount of time available. As is evident from the table, the characteristics of the documents in the library are intended to elicit responses related to facets of CT. For example, with regard to bias, the information provided is intended to play on judgmental errors due to fast thinking and/or motivated reasoning. Ideally, the situation should accommodate multiple solutions of varying degrees of merit.

The dimensions of the scoring rubric are derived from the Task Model and Student Model ( Mislevy et al., 2003 ) and signal which features are to be extracted from the response and indicate how they are to be evaluated. There should be a direct link between the evaluation of the evidence and the claims that are made with respect to the key features of the task model and student model . More specifically, the task model specifies the various manipulations embodied in the PA and so informs scoring, while the student model specifies the capacities students employ in more or less effectively responding to the tasks. The score scales for each of the five facets of CT (see section “Concept and Definition of Critical Thinking”) can be specified using appropriate behavioral anchors (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019 ). Of particular importance is the evaluation of the response with respect to the last dimension of the scoring rubric; namely, the overall coherence and persuasiveness of the argument, building on the explicit or implicit characteristics related to the first five dimensions. The scoring process must be monitored carefully to ensure that (trained) raters are judging each response based on the same types of features and evaluation criteria ( Braun, 2019 ) as indicated by interrater agreement coefficients.
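The monitoring of rater consistency described above can be illustrated with a minimal sketch. The ratings below are invented for demonstration, and Cohen's kappa is chosen here only as one familiar chance-corrected agreement coefficient; it is not necessarily the coefficient used in the iPAL studies, which report generalizability coefficients.

```python
# Illustrative check of agreement between two trained raters who scored the
# same set of responses on an ordinal rubric scale. The rating data are
# hypothetical; Cohen's kappa corrects raw percent agreement for the
# agreement expected by chance alone.

def cohen_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on the same items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed proportion of exact agreements.
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Agreement expected by chance, from each rater's marginal distribution.
    p_exp = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical scores from two raters on eight responses (1-6 scale).
rater_1 = [4, 2, 5, 4, 1, 3, 6, 2]
rater_2 = [4, 2, 5, 3, 1, 3, 6, 2]
print(round(cohen_kappa(rater_1, rater_2), 3))  # prints 0.849
```

In operational scoring, an agreement coefficient computed this way (or an intraclass correlation for ordinal scales) would be tracked per rubric dimension, with raters retrained when agreement falls below a preset threshold.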

The scoring rubric of the iPAL omnibus framework can be modified for specific tasks ( Lane and Stone, 2006 ). This generic rubric helps ensure consistency across rubrics for different storylines. For example, Zlatkin-Troitschanskaia et al. (2019 , p. 473) used the following scoring scheme:

Based on our construct definition of CT and its four dimensions: (D1-Info) recognizing and evaluating information, (D2-Decision) recognizing and evaluating arguments and making decisions, (D3-Conseq) recognizing and evaluating the consequences of decisions, and (D4-Writing), we developed a corresponding analytic dimensional scoring … The students’ performance is evaluated along the four dimensions, which in turn are subdivided into a total of 23 indicators as (sub)categories of CT … For each dimension, we sought detailed evidence in students’ responses for the indicators and scored them on a six-point Likert-type scale. In order to reduce judgment distortions, an elaborate procedure of ‘behaviorally anchored rating scales’ (Smith and Kendall, 1963) was applied by assigning concrete behavioral expectations to certain scale points (Bernardin et al., 1976). To this end, we defined the scale levels by short descriptions of typical behavior and anchored them with concrete examples. … We trained four raters in 1 day using a specially developed training course to evaluate students’ performance along the 23 indicators clustered into four dimensions (for a description of the rater training, see Klotzer, 2018).

Shavelson et al. (2019) examined the interrater agreement of the scoring scheme developed by Zlatkin-Troitschanskaia et al. (2019) and “found that with 23 items and 2 raters the generalizability (“reliability”) coefficient for total scores to be 0.74 (with 4 raters, 0.84)” (Shavelson et al., 2019, p. 15). In the study by Zlatkin-Troitschanskaia et al. (2019, p. 478), three student score profiles (low-, middle-, and high-performers) were identified. Proper interpretation of such profiles requires care. For example, there may be multiple possible explanations for low scores, such as poor CT skills, a lack of disposition to engage with the challenge, or the two attributes jointly. These alternative explanations for student performance can potentially pose a threat to the evidentiary argument. In this case, auxiliary information may be available to aid in resolving the ambiguity. For example, student responses to selected- and short constructed-response items in the PA can provide relevant information about the levels of the different skills possessed by the student. When sufficient data are available, the scores can be modeled statistically and/or qualitatively in such a way as to bring them to bear on the technical quality or interpretability of the claims of the assessment: reliability, validity, and utility evidence (Davey et al., 2015; Zlatkin-Troitschanskaia et al., 2019). These kinds of concerns are less critical when PTs are used in classroom settings, where the instructor can draw on other sources of evidence, including direct discussion with the student.
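The relationship between the number of raters and score reliability quoted above follows the familiar Spearman-Brown pattern. The sketch below treats the reported two-rater coefficient as if it obeyed the Spearman-Brown prophecy formula exactly; this is a simplifying assumption for illustration (a full generalizability analysis models rater and task variance components separately), not a reproduction of Shavelson et al.'s computation.

```python
# Rough illustration of how score reliability grows with the number of
# raters, via the Spearman-Brown prophecy formula. Applying it to a
# generalizability coefficient is an approximation: G-theory decomposes
# error variance more finely than classical test theory does.

def spearman_brown(reliability_k, k_old, k_new):
    """Project reliability observed with k_old raters to k_new raters."""
    # Recover the implied single-rater reliability...
    r1 = reliability_k / (k_old - (k_old - 1) * reliability_k)
    # ...then project it to k_new raters.
    return k_new * r1 / (1 + (k_new - 1) * r1)

# Projecting the reported 2-rater coefficient of 0.74 to 4 raters yields a
# value close to the 0.84 reported above for 4 raters.
print(round(spearman_brown(0.74, 2, 4), 2))  # prints 0.85
```

That the projection (about 0.85) lands near the reported four-rater figure of 0.84 suggests rater sampling is the dominant source of measurement error in these scores; the small gap reflects the additional variance components a generalizability analysis takes into account.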

Use of iPAL Performance Assessments in Educational Practice: Evidence From Preliminary Validation Studies

The assessment framework described here supports the development of a PT in a general setting. Many modifications are possible and, indeed, desirable. If the PT is to be more deeply embedded in a certain discipline (e.g., economics, law, or medicine), for example, then the framework must specify characteristics of the narrative and the complementary documents as to the breadth and depth of disciplinary knowledge that is represented.

To date, preliminary field trials employing the omnibus framework (i.e., a full set of documents) have indicated that 60 min is generally an inadequate amount of time for students to engage with the full set of complementary documents and to craft a complete response to the challenge (for an example, see Shavelson et al., 2019). Accordingly, it would be helpful to develop modified frameworks for PTs that require substantially less time; for an example, see the short performance assessments of civic online reasoning, which require response times from 10 to 50 min (Wineburg et al., 2016). Such assessment frameworks could be derived from the omnibus framework by focusing on a reduced number of facets of CT and specifying the characteristics of the complementary documents to be included – or, perhaps, choices among sets of documents. In principle, one could build a ‘family’ of PTs, each using the same (or nearly the same) storyline and a subset of the full collection of complementary documents.

Paul and Elder (2007) argue that the goal of CT assessments should be to provide faculty with important information about how well their instruction supports the development of students’ CT. In that spirit, the full family of PTs could represent all facets of the construct while affording instructors and students more specific insights on strengths and weaknesses with respect to particular facets of CT. Moreover, the framework should be expanded to include the design of a set of short-answer and/or multiple-choice items to accompany the PT. Ideally, these additional items would be based on the same narrative as the PT to collect more nuanced information on students’ precursor skills, such as reading comprehension, while enhancing the overall reliability of the assessment. Areas where students are under-prepared could be addressed before, or even in parallel with, the development of the focal CT skills; the parallel approach follows the co-requisite model of developmental education. In other settings (e.g., for summative assessment), these complementary items would be administered after the PT to augment the evidence in relation to the various claims. The full PT, taking 90 min or more, could serve as a capstone assessment.

As we transition from simply delivering paper-based assessments by computer to taking full advantage of the affordances of a digital platform, we should learn from the hard-won lessons of the past so that we can make swifter progress with fewer missteps. In that regard, we must take validity as the touchstone – assessment design, development and deployment must all be tightly linked to the operational definition of the CT construct. Considerations of reliability and practicality come into play with various use cases that highlight different purposes for the assessment (for future perspectives, see next section).

The iPAL assessment framework represents a feasible compromise between commercial, standardized assessments of CT (e.g., Liu et al., 2014), on the one hand, and, on the other, freedom for individual faculty to develop assessment tasks according to idiosyncratic models. It imposes a degree of standardization on both task development and scoring, while still allowing faculty some flexibility to tailor the assessment to meet their unique needs. In so doing, it addresses a key weakness of the AAC&U’s VALUE initiative, which has achieved wide acceptance among United States colleges.

The VALUE initiative has produced generic scoring rubrics for 15 domains, including CT, problem-solving, and written communication. A rubric for a particular skill domain (e.g., critical thinking) has five to six dimensions with four ordered performance levels for each dimension (1 = lowest, 4 = highest). The performance levels are accompanied by language that is intended to clearly differentiate among levels. Faculty are asked to submit student work products from a senior-level course that is intended to yield evidence with respect to student learning outcomes in a particular domain and that, they believe, can elicit performances at the highest level. The collection of work products is then graded by faculty from other institutions who have been trained to apply the rubrics.

A principal difficulty is that there is neither a common framework to guide the design of the challenge, nor any control on task complexity and difficulty. Consequently, there is substantial heterogeneity in the quality and evidential value of the submitted responses. This also causes difficulties with task scoring and inter-rater reliability. Shavelson et al. (2009) discuss some of the problems arising with non-standardized collections of student work.

In this context, one advantage of the iPAL framework is that it can provide valuable guidance and an explicit structure for faculty in developing performance tasks for both instruction and formative assessment. When faculty design assessments, their focus is typically on content coverage rather than other potentially important characteristics, such as the degree of construct representation and the adequacy of their scoring procedures ( Braun, 2019 ).

Concluding Reflections

Challenges to Interpretation and Implementation

Performance tasks such as those generated by iPAL are attractive instruments for assessing CT skills (e.g., Shavelson, 2010 ; Shavelson et al., 2019 ). The attraction mainly rests on the assumption that elaborated PT's are more authentic (direct) and more completely capture facets of the target construct (i.e., possess greater construct representation) than the widely used selected-response tests. However, as Messick (1994) noted, authenticity is a "promissory note" that must be redeemed with empirical research. In practice, there are trade-offs among authenticity, construct validity, and psychometric qualities such as reliability ( Davey et al., 2015 ).

One reason for Messick's (1994) caution is that authenticity does not guarantee construct validity. The latter must be established by drawing on multiple sources of evidence ( American Educational Research Association et al., 2014 ). Following the ECD principles in designing and developing the PT, as well as the associated scoring rubrics, constitutes an important type of evidence. Further, as Leighton (2019) argues, response process data ("cognitive validity") are needed to validate claims regarding the cognitive complexity of PT's. Relevant data can be obtained through cognitive laboratory studies involving methods such as think-aloud protocols or eye-tracking. Although time-consuming and expensive, such studies can yield not only evidence of validity, but also valuable information to guide refinements of the PT.

Going forward, iPAL PT's must be subjected to validation studies as recommended in the Standards for Educational and Psychological Testing ( American Educational Research Association et al., 2014 ). With a particular focus on the criterion "relationships to other variables," a framework should include assumptions about the theoretically expected relationships among the indicators assessed by the PT, as well as the indicators' relationships to external variables such as intelligence or prior (task-relevant) knowledge.

Complementing the necessity of evaluating construct validity, there is the need to consider potential sources of construct-irrelevant variance (CIV). One pertains to student motivation, which is typically greater when the stakes are higher. If students are not motivated, then their performance is likely to be impacted by factors unrelated to their (construct-relevant) ability ( Lane and Stone, 2006 ; Braun et al., 2011 ; Shavelson, 2013 ). Differential motivation across groups can also bias comparisons. Student motivation might be enhanced if the PT is administered in the context of a course with the promise of generating useful feedback on students’ skill profiles.

Construct-irrelevant variance can also occur when students are not equally prepared for the format of the PT or fully appreciate the response requirements. This source of CIV could be alleviated by providing students with practice PT’s. Finally, the use of novel forms of documentation, such as those from the Internet, can potentially introduce CIV due to differential familiarity with forms of representation or contents. Interestingly, this suggests that there may be a conflict between enhancing construct representation and reducing CIV.

Another potential source of CIV is related to response evaluation. Even with training, human raters can vary in accuracy and usage of the full score range. In addition, raters may attend to features of responses that are unrelated to the target construct, such as the length of the students’ responses or the frequency of grammatical errors ( Lane and Stone, 2006 ). Some of these sources of variance could be addressed in an online environment, where word processing software could alert students to potential grammatical and spelling errors before they submit their final work product.
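Chance-corrected agreement indices are a standard way to monitor the rater variability described above. The following sketch computes Cohen's kappa for two raters assigning rubric levels; the scores are fabricated for illustration and the function is not part of the iPAL toolkit:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of responses given the same level
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal proportions per level
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    levels = set(rater_a) | set(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in levels)
    return (observed - expected) / (1 - expected)

# Two raters scoring ten responses on a 1-4 rubric scale (invented data)
a = [1, 2, 2, 3, 3, 3, 4, 4, 2, 1]
b = [1, 2, 3, 3, 3, 2, 4, 4, 2, 2]
print(round(cohens_kappa(a, b), 3))
```

Values well below 1.0 would signal the need for further rater training or rubric revision.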

Performance tasks generally take longer to administer and are more costly than traditional assessments, making it more difficult to measure student performance reliably ( Messick, 1994 ; Davey et al., 2015 ). Indeed, it is well known that more than one performance task is needed to obtain high reliability ( Shavelson, 2013 ), owing both to student-task interactions and to variability in scoring. Sources of student-task interactions include differential familiarity with the topic ( Hyytinen and Toom, 2019 ) and differential motivation to engage with the task. The level of reliability required, however, depends on the context of use. For formative assessment as part of an instructional program, reliability can be lower than for summative purposes, since other types of evidence are generally available to support interpretation and guide pedagogical decisions. Further studies are needed to obtain estimates of reliability in typical instructional settings.
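The claim that more than one task is needed for high reliability can be illustrated with the Spearman-Brown prophecy formula, a standard psychometric result. The single-task reliability of 0.45 below is a hypothetical value, not an iPAL estimate:

```python
def spearman_brown(rho_single, n_tasks):
    """Projected reliability of a score averaged over n_tasks parallel tasks,
    given the reliability rho_single of a single task."""
    return n_tasks * rho_single / (1 + (n_tasks - 1) * rho_single)

# A single performance task with modest reliability improves steadily
# as more tasks are added to the assessment.
single = 0.45
for n in (1, 2, 4, 8):
    print(n, round(spearman_brown(single, n), 2))
```

Under these assumptions, roughly four tasks are needed to approach the reliability conventionally expected for consequential score interpretations.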

With sufficient data, more sophisticated psychometric analyses become possible. One challenge is that the assumption of unidimensionality required for many psychometric models might be untenable for performance tasks ( Davey et al., 2015 ). Davey et al. (2015) provide the example of a mathematics assessment that requires students to demonstrate not only their mathematics skills but also their written communication skills. Although the iPAL framework does not explicitly address students’ reading comprehension and organization skills, students will likely need to call on these abilities to accomplish the task. Moreover, as the operational definition of CT makes evident, the student must not only deploy several skills in responding to the challenge of the PT, but also carry out component tasks in sequence. The former requirement strongly indicates the need for a multi-dimensional IRT model, while the latter suggests that the usual assumption of local item independence may well be problematic ( Lane and Stone, 2006 ). At the same time, the analytic scoring rubric should facilitate the use of latent class analysis to partition data from large groups into meaningful categories ( Zlatkin-Troitschanskaia et al., 2019 ).
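As a rough illustration of partitioning analytic-rubric data into categories, the sketch below uses k-means clustering as a simple stand-in for latent class analysis (which would instead fit a probabilistic model); the score profiles are invented:

```python
import random

def partition_profiles(profiles, k, iters=20, seed=0):
    """Group analytic-rubric score profiles into k classes with k-means,
    used here as a simple stand-in for latent class analysis."""
    rng = random.Random(seed)
    centers = rng.sample(profiles, k)
    groups = []
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in profiles:
            # Assign each profile to the nearest center (squared distance)
            nearest = min(range(k),
                          key=lambda c: sum((x - y) ** 2
                                            for x, y in zip(p, centers[c])))
            groups[nearest].append(p)
        # Recompute each center as the mean of its group (keep it if empty)
        centers = [tuple(sum(dim) / len(g) for dim in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return groups

# Hypothetical data: five students scored on four rubric dimensions (1-4)
profiles = [(1, 1, 2, 1), (1, 2, 1, 1), (4, 3, 4, 4), (3, 4, 4, 3), (2, 2, 2, 2)]
groups = partition_profiles(profiles, k=2)
print(groups)
```

With real data, a fitted latent class model would additionally yield class-membership probabilities and fit statistics for choosing the number of classes.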

Future Perspectives

Although the iPAL consortium has made substantial progress in the assessment of CT, much remains to be done. Further refinement of existing PT’s and their adaptation to different languages and cultures must continue. To this point, there are a number of examples: The refugee crisis PT (cited in Table 1 ) was translated and adapted from Finnish to US English and then to Colombian Spanish. A PT concerning kidney transplants was translated and adapted from German to US English. Finally, two PT’s based on ‘legacy admissions’ to US colleges were translated and adapted to Colombian Spanish.

With respect to data collection, there is a need for sufficient data to support psychometric analysis of student responses, especially the relationships among the different components of the scoring rubric, as this would inform both task development and response evaluation ( Zlatkin-Troitschanskaia et al., 2019 ). In addition, more intensive study of response processes through cognitive laboratories and the like are needed to strengthen the evidential argument for construct validity ( Leighton, 2019 ). We are currently conducting empirical studies, collecting data on both iPAL PT’s and other measures of CT. These studies will provide evidence of convergent and discriminant validity.

At the same time, efforts should be directed at further development to support different ways CT PT’s might be used—i.e., use cases—especially those that call for formative use of PT’s. Incorporating formative assessment into courses can plausibly be expected to improve students’ competency acquisition ( Zlatkin-Troitschanskaia et al., 2017 ). With suitable choices of storylines, appropriate combinations of (modified) PT’s, supplemented by short-answer and multiple-choice items, could be interwoven into ordinary classroom activities. The supplementary items may be completely separate from the PT’s (as is the case with the CLA+), loosely coupled with the PT’s (as in drawing on the same storyline), or tightly linked to the PT’s (as in requiring elaboration of certain components of the response to the PT).

As an alternative to such integration, stand-alone modules could be embedded in courses to yield evidence of students' generic CT skills. Core curriculum or general education courses offer ideal settings for embedding performance assessments. If these assessments were administered to a representative sample of students in each cohort over their years in college, the results would yield important information on the development of CT skills at the population level. As another example, these performance assessments could be used to assess the competence profiles of students entering Bachelor's or graduate-level programs as a basis for more targeted instructional support.

Thus, in considering different use cases for the assessment of CT, it is evident that several modifications of the iPAL omnibus assessment framework are needed. As noted earlier, assessments built according to this framework are demanding, both in the preliminary work required to develop a task and in the time students need to complete it properly. It would therefore be helpful to have modified versions of the framework, focusing on one or two facets of the CT construct and calling for a smaller number of supplementary documents. The challenge to the student should be suitably reduced.

Some members of the iPAL collaborative have developed PT’s that are embedded in disciplines such as engineering, law and education ( Crump et al., 2019 ; for teacher education examples, see Jeschke et al., 2019 ). These are proving to be of great interest to various stakeholders and further development is likely. Consequently, it is essential that an appropriate assessment framework be established and implemented. It is both a conceptual and an empirical question as to whether a single framework can guide development in different domains.

Performance Assessment in Online Learning Environments

Over the last 15 years, increasing amounts of time in both college and work have been spent using computers and other electronic devices. This has led to the formulation of models of the new literacies that attempt to capture key characteristics of these activities. A prominent example is the model proposed by Leu et al. (2020) , which frames online reading as a process of problem-based inquiry that calls on five practices during online research and comprehension:

1. Reading to identify important questions,

2. Reading to locate information,

3. Reading to critically evaluate information,

4. Reading to synthesize online information, and

5. Reading and writing to communicate online information.

The parallels with the iPAL definition of CT are evident and suggest there may be benefits to closer links between these two lines of research. For example, a report by Leu et al. (2014) describes empirical studies comparing assessments of online reading using either open-ended or multiple-choice response formats.

The iPAL consortium has begun to take advantage of the affordances of the online environment (for examples, see Schmidt et al. and Nagel et al. in this special issue). Most obviously, supplementary materials can now include archival photographs, audio recordings, or videos. Additional tasks might include online searches for relevant documents, though this would add considerably to the time demands. Such searches could occur within a simulated Internet environment, as is the case for the IEA's ePIRLS assessment ( Mullis et al., 2017 ).

The prospect of having access to a wealth of materials that can add to task authenticity is exciting. Yet it can also add ambiguity and information overload. Increased authenticity, then, must be weighed against validity concerns and the time required to absorb the content of these materials. Modifications of the design framework and extensive empirical testing will be required to decide on appropriate trade-offs. A related possibility is to employ some of these materials in short-answer (or even selected-response) items that supplement the main PT. Response formats could include highlighting text or using a drag-and-drop menu to construct a response. Students' responses could be automatically scored, thereby containing costs. With automated scoring, feedback to students and faculty, including suggestions for next steps in strengthening CT skills, could also be provided without adding to faculty workload. Taking advantage of the online environment to incorporate new types of supplementary documents should therefore be a high priority, as should, perhaps, the introduction of new response formats. Finally, further investigation of the overlap between this formulation of CT and the characterization of online reading promulgated by Leu et al. (2020) is a promising direction to pursue.
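As an illustration of the automated scoring mentioned above, a highlighting or drag-and-drop response can be scored by simple set comparison against a key. The function, penalty weight, and sentence IDs below are hypothetical:

```python
def score_selection(selected, key, penalty=0.5):
    """Partial-credit score for a selection-type item: one point per correct
    selection, minus a penalty per incorrect selection, floored at zero."""
    selected, key = set(selected), set(key)
    raw = len(selected & key) - penalty * len(selected - key)
    return max(0.0, raw) / len(key)

# Hypothetical key: the three sentences a student should have highlighted
key = {"s1", "s2", "s3"}
print(score_selection({"s1", "s3"}, key))        # two of three correct
print(score_selection({"s1", "s4", "s5"}, key))  # one correct, two incorrect
```

Scores like these could feed directly into the kind of automated feedback described above, though constructed responses to the main PT would still require human or far more sophisticated machine scoring.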

Data Availability Statement

All datasets generated for this study are included in the article/supplementary material.

Author Contributions

HB wrote the article. RS, OZ-T, and KB were involved in the preparation and revision of the article and co-wrote the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This study was funded in part by the Spencer Foundation (Grant No. 201700123).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We would like to thank all the researchers who have participated in the iPAL program.

  • ^ https://www.ipal-rd.com/
  • ^ https://www.aacu.org/value
  • ^ When test results are reported by means of substantively defined categories, the scoring is termed "criterion-referenced." This is in contrast to results reported as percentiles; such scoring is termed "norm-referenced."

American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (2014). Standards for Educational and Psychological Testing. Washington, D.C: American Educational Research Association.


Arum, R., and Roksa, J. (2011). Academically Adrift: Limited Learning on College Campuses. Chicago, IL: University of Chicago Press.

Association of American Colleges and Universities (n.d.). VALUE: What is value?. Available online at: https://www.aacu.org/value (accessed May 7, 2020).

Association of American Colleges and Universities [AACU] (2018). Fulfilling the American Dream: Liberal Education and the Future of Work. Available online at: https://www.aacu.org/research/2018-future-of-work (accessed May 1, 2020).

Braun, H. (2019). Performance assessment and standardization in higher education: a problematic conjunction? Br. J. Educ. Psychol. 89, 429–440. doi: 10.1111/bjep.12274


Braun, H. I., Kirsch, I., and Yamamoto, K. (2011). An experimental study of the effects of monetary incentives on performance on the 12th grade NAEP reading assessment. Teach. Coll. Rec. 113, 2309–2344.

Crump, N., Sepulveda, C., Fajardo, A., and Aguilera, A. (2019). Systematization of performance tests in critical thinking: an interdisciplinary construction experience. Rev. Estud. Educ. 2, 17–47.

Davey, T., Ferrara, S., Shavelson, R., Holland, P., Webb, N., and Wise, L. (2015). Psychometric Considerations for the Next Generation of Performance Assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service.

Erwin, T. D., and Sebrell, K. W. (2003). Assessment of critical thinking: ETS’s tasks in critical thinking. J. Gen. Educ. 52, 50–70. doi: 10.1353/jge.2003.0019


Haertel, G. D., and Fujii, R. (2017). “Evidence-centered design and postsecondary assessment,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 313–339. doi: 10.4324/9781315709307-26

Hyytinen, H., and Toom, A. (2019). Developing a performance assessment task in the Finnish higher education context: conceptual and empirical insights. Br. J. Educ. Psychol. 89, 551–563. doi: 10.1111/bjep.12283

Hyytinen, H., Toom, A., and Shavelson, R. J. (2019). “Enhancing scientific thinking through the development of critical thinking in higher education,” in Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills , eds M. Murtonen and K. Balloo (London: Palgrave MacMillan).

Indiana University (2019). FSSE 2019 Frequencies: FSSE 2019 Aggregate. Available online at: http://fsse.indiana.edu/pdf/FSSE_IR_2019/summary_tables/FSSE19_Frequencies_(FSSE_2019).pdf (accessed May 1, 2020).

Jeschke, C., Kuhn, C., Lindmeier, A., Zlatkin-Troitschanskaia, O., Saas, H., and Heinze, A. (2019). Performance assessment to investigate the domain specificity of instructional skills among pre-service and in-service teachers of mathematics and economics. Br. J. Educ. Psychol. 89, 538–550. doi: 10.1111/bjep.12277

Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life. Cambridge, MA: Harvard University Press.

Klein, S., Benjamin, R., Shavelson, R., and Bolus, R. (2007). The collegiate learning assessment: facts and fantasies. Eval. Rev. 31, 415–439. doi: 10.1177/0193841x07303318

Kosslyn, S. M., and Nelson, B. (2017). Building the Intentional University: Minerva and the Future of Higher Education. Cambridge, MAL: The MIT Press.

Lane, S., and Stone, C. A. (2006). “Performance assessment,” in Educational Measurement , 4th Edn, ed. R. L. Brennan (Lanham, MA: Rowman & Littlefield Publishers), 387–432.

Leighton, J. P. (2019). The risk–return trade-off: performance assessments and cognitive validation of inferences. Br. J. Educ. Psychol. 89, 441–455. doi: 10.1111/bjep.12271

Leu, D. J., Kiili, C., Forzani, E., Zawilinski, L., McVerry, J. G., and O’Byrne, W. I. (2020). “The new literacies of online research and comprehension,” in The Concise Encyclopedia of Applied Linguistics , ed. C. A. Chapelle (Oxford: Wiley-Blackwell), 844–852.

Leu, D. J., Kulikowich, J. M., Kennedy, C., and Maykel, C. (2014). “The ORCA Project: designing technology-based assessments for online research,” in Paper Presented at the American Educational Research Annual Meeting , Philadelphia, PA.

Liu, O. L., Frankel, L., and Roohr, K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessments. ETS Res. Rep. Ser. 1, 1–23. doi: 10.1002/ets2.12009

McClelland, D. C. (1973). Testing for competence rather than for “intelligence.”. Am. Psychol. 28, 1–14. doi: 10.1037/h0034092

McGrew, S., Ortega, T., Breakstone, J., and Wineburg, S. (2017). The challenge that’s bigger than fake news: civic reasoning in a social media environment. Am. Educ. 4, 4-9, 39.

Mejía, A., Mariño, J. P., and Molina, A. (2019). Incorporating perspective analysis into critical thinking performance assessments. Br. J. Educ. Psychol. 89, 456–467. doi: 10.1111/bjep.12297

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educ. Res. 23, 13–23. doi: 10.3102/0013189x023002013

Mislevy, R. J., Almond, R. G., and Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Res. Rep. Ser. 2003, i–29. doi: 10.1002/j.2333-8504.2003.tb01908.x

Mislevy, R. J., and Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educ. Meas. Issues Pract. 25, 6–20. doi: 10.1111/j.1745-3992.2006.00075.x

Mullis, I. V. S., Martin, M. O., Foy, P., and Hooper, M. (2017). ePIRLS 2016 International Results in Online Informational Reading. Available online at: http://timssandpirls.bc.edu/pirls2016/international-results/ (accessed May 1, 2020).

Nagel, M.-T., Zlatkin-Troitschanskaia, O., Schmidt, S., and Beck, K. (2020). “Performance assessment of generic and domain-specific skills in higher education economics,” in Student Learning in German Higher Education , eds O. Zlatkin-Troitschanskaia, H. A. Pant, M. Toepper, and C. Lautenbach (Berlin: Springer), 281–299. doi: 10.1007/978-3-658-27886-1_14

Organisation for Economic Co-operation and Development [OECD] (2012). AHELO: Feasibility Study Report. Vol. 1: Design and Implementation. Paris: OECD.

Organisation for Economic Co-operation and Development [OECD] (2013). AHELO: Feasibility Study Report. Vol. 2: Data Analysis and National Experiences. Paris: OECD.

Oser, F. K., and Biedermann, H. (2020). “A three-level model for critical thinking: critical alertness, critical reflection, and critical analysis,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO) , ed. O. Zlatkin-Troitschanskaia (Cham: Springer), 89–106. doi: 10.1007/978-3-030-26578-6_7

Paul, R., and Elder, L. (2007). Consequential validity: using assessment to drive instruction. Found. Crit. Think. 29, 31–40.

Pellegrino, J. W., and Hilton, M. L. (eds) (2012). Education for life and work: Developing Transferable Knowledge and Skills in the 21st Century. Washington DC: National Academies Press.

Shavelson, R. (2010). Measuring College Learning Responsibly: Accountability in a New Era. Redwood City, CA: Stanford University Press.

Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educ. Psychol. 48, 73–86. doi: 10.1080/00461520.2013.779483

Shavelson, R. J., Zlatkin-Troitschanskaia, O., Beck, K., Schmidt, S., and Marino, J. P. (2019). Assessment of university students’ critical thinking: next generation performance assessment. Int. J. Test. 19, 337–362. doi: 10.1080/15305058.2018.1543309

Shavelson, R. J., Zlatkin-Troitschanskaia, O., and Marino, J. P. (2018). “International performance assessment of learning in higher education (iPAL): research and development,” in Assessment of Learning Outcomes in Higher Education: Cross-National Comparisons and Perspectives , eds O. Zlatkin-Troitschanskaia, M. Toepper, H. A. Pant, C. Lautenbach, and C. Kuhn (Berlin: Springer), 193–214. doi: 10.1007/978-3-319-74338-7_10

Shavelson, R. J., Klein, S., and Benjamin, R. (2009). The limitations of portfolios. Inside Higher Ed. Available online at: https://www.insidehighered.com/views/2009/10/16/limitations-portfolios

Stolzenberg, E. B., Eagan, M. K., Zimmerman, H. B., Berdan Lozano, J., Cesar-Davis, N. M., Aragon, M. C., et al. (2019). Undergraduate Teaching Faculty: The HERI Faculty Survey 2016–2017. Los Angeles, CA: UCLA.

Tessier-Lavigne, M. (2020). Putting Ethics at the Heart of Innovation. Stanford, CA: Stanford Magazine.

Wheeler, P., and Haertel, G. D. (1993). Resource Handbook on Performance Assessment and Measurement: A Tool for Students, Practitioners, and Policymakers. Palm Coast, FL: Owl Press.

Wineburg, S., McGrew, S., Breakstone, J., and Ortega, T. (2016). Evaluating Information: The Cornerstone of Civic Online Reasoning. Executive Summary. Stanford, CA: Stanford History Education Group.

Zahner, D. (2013). Reliability and Validity–CLA+. Council for Aid to Education. Available online at: https://pdfs.semanticscholar.org/91ae/8edfac44bce3bed37d8c9091da01d6db3776.pdf

Zlatkin-Troitschanskaia, O., and Shavelson, R. J. (2019). Performance assessment of student learning in higher education [Special issue]. Br. J. Educ. Psychol. 89, i–iv, 413–563.

Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., and Brückner, S. (2017). Modeling and Measuring Competencies in Higher Education: Approaches to Challenges in Higher Education Policy and Practice. Berlin: Springer VS.

Zlatkin-Troitschanskaia, O., Pant, H. A., Toepper, M., and Lautenbach, C. (eds) (2020). Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results. Wiesbaden: Springer.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., and Pant, H. A. (2018). “Assessment of learning outcomes in higher education: international comparisons and perspectives,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 686–697.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., Schmidt, S., and Beck, K. (2019). On the complementarity of holistic and analytic approaches to performance assessment scoring. Br. J. Educ. Psychol. 89, 468–484. doi: 10.1111/bjep.12286

Keywords : critical thinking, performance assessment, assessment framework, scoring rubric, evidence-centered design, 21st century skills, higher education

Citation: Braun HI, Shavelson RJ, Zlatkin-Troitschanskaia O and Borowiec K (2020) Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation. Front. Educ. 5:156. doi: 10.3389/feduc.2020.00156

Received: 30 May 2020; Accepted: 04 August 2020; Published: 08 September 2020.


Copyright © 2020 Braun, Shavelson, Zlatkin-Troitschanskaia and Borowiec. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Henry I. Braun, [email protected]

This article is part of the Research Topic

Assessing Information Processing and Online Reasoning as a Prerequisite for Learning in Higher Education


Article • 8 min read

Critical Thinking

Developing the right mindset and skills.

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills that you need to develop your critical thinking skills, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics which help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds , for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies , for help with this.

Use creative problem solving to balance cold logic. By thinking outside of the box you can identify new possible outcomes by using pieces of information that you already have.

Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences are called cognitive biases and it can be difficult to identify them in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference . It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy ? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date ?
  • Has the information received any direct criticism ?
  • Does the information have any errors or inaccuracies ?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion, rather than fact? Is any of the information you have gathered designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.
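Paired Comparison Analysis itself amounts to tallying wins across all pairwise match-ups. The sketch below is illustrative; the cost-based preference rule stands in for your own judgment:

```python
from itertools import combinations

def paired_comparison(items, prefer):
    """Rank items by tallying wins over every pairwise comparison.
    `prefer(a, b)` must return whichever of the pair is preferred."""
    wins = {item: 0 for item in items}
    for a, b in combinations(items, 2):
        wins[prefer(a, b)] += 1
    return sorted(items, key=lambda item: wins[item], reverse=True)

# Hypothetical options with invented costs; lower cost wins each pairing
costs = {"Expand training": 3, "Hire contractor": 5, "Automate": 2}
ranking = paired_comparison(list(costs), prefer=lambda a, b: min(a, b, key=costs.get))
print(ranking)
```

In practice the preference judgments come from you (or your team) rather than a formula; the tally simply makes the implied ranking explicit.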

The final step involves challenging the information and rationalizing its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument to put it in the correct perspective. Take a look at our article, Rational Thinking , for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. It's a useful skill in the workplace and in life.

You'll need to be curious and creative to explore alternative possibilities, rational when applying logic, and self-aware enough to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking.




7 Critical Thinking and Evaluating Information

In this chapter, you will read about Critical Thinking and Evaluating Information, drawn from a module on Effective Learning Strategies in Student Success by Jazzabel Maya at Austin Community College, licensed under a Creative Commons Attribution-NonCommercial-ShareAlike license.

Use warming up, working out, and cooling down strategies to read the chapter. You will participate in a discussion and write a journal after you finish reading.

Remember to write down the strategies you’re using to warm up, work out, and cool down.

Chapter 7: Critical Thinking and Evaluating Information

LEARNING OBJECTIVES

By the end of this section, you will be able to:

  • Define critical thinking
  • Describe the role that logic plays in critical thinking
  • Describe how both critical and creative thinking skills can be used to problem-solve
  • Describe how critical thinking skills can be used to evaluate information
  • Apply the CRAAP test to evaluate sources of information
  • Identify strategies for developing yourself as a critical thinker

Critical Thinking and Evaluating Information

Critical Thinking

As a college student, you are tasked with engaging and expanding your thinking skills. One of the most important of these skills is critical thinking because it relates to nearly all tasks, situations, topics, careers, environments, challenges, and opportunities. It is a “domain-general” thinking skill, not one that is specific to a particular subject area.

What Is Critical Thinking?

Critical thinking  is clear, reasonable, reflective thinking focused on deciding what to believe or do. It means asking probing questions like “How do we know?” or “Is this true in every case or just in this instance?” It involves being skeptical and challenging assumptions rather than simply memorizing facts or blindly accepting what you hear or read.

Imagine, for example, that you’re reading a history textbook. You wonder who wrote it and why, because you detect certain biases in the writing. You find that the author has a limited scope of research focused only on a particular group within a population. In this case, your critical thinking reveals that there are “other sides to the story.”

Who are critical thinkers, and what characteristics do they have in common? Critical thinkers are usually curious and reflective people. They like to explore and probe new areas and seek knowledge, clarification, and new solutions. They ask pertinent questions, evaluate statements and arguments, and they distinguish between facts and opinion. They are also willing to examine their own beliefs, possessing a manner of humility that allows them to admit lack of knowledge or understanding when needed. They are open to changing their mind. Perhaps most of all, they actively enjoy learning, and seeking new knowledge is a lifelong pursuit. This may well be you!

No matter where you are on the road to being a critical thinker, you can always more fully develop and finely tune your skills. Doing so will help you develop more balanced arguments, express yourself clearly, read critically, and glean important information efficiently. Critical thinking skills will help you in any profession or any circumstance of life, from science to art to business to teaching. With critical thinking, you become a clearer thinker and problem solver.

Critical Thinking and Logic

Critical thinking is fundamentally a process of questioning information and data. You may question the information you read in a textbook, or you may question what a politician or a professor or a classmate says. You can also question a commonly-held belief or a new idea. With critical thinking, anything and everything is subject to question and examination for the purpose of logically constructing reasoned perspectives.

What Is Logic?

The word  logic  comes from the Ancient Greek  logike , referring to the science or art of reasoning. Using logic, a person evaluates arguments and reasoning and strives to distinguish between good and bad reasoning, or between truth and falsehood. Using logic, you can evaluate the ideas and claims of others, make good decisions, and form sound beliefs about the world. [1]

Questions of Logic in Critical Thinking

Let’s use a simple example of applying logic to a critical-thinking situation. In this hypothetical scenario, a man has a Ph.D. in political science, and he works as a professor at a local college. His wife works at the college, too. They have three young children in the local school system, and their family is well known in the community. The man is now running for political office. Are his credentials and experience sufficient for entering public office? Will he be effective in the political office? Some voters might believe that his personal life and current job, on the surface, suggest he will do well in the position, and they will vote for him. In truth, the characteristics described don’t guarantee that the man will do a good job. The information is somewhat irrelevant. What else might you want to know? How about whether the man had already held a political office and done a good job? In this case, we want to think critically about how much information is adequate in order to make a decision based on  logic  instead of  assumptions.

The following questions, presented in Figure 1, below, are ones you may apply to formulating a logical, reasoned perspective in the above scenario or any other situation:

  • What’s happening?  Gather the basic information and begin to think of questions.
  • Why is it important?  Ask yourself why it’s significant and whether or not you agree.
  • What don’t I see?  Is there anything important missing?
  • How do I know?  Ask yourself where the information came from and how it was constructed.
  • Who is saying it?  What’s the position of the speaker and what is influencing them?
  • What else?   What if?  What other ideas exist and are there other possibilities?

Figure 1. "Questions a Critical Thinker Asks" infographic, illustrating the six questions listed above.

Problem-Solving with Critical Thinking

For most people, a typical day is filled with critical thinking and problem-solving challenges. In fact, critical thinking and problem-solving go hand-in-hand. They both refer to using knowledge, facts, and data to solve problems effectively. But with problem-solving, you are specifically identifying, selecting, and defending your solution. Below are some examples of using critical thinking to problem-solve:

  • Your roommate was upset and said some unkind words to you, which put a crimp in the relationship. You try to see through the angry behaviors to determine how you might best support the roommate and help bring the relationship back to a comfortable spot.
  • Your campus club has been languishing due to lack of participation and funds. The new club president, though, is a marketing major and has identified some strategies to interest students in joining and supporting the club. Implementation is forthcoming.
  • Your final art class project challenges you to conceptualize form in new ways. On the last day of class when students present their projects, you describe the techniques you used to fulfill the assignment. You explain why and how you selected that approach.
  • Your math teacher sees that the class is not quite grasping a concept. She uses clever questioning to dispel anxiety and guide you to a new understanding of the concept.
  • You have a job interview for a position that you feel you are only partially qualified for, although you really want the job and you are excited about the prospects. You analyze how you will explain your skills and experiences in a way to show that you are a good match for the prospective employer.
  • You are doing well in college, and most of your college and living expenses are covered. But there are some gaps between what you want and what you feel you can afford. You analyze your income, savings, and budget to better calculate what you will need to stay in college and maintain your desired level of spending.

Problem-Solving Action Checklist

Problem-solving can be an efficient and rewarding process, especially if you are organized and mindful of critical steps and strategies. Remember to assume the attributes of a good critical thinker: if you are curious, reflective, knowledge-seeking, open to change, probing, organized, and ethical, your challenge or problem will be less of a hurdle, and you'll be in a good position to find intelligent solutions.

Critical and Creative Thinking

Critical and creative thinking (described in more detail in Chapter 6: Theories of Learning) complement each other when it comes to problem-solving. The following words, by Dr. Andrew Robert Baker, are excerpted from his “Thinking Critically and Creatively” essay. Dr. Baker illuminates some of the many ways that college students will be exposed to critical and creative thinking and how it can enrich their learning experiences.

THINKING CRITICALLY AND CREATIVELY

Critical thinking skills are perhaps the most fundamental skills involved in making judgments and solving problems. You use them every day, and you can continue improving them. The ability to think critically about a matter—to analyze a question, situation, or problem down to its most basic parts—is what helps us evaluate the accuracy and truthfulness of statements, claims, and information we read and hear. It is the sharp knife that, when honed, separates fact from fiction, honesty from lies, and the accurate from the misleading. We all use this skill to one degree or another almost every day. For example, we use critical thinking every day as we consider the latest consumer products and why one particular product is the best among its peers. Is it a quality product because a celebrity endorses it? Because a lot of other people may have used it? Because it is made by one company versus another? Or perhaps because it is made in one country or another? These are questions representative of critical thinking.

The academic setting demands more of us in terms of critical thinking than everyday life. It demands that we evaluate information and analyze myriad issues. It is the environment where our critical thinking skills can be the difference between success and failure. In this environment we must consider information in an analytical, critical manner. We must ask questions—What is the source of this information? Is this source an expert one, and what makes it so? Are there multiple perspectives to consider on an issue? Do multiple sources agree or disagree on an issue? Does quality research substantiate information or opinion? Do I have any personal biases that may affect my consideration of this information? It is only through purposeful, frequent, intentional questioning such as this that we can sharpen our critical thinking skills and improve as students, learners, and researchers.

While critical thinking analyzes information and roots out the true nature and facets of problems, it is creative thinking that drives progress forward when it comes to solving these problems. Exceptional creative thinkers are people who invent new solutions to existing problems that do not rely on past or current solutions. They are the ones who invent solution C when everyone else is still arguing between A and B. Creative thinking skills involve using strategies to clear the mind so that our thoughts and ideas can transcend the current limitations of a problem and allow us to see beyond barriers that prevent new solutions from being found.

Brainstorming is the simplest example of intentional creative thinking that most people have tried at least once. With the quick generation of many ideas at once, we can block out our brain's natural tendency to limit our solution-generating abilities so we can access and combine many possible solutions/thoughts and invent new ones. It is sort of like sprinting through a race's finish line only to find there is new track on the other side and we can keep going, if we choose.

As with critical thinking, higher education both demands creative thinking from us and is the perfect place to practice and develop the skill. Everything from word problems in a math class to opinion or persuasive speeches and papers calls upon our creative thinking skills to generate new solutions and perspectives in response to our professor's demands. Creative thinking skills ask questions such as—What if? Why not? What else is out there? Can I combine perspectives/solutions? What is something no one else has brought up? What is being forgotten/ignored? What about ______? It is the opening of doors and options that follows problem-identification.

Consider an assignment that required you to compare two different authors on the topic of education and select and defend one as better. Now add to this scenario that your professor clearly prefers one author over the other. While critical thinking can get you as far as identifying the similarities and differences between these authors and evaluating their merits, it is creative thinking that you must use if you wish to challenge your professor's opinion and invent new perspectives on the authors that have not previously been considered.

So, what can we do to develop our critical and creative thinking skills? Although many students may dislike it, group work is an excellent way to develop our thinking skills. Many times I have heard from students their disdain for working in groups based on scheduling, varied levels of commitment to the group or project, and personality conflicts too, of course. True—it's not always easy, but that is why it is so effective. When we work collaboratively on a project or problem, we bring many brains to bear on a subject. These different brains will naturally develop varied ways of solving or explaining problems and examining information. To the observant individual, this places us in a constant state of back-and-forth critical/creative thinking modes. For example, in group work we are simultaneously analyzing information and generating solutions on our own, while challenging others' analyses/ideas and responding to challenges to our own analyses/ideas. This is part of why students tend to avoid group work—it challenges us as thinkers and forces us to analyze others while defending ourselves, which is not something we are used to or comfortable with, as most of our educational experiences involve solo work. Your professors know this—that's why we assign it—to help you grow as students, learners, and thinkers!

—Dr. Andrew Robert Baker, Foundations of Academic Success: Words of Wisdom

Evaluating Information with Critical Thinking

Evaluating information can be one of the most complex tasks you will be faced with in college. But if you utilize the following four strategies, you will be well on your way to success:

  • Read for understanding
  • Examine arguments
  • Clarify thinking
  • Cultivate “habits of mind”

Read for Understanding

When you read, take notes or mark the text to track your thinking about what you are reading. As you make connections and ask questions in response to what you read,  you monitor your comprehension and enhance your long-term understanding of the material. You will want to mark important arguments and key facts. Indicate where you agree and disagree or have further questions. You don’t necessarily need to read every word, but make sure you understand the concepts or the intentions behind what is written. See the chapter on  Active Reading Strategies  for additional tips.

Examine Arguments

When you examine arguments or claims that an author, speaker, or other source is making, your goal is to identify and examine the hard facts. You can use the spectrum of authority strategy for this purpose. The spectrum of authority strategy assists you in identifying the "hot" end of an argument—feelings, beliefs, cultural influences, and societal influences—and the "cold" end of an argument—scientific influences. The most compelling arguments balance elements from both ends of the spectrum.

Clarify Thinking

When you use critical thinking to evaluate information, you need to clarify your thinking to yourself and likely to others. Doing this well is mainly a process of asking and answering probing questions, such as the logic questions discussed earlier. Design your questions to fit your needs, but be sure to cover adequate ground. What is the purpose? What question are we trying to answer? What point of view is being expressed? What assumptions are we or others making? What are the facts and data we know, and how do we know them? What are the concepts we’re working with? What are the conclusions, and do they make sense? What are the implications?

Cultivate “Habits of Mind”

“Habits of mind” are the personal commitments, values, and standards you have about the principle of good thinking. Consider your intellectual commitments, values, and standards. Do you approach problems with an open mind, a respect for truth, and an inquiring attitude? Some good habits to have when thinking critically are being receptive to having your opinions changed, having respect for others, being independent and not accepting something as true until you've had the time to examine the available evidence, being fair-minded, having respect for reason, having an inquiring mind, not making assumptions, and always, especially, questioning your own conclusions—in other words, developing an intellectual work ethic. Try to work these qualities into your daily life.

In 2010, a textbook being used in fourth-grade classrooms in Virginia became big news for all the wrong reasons. The book,  Our Virginia  by Joy Masoff, had caught the attention of a parent who was helping her child do her homework, according to  an article in  The Washington Post . Carol Sheriff was a historian for the College of William and Mary and as she worked with her daughter, she began to notice some glaring historical errors, not the least of which was a passage which described how thousands of African Americans fought for the South during the Civil War.

Further investigation into the book revealed that, although the author had written textbooks on a variety of subjects, she was not a trained historian. The research she had done to write  Our Virginia,  and in particular the information she included about Black Confederate soldiers, was done through the Internet and included sources created by groups like the Sons of Confederate Veterans, an organization which promotes views of history that de-emphasize the role of slavery in the Civil War.

How did a book with errors like these come to be used as part of the curriculum and who was at fault? Was it Masoff for using untrustworthy sources for her research? Was it the editors who allowed the book to be published with these errors intact? Was it the school board for approving the book without more closely reviewing its accuracy?

There are a number of issues at play in the case of  Our Virginia , but there’s no question that evaluating sources is an important part of the research process and doesn’t just apply to Internet sources. Using inaccurate, irrelevant, or poorly researched sources can affect the quality of your own work. Being able to understand and apply the concepts that follow is crucial to becoming a more savvy user and creator of information.

When you begin evaluating sources, what should you consider? The CRAAP test is a series of common evaluative elements you can use to evaluate the Currency, Relevance, Authority, Accuracy, and Purpose of your sources. The CRAAP test was developed by librarians at California State University, Chico, and it gives you a good, overall set of elements to look for when evaluating a resource. Let's consider what each of these evaluative elements means. You can visit the ACC Library's Web page for a tutorial on Evaluating Information using the CRAAP test.

One of the most important and interesting steps to take as you begin researching a subject is selecting the resources that will help you build your thesis and support your assertions. Certain topics require you to pay special attention to how current your resource is—because they are time sensitive, because they have evolved so much over the years, or because new research comes out on the topic so frequently. When evaluating the currency of an article, consider the following:

  • When was the item written, and how frequently does the publication come out?
  • Is there evidence of newly added or updated information in the item?
  • If the information is dated, is it still suitable for your topic?
  • How frequently does information change about your topic?

Understanding what resources are most applicable to your subject and why they are applicable can help you focus and refine your thesis. Many topics are broad and searching for information on them produces a wide range of resources. Narrowing your topic and focusing on resources specific to your needs can help reduce the piles of information and help you focus in on what is truly important to read and reference. When determining relevance consider the following:

  • Does the item contain information relevant to your argument or thesis?
  • Read the article’s introduction, thesis, and conclusion.
  • Scan main headings and identify article keywords.
  • For book resources, start with the index or table of contents—how wide a scope does the item have? Will you use part or all of this resource?
  • Does the information presented support or refute your ideas?
  • If the information refutes your ideas, how will this change your argument?
  • Does the material provide you with current information?
  • What is the material’s intended audience?

Understanding more about your information’s source helps you determine when, how, and where to use that information. Is your author an expert on the subject? Do they have some personal stake in the argument they are making? What is the author or information producer’s background? When determining the authority of your source, consider the following:

  • What are the author’s credentials?
  • What is the author’s level of education, experience, and/or occupation?
  • What qualifies the author to write about this topic?
  • What affiliations does the author have? Could these affiliations affect their position?
  • What organization or body published the information? Is it authoritative? Does it have an explicit position or bias?

Determining where information comes from, if the evidence supports the information, and if the information has been reviewed or refereed can help you decide how and whether to use a source. When determining the accuracy of a source, consider the following:

  • Is the source well-documented? Does it include footnotes, citations, or a bibliography?
  • Is information in the source presented as fact, opinion, or propaganda? Are biases clear?
  • Can you verify information from the references cited in the source?
  • Is the information written clearly and free of typographical and grammatical mistakes? Does the source look to be edited before publication? A clean, well-presented paper does not always indicate accuracy, but usually at least means more eyes have been on the information.

Knowing why the information was created is a key to evaluation. Understanding the reason or purpose of the information, if the information has clear intentions, or if the information is fact, opinion, or propaganda will help you decide how and why to use information:

  • Is the author’s purpose to inform, sell, persuade, or entertain?
  • Does the source have an obvious bias or prejudice?
  • Is the article presented from multiple points of view?
  • Does the author omit important facts or data that might disprove their argument?
  • Is the author’s language informal, joking, emotional, or impassioned?
  • Is the information clearly supported by evidence?

When you feel overwhelmed by the information you are finding, the CRAAP test can help you determine which information is the most useful to your research topic. How you respond to what you find out using the CRAAP test will depend on your topic. Maybe you want to use two overtly biased resources to inform an overview of typical arguments in a particular field. Perhaps your topic is historical and currency means the past hundred years rather than the past one or two years. Use the CRAAP test, be knowledgeable about your topic, and you will be on your way to evaluating information efficiently and well!
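One informal way to apply the five criteria consistently across many sources is to turn them into a scorecard. The sketch below is only an illustration: the 1-to-5 scale, the threshold, and the example ratings are assumptions for this example, not part of the CRAAP test itself.

```python
# Hypothetical CRAAP scorecard: rate a source 1 (poor) to 5 (strong) on
# each criterion, then flag any criterion that falls below a threshold.
CRITERIA = ["Currency", "Relevance", "Authority", "Accuracy", "Purpose"]

def evaluate_source(ratings, threshold=3):
    """Return the average rating and the criteria scoring below threshold."""
    weak = [c for c in CRITERIA if ratings[c] < threshold]
    average = sum(ratings[c] for c in CRITERIA) / len(CRITERIA)
    return average, weak

# Example: screening a blog post found during research (ratings invented)
ratings = {"Currency": 5, "Relevance": 4, "Authority": 2,
           "Accuracy": 3, "Purpose": 4}
average, weak = evaluate_source(ratings)
print(f"Average rating: {average:.1f}; weak areas: {', '.join(weak) or 'none'}")
```

Here the source scores well overall but is flagged on Authority, which matches how you would use the test in practice: a single weak criterion (say, an unqualified author) can be enough to send you looking for a better source, whatever the average.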

Developing Yourself As a Critical Thinker

Critical thinking is a fundamental skill for college students, but it should also be a lifelong pursuit. Below are additional strategies to develop yourself as a critical thinker in college and in everyday life:

  • Reflect and practice : Always reflect on what you’ve learned. Is it true all the time? How did you arrive at your conclusions?
  • Use wasted time : It’s certainly important to make time for relaxing, but if you find you are indulging in too much of a good thing, think about using your time more constructively. Determine when you do your best thinking and try to learn something new during that part of the day.
  • Redefine the way you see things : It can be very uninteresting to always think the same way. Challenge yourself to see familiar things in new ways. Put yourself in someone else’s shoes and consider things from a different angle or perspective.  If you’re trying to solve a problem, list all your concerns: what you need in order to solve it, who can help, what some possible barriers might be, etc. It’s often possible to reframe a problem as an opportunity. Try to find a solution where there seems to be none.
  • Analyze the influences on your thinking and in your life : Why do you think or feel the way you do? Analyze your influences. Think about who in your life influences you. Do you feel or react a certain way because of social convention, or because you believe it is what is expected of you? Try to break out of any molds that may be constricting you.
  • Express yourself : Critical thinking also involves being able to express yourself clearly. Most important in expressing yourself clearly is stating one point at a time. You might be inclined to argue every thought, but you might have greater impact if you focus just on your main arguments. This will help others to follow your thinking clearly. For more abstract ideas, assume that your audience may not understand. Provide examples, analogies, or metaphors where you can.
  • Enhance your wellness : It's easier to think critically when you take care of your mental and physical health. Try taking activity breaks throughout the day to reach 30 to 60 minutes of physical activity each day. Scheduling physical activity into your day can help lower stress and increase mental alertness. Also, do your most difficult work when you have the most energy. Think about the time of day you are most effective and have the most energy. Plan to do your most difficult work during these times. And be sure to reach out for help if you feel you need assistance with your mental or physical health (see Maintaining Your Mental and Physical Health for more information).

Complete Section #2 Below: ACTIVITY: REFLECT ON CRITICAL THINKING

Key Takeaways

  • Critical thinking is logical and reflective thinking focused on deciding what to believe or do.
  • Critical thinking involves questioning and evaluating information.
  • Critical and creative thinking both contribute to our ability to solve problems in a variety of contexts.
  • Evaluating information is a complex, but essential, process. You can use the CRAAP test to help determine if sources and information are reliable.
  • You can take specific actions to develop and strengthen your critical thinking skills.

Use the warm up, work out, and cool down strategies for a discussion.

Prepare for a discussion by writing down the main ideas and most important supporting points in this chapter. Prepare several of your own responses to the supporting points. These might be examples of how you use critical thinking in your life. What questions might you be prepared to ask your fellow students during this discussion?

After the discussion, reflect on what you’ve learned from the other students.

Use warm up, work out, and cool down strategies for this journal writing activity.

Think about someone you consider to be a critical thinker (friend, professor, historical figure, etc.). What qualities does this person have?

  • Review some of the critical thinking strategies discussed on this page. Pick one strategy that makes sense to you. How can you apply this critical thinking technique to your academic work?
  • Habits of mind are attitudes and beliefs that influence how you approach the world (e.g., an inquiring attitude, an open mind, respect for truth). What is one habit of mind you would like to actively develop over the next year? How will you develop a daily practice to cultivate this habit?
  • Write your responses in journal form, and submit according to your instructor’s guidelines.

Academic Literacy Copyright © by Lori-Beth Larsen is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.


The Peak Performance Center

Critical Thinking

Critical thinking refers to the process of actively analyzing, assessing, synthesizing, evaluating and reflecting on information gathered from observation, experience, or communication. It is thinking in a clear, logical, reasoned, and reflective manner to solve problems or make decisions. Basically, critical thinking is taking a hard look at something to understand what it really means.

Critical Thinkers

Critical thinkers do not simply accept all ideas, theories, and conclusions as facts. They have a mindset of questioning ideas and conclusions. They make reasoned judgments that are logical and well thought out by assessing the evidence that supports a specific theory or conclusion.

When presented with a new piece of information, critical thinkers may ask questions such as:

“What information supports that?”

“How was this information obtained?”

“Who obtained the information?”

“How do we know the information is valid?”

“Why is it that way?”

“What makes it do that?”

“How do we know that?”

“Are there other possibilities?”


Combination of Analytical and Creative Thinking

Many people perceive critical thinking just as analytical thinking. However, critical thinking incorporates both analytical thinking and creative thinking. Critical thinking does involve breaking down information into parts and analyzing the parts in a logical, step-by-step manner. However, it also involves challenging consensus to formulate new creative ideas and generate innovative solutions. It is critical thinking that helps to evaluate and improve your creative ideas.

Critical Thinking Skills

Elements of Critical Thinking

Critical thinking involves:

  • Gathering relevant information
  • Evaluating information
  • Asking questions
  • Assessing bias or unsubstantiated assumptions
  • Making inferences from the information and filling in gaps
  • Using abstract ideas to interpret information
  • Formulating ideas
  • Weighing opinions
  • Reaching well-reasoned conclusions
  • Considering alternative possibilities
  • Testing conclusions
  • Verifying that the evidence and arguments support the conclusions

Developing Critical Thinking Skills

Critical thinking is considered a higher-order thinking skill, involving analysis, synthesis, deduction, inference, reasoning, and evaluation. In order to demonstrate critical thinking, you would need to develop skills in:

Interpreting : understanding the significance or meaning of information

Analyzing : breaking information down into its parts

Connecting : making connections between related items or pieces of information

Integrating : connecting and combining information to better understand the relationship between the information

Evaluating : judging the value, credibility, or strength of something

Reasoning : creating an argument through logical steps

Deducing : forming a logical opinion about something based on the information or evidence that is available

Inferring : figuring something out through reasoning based on assumptions and ideas

Generating : producing new information, ideas, products, or ways of viewing things


Assessment of Critical Thinking

  • First Online: 10 December 2023

Dirk Jahn & Michael Cursio

The term “to assess” has various meanings, such as to judge, evaluate, estimate, gauge, or determine. Assessment is therefore a diagnostic inventory of certain characteristics of a section of observable reality on the basis of defined criteria. In a pedagogical context, assessments aim to make learners’ knowledge, skills, or attitudes observable in certain application situations and to assess them on the basis of observation criteria.


To give an example: Holistic Critical Thinking Rubric from East Georgia College; Available at https://studylib.net/doc/7608742/east-georgia-college-holistic-critical-thinking-rubric-cr… (04/03/2020).


Author information

Dirk Jahn, Friedrich Alexander Universität Erlangen-Nürnberg, Fortbildungszentrum Hochschullehre FBZHL, Fürth, Germany

Michael Cursio, Friedrich Alexander Universität Erlangen-Nürnberg, Fortbildungszentrum Hochschullehre FBZHL, Fürth, Germany


Copyright information

© 2023 The Author(s), under exclusive license to Springer Fachmedien Wiesbaden GmbH, part of Springer Nature

About this chapter

Jahn, D., & Cursio, M. (2023). Assessment of Critical Thinking. In: Critical Thinking. Springer VS, Wiesbaden. https://doi.org/10.1007/978-3-658-41543-3_8

Published : 10 December 2023

Print ISBN : 978-3-658-41542-6

Online ISBN : 978-3-658-41543-3


University of Louisville, Delphi Center: Ideas to Action (i2a)

What is Critical Thinking?

The ability to think critically calls for a higher-order thinking than simply the ability to recall information.

Definitions of critical thinking, its elements, and its associated activities fill the educational literature of the past forty years. Critical thinking has been described as an ability to question; to acknowledge and test previously held assumptions; to recognize ambiguity; to examine, interpret, evaluate, reason, and reflect; to make informed judgments and decisions; and to clarify, articulate, and justify positions (Hullfish & Smith, 1961; Ennis, 1962; Ruggiero, 1975; Scriven, 1976; Hallet, 1984; Kitchener, 1986; Pascarella & Terenzini, 1991; Mines et al., 1990; Halpern, 1996; Paul & Elder, 2001; Petress, 2004; Holyoak & Morrison, 2005; among others).

After a careful review of the mountainous body of literature defining critical thinking and its elements, UofL has chosen to adopt the language of Michael Scriven and Richard Paul (2003) as a comprehensive, concise operating definition:

Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.

Paul and Scriven go on to suggest that critical thinking is based on: "universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness. It entails the examination of those structures or elements of thought implicit in all reasoning: purpose, problem, or question-at-issue, assumptions, concepts, empirical grounding; reasoning leading to conclusions, implication and consequences, objections from alternative viewpoints, and frame of reference. Critical thinking - in being responsive to variable subject matter, issues, and purposes - is incorporated in a family of interwoven modes of thinking, among them: scientific thinking, mathematical thinking, historical thinking, anthropological thinking, economic thinking, moral thinking, and philosophical thinking."

This conceptualization of critical thinking has been refined and developed further by Richard Paul and Linda Elder into the Paul-Elder framework of critical thinking. Currently, this approach is one of the most widely published and cited frameworks in the critical thinking literature. According to the Paul-Elder framework, critical thinking is the:

  • Analysis of thinking by focusing on the parts or structures of thinking ("the Elements of Thought")
  • Evaluation of thinking by focusing on the quality ("the Universal Intellectual Standards")
  • Improvement of thinking by using what you have learned ("the Intellectual Traits")

Selection of a Critical Thinking Framework

The University of Louisville chose the Paul-Elder model of Critical Thinking as the approach to guide our efforts in developing and enhancing our critical thinking curriculum. The Paul-Elder framework was selected based on criteria adapted from the characteristics of a good model of critical thinking developed at Surry Community College. The Paul-Elder critical thinking framework is comprehensive, uses discipline-neutral terminology, is applicable to all disciplines, defines specific cognitive skills including metacognition, and offers high quality resources.

Why the selection of a single critical thinking framework?

The use of a single critical thinking framework is an important aspect of institution-wide critical thinking initiatives (Paul and Nosich, 1993; Paul, 2004). According to this view, critical thinking instruction should not be relegated to one or two disciplines or departments with discipline specific language and conceptualizations. Rather, critical thinking instruction should be explicitly infused in all courses so that critical thinking skills can be developed and reinforced in student learning across the curriculum. The use of a common approach with a common language allows for a central organizer and for the development of critical thinking skill sets in all courses.


Copyright © 2012 - University of Louisville , Delphi Center

Evaluation Resource Hub

Evaluative Thinking

Evaluative thinking is a disciplined approach to inquiry and reflective practice that helps us make sound judgements using good evidence, as a matter of habit.

The following video discusses evaluative thinking. It runs for 3 minutes and 34 seconds.

Evaluation Capacity Building - Evaluative Thinking


A form of critical thinking

Evaluation is a form of critical thinking that involves examining evidence to make a judgement.

Evaluative claims have two parts: a conclusion and an explanation.

For example:

  • xyz was great, because…
  • xyz is disappointing, because…
  • xyz is a good way to go in this situation, because…

Drawing conclusions based on intuition is not evaluation. Neither is personal opinion, speculation or conjecture.

Each of us makes evaluative judgements every day. Sometimes these are quick assessments that don't matter much, like what to order for lunch. At other times we need to slow down our thought processes, weighing up all the factors carefully and making our deliberation transparent to others.

A disciplined approach

Evaluating a strategic direction or project in a school draws on similar thinking processes and mental disciplines as assessing student performance or recruiting a new staff member.

When we engage in evaluative thinking, we seek to:

  • suspend judgement , considering alternative explanations and allowing new evidence to change our mind
  • question assumptions , particularly about the pathway of cause and effect
  • select and develop solutions that are informed by a strong evidence base and are responsive to our context and priorities
  • value the lessons we can learn from all our experiences, disappointments as well as triumphs
  • wrestle with questions of impact and effectiveness, not just activity and implementation
  • maximise the value of existing data sources already available to us, mindful of their limitations
  • work to improve the strength of our evidence base as we go.

Cognitive bias

Evaluative thinking helps us navigate the cognitive biases that cloud our judgement.

Cognitive bias occurs when our analysis of a situation is compromised by 'mental shortcuts' or patterns of thinking that place undue emphasis on a particular perspective.

Confirmation bias is one type of cognitive bias that can easily compromise an evaluation. This is where the evaluator is already leaning towards a particular conclusion before they see the data. Without realising it, they then pay more attention to data that supports this position.

Although we may not be able to free ourselves from our cognitive biases, being aware of them is a good first step. The mental disciplines of evaluative thinking can help us manage these biases and keep our reasoning sharp and convincing.


Develop evaluative thinking

Working openly with colleagues helps to develop evaluative thinking in ourselves and others. Evaluative thinking sometimes comes naturally, but at other times it can feel a bit challenging - even threatening. If we want to develop evaluative thinking in others, we first need to model it ourselves.

A good way to strengthen evaluative practice in schools is to engage in evaluative thinking as a group: deliberately, transparently and in a supportive context. In this way people have the time and space to reflect on their thinking. This is particularly important if we are to identify or 'unlearn' bad habits that we may have fallen into.

For example, the simple act of being asked 'What makes you think that?' prompts us to explain how we formed our judgements, including the evidence we have considered as part of this.

The importance of modelling and collaborative practice in evaluation is highlighted in the Australian Institute for Teaching and School Leadership's (AITSL) profile relating to leading improvement, innovation and change. This profile encourages school leaders to develop 'a culture of continuous improvement' and 'a culture of trust and collaboration, where change and innovation based on research and evidence can flourish'.

As part of doing this, the Leadership Profile highlights the value of 'evaluating outcomes and refining actions as change is implemented, taking account of the impact of change on others, providing opportunities for regular feedback'.

Business Unit: Centre for Education Statistics and Evaluation

ABLE blog: thoughts, learnings and experiences

9 characteristics of critical thinking (and how you can develop them)

It's no secret that critical thinking is essential for growth and success. Yet many people aren't quite sure what it means — it sounds like being a critic or cynical, traits that many people want to avoid.

However, thinking critically isn't about being negative. On the contrary, effective critical thinkers possess many positive traits. Attributes like curiosity, compassion, and communication are among the top commonalities that critical thinkers share, and the good news is that we can all learn to develop these capabilities.

This article will discuss some of the principal characteristics of critical thinking and how developing these qualities can help you improve your decision-making and problem-solving skills. With a bit of self-reflection and practice, you'll be well on your way to making better decisions, solving complex problems, and achieving success across all areas of your life.

What is critical thinking?

Scholarly works on critical thinking propose many ways of interpreting the concept (at least 17 in one reference!), making it challenging to pinpoint one exact definition. In general, critical thinking refers to rational, goal-directed thought through logical arguments and reasoning. We use critical thinking to objectively assess and evaluate information to form reasonable judgments.

Critical thinking has its roots in ancient Greece. The philosopher Socrates is credited with being one of the first to encourage his students to think critically about their beliefs and ideas. Socrates believed that by encouraging people to question their assumptions, they would be able to see the flaws in their reasoning and improve their thought processes.

Today, critical thinking skills are considered vital for success in academia and everyday life. One of the defining " 21st-century skills ," critical thinking is integral to problem-solving, decision making, and goal setting.

Why is it necessary to develop critical thinking skills?


Critical thinking skills help us learn new information, understand complex concepts, and make better decisions. The ability to be objective and reasonable is an asset that can enhance personal and professional relationships.

The U.S. Department of Labor reports critical thinking is among the top desired skills in the workplace. The ability to develop a properly thought-out solution in a reasonable amount of time is highly valued by employers. Companies want employees who can solve problems independently and work well in a team. A desirable employee can evaluate situations critically and creatively, collaborate with others, and make sound judgments.

Critical thinking is an essential component of academic study as well. Critical thinking skills are vital to learners because they allow students to build on their prior knowledge and construct new understandings. This will enable learners to expand their knowledge and experience across various subjects.

Despite its importance, though, critical thinking is not something that we develop naturally or casually. Even though critical thinking is considered an essential learning outcome in many universities, only 45% of college students in a well-known study reported that their skills had improved after two years of classes.

9 characteristics of critical thinking

Clearly, improving our ability to think critically will require some self-improvement work. As lifelong learners, we can use this opportunity for self-reflection to identify where we can improve our thinking processes.

Strong critical thinkers possess a common set of personality traits, habits, and dispositions. Being aware of these attributes and putting them into action can help us develop a strong foundation for critical thinking. These essential characteristics of critical thinking can be used as a toolkit for applying specific thinking processes to any given situation.

1. Curious

Curiosity is one of the most significant characteristics of critical thinking. Research has shown that a state of curiosity drives us to continually seek new information . This inquisitiveness supports critical thinking as we need to constantly expand our knowledge to make well-informed decisions.

Curiosity also facilitates critical thinking because it encourages us to question our thoughts and mental models, the filters we use to understand the world. This is essential to avoid critical thinking barriers like biases and misconceptions. Challenging our beliefs and getting curious about all sides of an issue will help us have an open mind during the critical thinking process.

Actionable Tip: Choose to be curious. When you ask “why,” you learn about things around you and clarify ambiguities. Google anything you are curious about, read new books, and play with a child. Kids have a natural curiosity that can be inspiring.


2. Analytical

Investigation is a crucial component of critical thinking, so it's important to be analytical. Analytical thinking involves breaking down complex ideas into their simplest forms . The first step when tackling a problem or making a decision is to analyze information and consider it in smaller pieces. Then, we use critical thinking by gathering additional information before getting to a judgment or solution.

Being analytical is helpful for critical thinking because it allows us to look at data in detail. When examining an issue from various perspectives, we should pay close attention to these details to arrive at a decision based on facts. Taking these steps is crucial to making good decisions.

Actionable Tip: Become aware of your daily surroundings. Examine how things work — breaking things down into steps will encourage analysis. You can also play brain and puzzle games. These provide an enjoyable way to stimulate analytical thinking.

3. Introspective

Critical thinkers are typically introspective. Introspection is a process of examining our own thoughts and feelings. We do this as a form of metacognition, or thinking about thinking. Researchers believe that we can improve our problem-solving skills by using metacognition to analyze our reasoning processes .

Being introspective is essential to critical thinking because it helps us be self-aware. Self-awareness encourages us to acknowledge and face our own biases, prejudices, and selfish tendencies. If we know our assumptions, we can question them and suspend judgment until we have all the facts.

Actionable Tip: Start a journal. Keep track of your thoughts, feelings, and opinions throughout the day, especially when faced with difficult decisions. Look for patterns. You can avoid common thought fallacies by being aware of them.

4. Able to make inferences

Another characteristic of critical thinking is the ability to make inferences, which are logical conclusions based on reviewing the facts, events, and ideas available. Analyzing the available information and observing patterns and trends will help you find relationships and make informed decisions based on what is likely to happen.

The ability to distinguish assumptions from inferences is crucial to critical thinking. We decide something is true by inference because another thing is also true, but we decide something by assumption because of what we believe or think we know. While both assumptions and inferences can be valid or invalid, inferences are more rational because data support them.

Actionable Tip: Keep an eye on your choices and patterns during the day, noticing when you infer. Practice applying the Inference Equation — I observe + I already know = So now I am thinking — to help distinguish when you infer or assume.

5. Observant


Observation skills are also a key part of critical thinking. Observation is more than just looking — it involves arranging, combining, and classifying information through all five senses to build understanding. People with keen observation skills notice small details and catch slight changes in their surroundings.

Observation is one of the first skills we learn as children , and it is critical for problem-solving. Being observant allows us to collect more information about a situation and use that information to make better decisions and solve problems. Further, it facilitates seeing things from different perspectives and finding alternative solutions.

Actionable Tip: Limit your use of devices, and be mindful of your surroundings. Notice and name one thing for each of your five senses when you enter a new environment or even a familiar one. Being aware of what you see, hear, smell, taste, and touch allows you to fully experience the moment and it develops your ability to observe your surroundings.

6. Open-minded and compassionate

Open-minded and compassionate people are good critical thinkers. Being open-minded means considering new ideas and perspectives, even if they conflict with your own. This allows you to examine different sides of an issue without immediately dismissing them. Likewise, compassionate people can empathize with others, even if they disagree. When you understand another person's point of view, you can find common ground and understanding.

Critical thinking requires an open mind when analyzing opposing arguments and compassion when listening to the perspective of others. By exploring different viewpoints and seeking to understand others' perspectives, critical thinkers can gain a more well-rounded understanding of an issue. Using this deeper understanding, we can make better decisions and solve more complex problems.

Actionable Tip: Cultivate open-mindedness and compassion by regularly exposing yourself to new ideas and views. Read books on unfamiliar topics, listen to podcasts with diverse opinions, or talk with people from different backgrounds.

7. Able to determine relevance

The ability to assess relevance is an essential characteristic of critical thinking. Relevance is defined as being logically connected and significant to the subject. When a fact or statement is essential to a topic, it can be deemed relevant.

Relevance plays a vital role in many stages of the critical thinking process . It's especially crucial to identify the most pertinent facts before evaluating an argument. Despite being accurate and seemingly meaningful, a point may not matter much to your subject. Your criteria and standards are equally relevant, as you can't make a sound decision with irrelevant guidelines.

Actionable Tip: When you're in a conversation, pay attention to how each statement relates to what you're talking about. It's surprising how often we stray from the point with irrelevant information. Asking yourself, "How does that relate to the topic?" can help you spot unrelated issues.

8. Willing to inquire

Critical thinking requires willingness. Some scholars argue that the "willingness to inquire" is the most fundamental characteristic of critical thinking, which encompasses all the others. Being willing goes hand in hand with other traits, like being flexible and humble. Flexible thinkers are willing to adapt their thinking to new evidence or arguments. Those who are humble are willing to acknowledge their faults and recognize their limitations.

It's essential for critical thinking that we have an open mind and are willing to challenge the status quo. The willingness to question assumptions, consider multiple perspectives, and think outside the box allows critical thinkers to reach new and necessary conclusions.

Actionable Tip: Cultivate willingness by adopting a growth mindset. See challenges as learning opportunities. Celebrate others' accomplishments, and get curious about what led to their success.

9. Effective communicators

Being a good critical thinker requires effective communication. Effective critical thinkers know that communication is imperative when solving problems. They can articulate their goals and concerns clearly while recognizing others' perspectives. Critical thinking requires people to be able to listen to each other's opinions and share their experiences respectfully to find the best solutions.

A good communicator is also an attentive and active listener. Listening actively goes beyond simply hearing what someone says. Being engaged in the discussion involves:

  • Listening to what they say
  • Being present
  • Asking questions that clarify their position

Actively listening is crucial for critical thinking because it helps us understand other people's perspectives.

Actionable Tip: The next time you speak with a friend, family member, or even a complete stranger, take the time to genuinely listen to what they're saying. It may surprise you how much you can learn about others — and about yourself — when you take the time to listen carefully.

The nine traits above are among the most common characteristics of critical thinkers. By developing or strengthening these characteristics, you can enhance your capacity for critical thinking.

Get to the core of critical thinking

Critical thinking is essential for success in every aspect of life, from personal relationships to professional careers. By developing your critical thinking skills, you can challenge the status quo and gain a new perspective on the world around you. You can start improving your critical thinking skills today by determining which characteristics of critical thinking you need to work on and using the actionable tips to strengthen them. With practice, you can become a great critical thinker.


NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Hughes RG, editor. Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Rockville (MD): Agency for Healthcare Research and Quality (US); 2008 Apr.


Chapter 6. Clinical Reasoning, Decisionmaking, and Action: Thinking Critically and Clinically

Patricia Benner; Ronda G. Hughes; Molly Sutphen.


This chapter examines multiple thinking strategies that are needed for high-quality clinical practice. Clinical reasoning and judgment are examined in relation to other modes of thinking used by clinical nurses in providing quality health care to patients that avoids adverse events and patient harm. The clinician’s ability to provide safe, high-quality care can be dependent upon their ability to reason, think, and judge, which can be limited by lack of experience. The expert performance of nurses is dependent upon continual learning and evaluation of performance.

Critical Thinking

Nursing education has emphasized critical thinking as an essential nursing skill for more than 50 years. 1 The definitions of critical thinking have evolved over the years. There are several key definitions for critical thinking to consider. The American Philosophical Association (APA) defined critical thinking as purposeful, self-regulatory judgment that uses cognitive tools such as interpretation, analysis, evaluation, inference, and explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations on which judgment is based. 2 A more expansive general definition of critical thinking is

. . . in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking. It presupposes assent to rigorous standards of excellence and mindful command of their use. It entails effective communication and problem solving abilities and a commitment to overcome our native egocentrism and sociocentrism. Every clinician must develop rigorous habits of critical thinking, but they cannot escape completely the situatedness and structures of the clinical traditions and practices in which they must make decisions and act quickly in specific clinical situations. 3

There are three key definitions for nursing, which differ slightly. Bittner and Tobin defined critical thinking as being “influenced by knowledge and experience, using strategies such as reflective thinking as a part of learning to identify the issues and opportunities, and holistically synthesize the information in nursing practice” 4 (p. 268). Scheffer and Rubenfeld 5 expanded on the APA definition for nurses through a consensus process, resulting in the following definition:

Critical thinking in nursing is an essential component of professional accountability and quality nursing care. Critical thinkers in nursing exhibit these habits of the mind: confidence, contextual perspective, creativity, flexibility, inquisitiveness, intellectual integrity, intuition, openmindedness, perseverance, and reflection. Critical thinkers in nursing practice the cognitive skills of analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting, and transforming knowledge 6 (Scheffer & Rubenfeld, p. 357).

The National League for Nursing Accreditation Commission (NLNAC) defined critical thinking as:

the deliberate nonlinear process of collecting, interpreting, analyzing, drawing conclusions about, presenting, and evaluating information that is both factually and belief based. This is demonstrated in nursing by clinical judgment, which includes ethical, diagnostic, and therapeutic dimensions and research 7 (p. 8).

These concepts are furthered by the American Association of Colleges of Nurses’ definition of critical thinking in their Essentials of Baccalaureate Nursing :

Critical thinking underlies independent and interdependent decision making. Critical thinking includes questioning, analysis, synthesis, interpretation, inference, inductive and deductive reasoning, intuition, application, and creativity 8 (p. 9).
Course work or ethical experiences should provide the graduate with the knowledge and skills to:
  • Use nursing and other appropriate theories and models, and an appropriate ethical framework;
  • Apply research-based knowledge from nursing and the sciences as the basis for practice;
  • Use clinical judgment and decision-making skills;
  • Engage in self-reflective and collegial dialogue about professional practice;
  • Evaluate nursing care outcomes through the acquisition of data and the questioning of inconsistencies, allowing for the revision of actions and goals;
  • Engage in creative problem solving 8 (p. 10).

Taken together, these definitions of critical thinking set forth the scope and key elements of thought processes involved in providing clinical care. Exactly how critical thinking is defined will influence how it is taught and to what standard of care nurses will be held accountable.

Professional and regulatory bodies in nursing education have required that critical thinking be central to all nursing curricula, but they have not adequately distinguished critical reflection from ethical, clinical, or even creative thinking for decisionmaking or actions required by the clinician. Other essential modes of thought such as clinical reasoning, evaluation of evidence, creative thinking, or the application of well-established standards of practice—all distinct from critical reflection—have been subsumed under the rubric of critical thinking. In the nursing education literature, clinical reasoning and judgment are often conflated with critical thinking. The accrediting bodies and nursing scholars have included decisionmaking and action-oriented, practical, ethical, and clinical reasoning in the rubric of critical reflection and thinking. One might say that this harmless semantic confusion is corrected by actual practices, except that students need to understand the distinctions between critical reflection and clinical reasoning, and they need to learn to discern when each is better suited, just as students need to also engage in applying standards, evidence-based practices, and creative thinking.

The growing body of research, patient acuity, and complexity of care demand higher-order thinking skills. Critical thinking involves the application of knowledge and experience to identify patient problems and to direct clinical judgments and actions that result in positive patient outcomes. These skills can be cultivated by educators who display the virtues of critical thinking, including independence of thought, intellectual curiosity, courage, humility, empathy, integrity, perseverance, and fair-mindedness. 9

The process of critical thinking is stimulated by integrating the essential knowledge, experiences, and clinical reasoning that support professional practice. The emerging paradigm for clinical thinking and cognition is that it is social and dialogical rather than monological and individual. 10–12 Clinicians pool their wisdom and multiple perspectives, yet some clinical knowledge can be demonstrated only in the situation (e.g., how to suction an extremely fragile patient whose oxygen saturations sink too low). Early warnings of problematic situations are made possible by clinicians comparing their observations to those of other providers. Clinicians form practice communities that create styles of practice, including ways of doing things, communication styles and mechanisms, and shared expectations about performance and expertise of team members.

By holding up critical thinking as a large umbrella for different modes of thinking, students can easily misconstrue the logic and purposes of different modes of thinking. Clinicians and scientists alike need multiple thinking strategies, such as critical thinking, clinical judgment, diagnostic reasoning, deliberative rationality, scientific reasoning, dialogue, argument, creative thinking, and so on. In particular, clinicians need forethought and an ongoing grasp of a patient’s health status and care needs trajectory, which requires an assessment of their own clarity and understanding of the situation at hand, critical reflection, critical reasoning, and clinical judgment.

Critical Reflection, Critical Reasoning, and Judgment

Critical reflection requires that the thinker examine the underlying assumptions and radically question or doubt the validity of arguments, assertions, and even facts of the case. Critical reflective skills are essential for clinicians; however, these skills are not sufficient for the clinician who must decide how to act in particular situations and avoid patient injury. For example, in everyday practice, clinicians cannot afford to critically reflect on the well-established tenets of “normal” or “typical” human circulatory systems when trying to figure out a particular patient’s alterations from that typical, well-grounded understanding that has existed since Harvey’s work in 1628. 13 Yet critical reflection can generate new scientifically based ideas. For example, there is a lack of adequate research on the differences between women’s and men’s circulatory systems and the typical pathophysiology related to heart attacks. Available research is based upon multiple, taken-for-granted starting points about the general nature of the circulatory system. As such, critical reflection may not provide what is needed for a clinician to act in a situation; critical reflective thinking alone is not sufficient for good clinical reasoning and judgment. The clinician’s development of skillful critical reflection depends upon being taught what to pay attention to, and thus gaining a sense of salience that informs the powers of perceptual grasp. The powers of noticing or perceptual grasp depend upon noticing what is salient and the capacity to respond to the situation.

Critical reflection is a crucial professional skill, but it is not the only reasoning skill or logic clinicians require. The ability to think critically uses reflection, induction, deduction, analysis, challenging assumptions, and evaluation of data and information to guide decisionmaking. 9 , 14 , 15 Critical reasoning is a process whereby knowledge and experience are applied to consider multiple possibilities and achieve the desired goals, 16 while considering the patient’s situation. 14 It is a process where both inductive and deductive cognitive skills are used. 17 Sometimes clinical reasoning is presented as a form of evaluating scientific knowledge, sometimes even as a form of scientific reasoning. Critical thinking is inherent in sound clinical reasoning. 18

An essential point of tension and confusion exists in practice traditions such as nursing and medicine when clinical reasoning and critical reflection become entangled, because the clinician must have some established bases that are not questioned when engaging in clinical decisions and actions, such as standing orders. The clinician must act in the particular situation and time with the best clinical and scientific knowledge available. The clinician cannot afford to indulge in either ritualistic unexamined knowledge or diagnostic or therapeutic nihilism caused by radical doubt, as in critical reflection, because they must find an intelligent and effective way to think and act in particular clinical situations. Critical reflection skills are essential to assist practitioners to rethink outmoded or even wrong-headed approaches to health care, health promotion, and prevention of illness and complications, especially when new evidence is available. Breakdowns in practice, high failure rates in particular therapies, new diseases, new scientific discoveries, and societal changes call for critical reflection about past assumptions and no-longer-tenable beliefs.

Clinical reasoning stands out as a situated, practice-based form of reasoning that requires a background of scientific and technological research-based knowledge about general cases, more so than any particular instance. It also requires practical ability to discern the relevance of the evidence behind general scientific and technical knowledge and how it applies to a particular patient. In doing so, the clinician considers the patient’s particular clinical trajectory, their concerns and preferences, and their particular vulnerabilities (e.g., having multiple comorbidities) and sensitivities to care interventions (e.g., known drug allergies, other conflicting comorbid conditions, incompatible therapies, and past responses to therapies) when forming clinical decisions or conclusions.

Situated in a practice setting, clinical reasoning occurs within social relationships or situations involving patient, family, community, and a team of health care providers. The expert clinician situates themselves within a nexus of relationships, with concerns that are bounded by the situation. Expert clinical reasoning is socially engaged with the relationships and concerns of those who are affected by the caregiving situation and, when one occurs, by an adverse event. Halpern 19 has called excellent clinical ethical reasoning “emotional reasoning” in that the clinicians have emotional access to the patient/family concerns and their understanding of the particular care needs. Expert clinicians also seek an optimal perceptual grasp, one based on understanding and as undistorted as possible, based on an attuned emotional engagement and expert clinical knowledge. 19 , 20

Clergy educators 21 and nursing and medical educators have begun to recognize the wisdom of broadening their narrow vision of rationality beyond simple rational calculation (exemplified by cost-benefit analysis) to reconsider the need for character development—including emotional engagement, perception, habits of thought, and skill acquisition—as essential to the development of expert clinical reasoning, judgment, and action. 10 , 22–24 Practitioners of engineering, law, medicine, and nursing, like the clergy, have to develop a place to stand in their discipline’s tradition of knowledge and science in order to recognize and evaluate salient evidence in the moment. Diagnostic confusion and disciplinary nihilism are both threats to the clinician’s ability to act in particular situations. However, the practice and practitioners will not be self-improving and vital if they cannot engage in critical reflection on what is not of value, what is outmoded, and what does not work. As evidence evolves and expands, so too must clinical thought.

Clinical judgment requires clinical reasoning across time about the particular, and because of the relevance of this immediate historical unfolding, clinical reasoning can be very different from the scientific reasoning used to formulate, conduct, and assess clinical experiments. While scientific reasoning is also socially embedded in a nexus of social relationships and concerns, the goal of detached, critical objectivity used to conduct scientific experiments minimizes the interactive influence of the research on the experiment once it has begun. Scientific research in the natural and clinical sciences typically uses formal criteria to develop “yes” and “no” judgments at prespecified times. The scientist is always situated in past and immediate scientific history, preferring to evaluate static and predetermined points in time (e.g., snapshot reasoning), in contrast to a clinician who must always reason about transitions over time. 25 , 26

Techne and Phronesis

Distinctions between the mere scientific making of things and practice was first explored by Aristotle as distinctions between techne and phronesis. 27 Learning to be a good practitioner requires developing the requisite moral imagination for good practice. If, for example, patients exercise their rights and refuse treatments, practitioners are required to have the moral imagination to understand the probable basis for the patient’s refusal. For example, was the refusal based upon catastrophic thinking, unrealistic fears, misunderstanding, or even clinical depression?

Techne, as defined by Aristotle, encompasses the notion of formation of character and habitus 28 as embodied beings. In Aristotle’s terms, techne refers to the making of things or producing outcomes. 11 Joseph Dunne defines techne as “the activity of producing outcomes,” and it “is governed by a means-ends rationality where the maker or producer governs the thing or outcomes produced or made through gaining mastery over the means of producing the outcomes, to the point of being able to separate means and ends” 11 (p. 54). While some aspects of medical and nursing practice fall into the category of techne, much of nursing and medical practice falls outside means-ends rationality and must be governed by concern for doing good or what is best for the patient in particular circumstances, where being in a relationship and discerning particular human concerns at stake guide action.

Phronesis, in contrast to techne, includes reasoning about the particular, across time, through changes or transitions in the patient’s and/or the clinician’s understanding. As noted by Dunne, phronesis is “characterized at least as much by a perceptiveness with regard to concrete particulars as by a knowledge of universal principles” 11 (p. 273). This type of practical reasoning often takes the form of puzzle solving or the evaluation of immediate past “hot” history of the patient’s situation. Such a particular clinical situation is necessarily particular, even though many commonalities and similarities with other disease syndromes can be recognized through signs and symptoms and laboratory tests. 11 , 29 , 30 Pointing to knowledge embedded in a practice makes no claim for infallibility or “correctness.” Individual practitioners can be mistaken in their judgments because practices such as medicine and nursing are inherently underdetermined. 31

While phronetic knowledge must remain open to correction and improvement by real events and consequences, it cannot consistently transcend the institutional setting’s capacities and supports for good practice. Phronesis is also dependent on ongoing experiential learning of the practitioner, where knowledge is refined, corrected, or refuted. The Western tradition, with the notable exception of Aristotle, valued knowledge that could be made universal and devalued practical know-how and experiential learning. Descartes codified this preference for formal logic and rational calculation.

Aristotle recognized that when knowledge is underdetermined, changeable, and particular, it cannot be turned into the universal or standardized. It must be perceived, discerned, and judged, all of which require experiential learning. In nursing and medicine, perceptual acuity in physical assessment and clinical judgment (i.e., reasoning across time about changes in the particular patient or the clinician’s understanding of the patient’s condition) fall into the Greek Aristotelian category of phronesis. Dewey 32 sought to rescue knowledge gained by practical activity in the world. He identified three flaws in the understanding of experience in Greek philosophy: (1) empirical knowing is the opposite of experience with science; (2) practice is reduced to techne or the application of rational thought or technique; and (3) action and skilled know-how are considered temporary and capricious as compared to reason, which the Greeks considered as ultimate reality.

In practice, nursing and medicine require both techne and phronesis. The clinician standardizes and routinizes what can be standardized and routinized, as exemplified by standardized blood pressure measurements, diagnoses, and even charting about the patient’s condition and treatment. 27 Procedural and scientific knowledge can often be formalized and standardized (e.g., practice guidelines), or at least made explicit and certain in practice, except for the necessary timing and adjustments made for particular patients. 11 , 22

Rational calculations available to techne—population trends and statistics, algorithms—are created as decision support structures and can improve accuracy when used as a stance of inquiry in making clinical judgments about particular patients. Aggregated evidence from clinical trials and ongoing working knowledge of pathophysiology, biochemistry, and genomics are essential. In addition, the skills of phronesis (clinical judgment that reasons across time, taking into account the transitions of the particular patient/family/community and transitions in the clinician’s understanding of the clinical situation) will be required for nursing, medicine, or any helping profession.

Thinking Critically

Being able to think critically enables nurses to meet the needs of patients within their context while considering their preferences; meet the needs of patients within the context of uncertainty; consider alternatives, resulting in higher-quality care; 33 and think reflectively, rather than simply accepting statements and performing tasks without significant understanding and evaluation. 34 Skillful practitioners can think critically because they have the following cognitive skills: information seeking, discriminating, analyzing, transforming knowledge, predicting, applying standards, and logical reasoning. 5 One’s ability to think critically can be affected by age, length of education (e.g., an associate vs. a baccalaureate degree in nursing), and completion of philosophy or logic subjects. 35–37 The skillful practitioner can think critically because of having the following characteristics: motivation, perseverance, fair-mindedness, and deliberate and careful attention to thinking. 5 , 9

Thinking critically implies that one has a knowledge base from which to reason and the ability to analyze and evaluate evidence. 38 Knowledge can be manifest by the logic and rational implications of decisionmaking. Clinical decisionmaking is particularly influenced by interpersonal relationships with colleagues, 39 patient conditions, availability of resources, 40 knowledge, and experience. 41 Of these, experience has been shown to enhance nurses’ abilities to make quick decisions 42 and fewer decision errors, 43 support the identification of salient cues, and foster the recognition and action on patterns of information. 44 , 45

Clinicians must develop the character and relational skills that enable them to perceive and understand their patient’s needs and concerns. This requires accurate interpretation of patient data that is relevant to the specific patient and situation. In nursing, this formation of moral agency focuses on learning to be responsible in particular ways demanded by the practice, and to pay attention and intelligently discern changes in patients’ concerns and/or clinical condition that require action on the part of the nurse or other health care workers to avert potential compromises to quality care.

Formation of the clinician’s character, skills, and habits are developed in schools and particular practice communities within a larger practice tradition. As Dunne notes,

A practice is not just a surface on which one can display instant virtuosity. It grounds one in a tradition that has been formed through an elaborate development and that exists at any juncture only in the dispositions (slowly and perhaps painfully acquired) of its recognized practitioners. The question may of course be asked whether there are any such practices in the contemporary world, whether the wholesale encroachment of Technique has not obliterated them—and whether this is not the whole point of MacIntyre’s recipe of withdrawal, as well as of the post-modern story of dispossession 11 (p. 378).

Clearly Dunne is engaging in critical reflection about the conditions for developing character, skills, and habits for skillful and ethical comportment of practitioners, as well as to act as moral agents for patients so that they and their families receive safe, effective, and compassionate care.

Professional socialization or professional values, while necessary, do not adequately address character and skill formation that transform the way the practitioner exists in his or her world, what the practitioner is capable of noticing and responding to, based upon well-established patterns of emotional responses, skills, dispositions to act, and the skills to respond, decide, and act. 46 The need for character and skill formation of the clinician is what makes a practice stand out from a mere technical, repetitious manufacturing process. 11 , 30 , 47

In nursing and medicine, many have questioned whether current health care institutions are designed to promote or hinder enlightened, compassionate practice, or whether they have deteriorated into commercial institutional models that focus primarily on efficiency and profit. MacIntyre points out the links between the ongoing development and improvement of practice traditions and the institutions that house them:

Lack of justice, lack of truthfulness, lack of courage, lack of the relevant intellectual virtues—these corrupt traditions, just as they do those institutions and practices which derive their life from the traditions of which they are the contemporary embodiments. To recognize this is of course also to recognize the existence of an additional virtue, one whose importance is perhaps most obvious when it is least present, the virtue of having an adequate sense of the traditions to which one belongs or which confront one. This virtue is not to be confused with any form of conservative antiquarianism; I am not praising those who choose the conventional conservative role of laudator temporis acti. It is rather the case that an adequate sense of tradition manifests itself in a grasp of those future possibilities which the past has made available to the present. Living traditions, just because they continue a not-yet-completed narrative, confront a future whose determinate and determinable character, so far as it possesses any, derives from the past 30 (p. 207).

It would be impossible to capture all the situated and distributed knowledge outside of actual practice situations and particular patients. Simulations are powerful teaching tools for developing nurses’ ability to think critically because they give students the opportunity to practice in a simplified environment. However, simulations are limited in their ability to convey underdetermined situations, where much of the information is based on perceptions of many aspects of the patient and changes that have occurred over time. Simulations cannot have the sub-cultures formed in practice settings that set the social mood of trust, distrust, competency, limited resources, or other forms of situated possibilities.

One of the hallmark studies in nursing providing keen insight into understanding the influence of experience was a qualitative study of adult, pediatric, and neonatal intensive care unit (ICU) nurses, where the nurses were clustered into advanced beginner, intermediate, and expert level of practice categories. The advanced beginner (having up to 6 months of work experience) used procedures and protocols to determine which clinical actions were needed. When confronted with a complex patient situation, the advanced beginner felt their practice was unsafe because of a knowledge deficit or because of a knowledge application confusion. The transition from advanced beginners to competent practitioners began when they first had experience with actual clinical situations and could benefit from the knowledge gained from the mistakes of their colleagues. Competent nurses continuously questioned what they saw and heard, feeling an obligation to know more about clinical situations. In doing so, they moved from only using care plans and following the physicians’ orders to analyzing and interpreting patient situations. Beyond that, the proficient nurse acknowledged the changing relevance of clinical situations requiring action beyond what was planned or anticipated. The proficient nurse learned to acknowledge the changing needs of patient care and situation, and could organize interventions “by the situation as it unfolds rather than by preset goals” 48 (p. 24). Both competent and proficient nurses (that is, intermediate level of practice) had at least two years of ICU experience. 48 Finally, the expert nurse had a more fully developed grasp of a clinical situation, a sense of confidence in what is known about the situation, and could differentiate the precise clinical problem in little time. 48

Expertise is acquired through professional experience and is indicative of a nurse who has moved beyond mere proficiency. As Gadamer 29 points out, experience involves a turning around of preconceived notions, preunderstandings, and extends or adds nuances to understanding. Dewey 49 notes that experience requires a prepared “creature” and an enriched environment. The opportunity to reflect and narrate one’s experiential learning can clarify, extend, or even refute experiential learning.

Experiential learning requires time and nurturing, but time alone does not ensure experiential learning. Aristotle linked experiential learning to the development of character and moral sensitivities of a person learning a practice. 50 New nurses/new graduates have limited work experience and must continue learning until they have reached an acceptable level of performance. 51 After that, further improvements are not predictable, and years of experience are an inadequate predictor of expertise. 52

The most effective knower and developer of practical knowledge creates an ongoing dialogue and connection between lessons of the day and experiential learning over time. Gadamer, in a late life interview, highlighted the open-endedness and ongoing nature of experiential learning in the following interview response:

Being experienced does not mean that one now knows something once and for all and becomes rigid in this knowledge; rather, one becomes more open to new experiences. A person who is experienced is undogmatic. Experience has the effect of freeing one to be open to new experience … In our experience we bring nothing to a close; we are constantly learning new things from our experience … this I call the interminability of all experience 32 (p. 403).

Practical endeavor, supported by scientific knowledge, requires experiential learning, the development of skilled know-how, and perceptual acuity in order to make the scientific knowledge relevant to the situation. Clinical perceptual and skilled know-how helps the practitioner discern when particular scientific findings might be relevant. 53

Often experience and knowledge, confirmed by experimentation, are treated as oppositions, an either-or choice. However, in practice it is readily acknowledged that experiential knowledge fuels scientific investigation, and scientific investigation fuels further experiential learning. Experiential learning from particular clinical cases can help the clinician recognize future similar cases and fuel new scientific questions and study. For example, less experienced nurses—and it could be argued experienced as well—can use nursing diagnoses practice guidelines as part of their professional advancement. Guidelines are used to reflect their interpretation of patients’ needs, responses, and situation, 54 a process that requires critical thinking and decisionmaking. 55 , 56 Using guidelines also reflects one’s problem identification and problem-solving abilities. 56 Conversely, the ability to proficiently conduct a series of tasks without nursing diagnoses is the hallmark of expertise. 39 , 57

Experience precedes expertise. As expertise develops from experience and gaining knowledge and transitions to the proficiency stage, the nurses’ thinking moves from steps and procedures (i.e., task-oriented care) toward “chunks” or patterns 39 (i.e., patient-specific care). In doing so, the nurse thinks reflectively, rather than merely accepting statements and performing procedures without significant understanding and evaluation. 34 Expert nurses do not rely on rules and logical thought processes in problem-solving and decisionmaking. 39 Instead, they use abstract principles, can see the situation as a complex whole, perceive situations comprehensively, and can be fully involved in the situation. 48 Expert nurses can perform high-level care without conscious awareness of the knowledge they are using, 39 , 58 and they are able to provide that care with flexibility and speed. Through a combination of knowledge and skills gained from a range of theoretical and experiential sources, expert nurses also provide holistic care. 39 Thus, the best care comes from the combination of theoretical, tacit, and experiential knowledge. 59 , 60

Experts are thought to eventually develop the ability to intuitively know what to do and to quickly recognize critical aspects of the situation. 22 Some have proposed that expert nurses provide high-quality patient care, 61 , 62 but that is not consistently documented—particularly in consideration of patient outcomes—and the differential impact of care rendered by an “expert” nurse is not fully understood. In fact, several studies have found that length of professional experience is often unrelated and even negatively related to performance measures and outcomes. 63 , 64

In a review of the literature on expertise in nursing, Ericsson and colleagues 65 found that focusing on challenging, less-frequent situations would reveal individual performance differences on tasks that require speed and flexibility, such as that experienced during a code or an adverse event. Superior performance was associated with extensive training and immediate feedback about outcomes, which can be obtained through continual training, simulation, and processes such as root-cause analysis following an adverse event. Therefore, efforts to improve performance benefited from continual monitoring, planning, and retrospective evaluation. Even then, the nurse’s ability to perform as an expert is dependent upon their ability to use intuition or insights gained through interactions with patients. 39

Intuition and Perception

Intuition is the instant understanding of knowledge without evidence of sensible thought. 66 According to Young, 67 intuition in clinical practice is a process whereby the nurse recognizes something about a patient that is difficult to verbalize. Intuition is characterized by factual knowledge, “immediate possession of knowledge, and knowledge independent of the linear reasoning process” 68 (p. 23). When intuition is used, one filters information initially triggered by the imagination, leading to the integration of all knowledge and information to problem solve. 69 Clinicians use their interactions with patients and intuition, drawing on tacit or experiential knowledge, 70 , 71 to apply the correct knowledge to make the correct decisions to address patient needs. Yet there is a “conflated belief in the nurses’ ability to know what is best for the patient” 72 (p. 251) because the nurses’ and patients’ identification of the patients’ needs can vary. 73

A review of research and rhetoric involving intuition by King and Appleton 62 found that all nurses, including students, used intuition (i.e., gut feelings). They found evidence, predominately in critical care units, that intuition was triggered in response to knowledge and as a trigger for action and/or reflection with a direct bearing on the analytical process involved in patient care. The challenge for nurses was that rigid adherence to checklists, guidelines, and standardized documentation 62 ignored the benefits of intuition. This view was furthered by Rew and Barrow 68 , 74 in their reviews of the literature, where they found that intuition was imperative to complex decisionmaking, 68 difficult to measure and assess in a quantitative manner, and was not linked to physiologic measures. 74

Intuition is a way of explaining professional expertise. 75 Expert nurses rely on their intuitive judgment that has been developed over time. 39 , 76 Intuition is an informal, nonanalytically based, unstructured, deliberate calculation that facilitates problem solving, 77 a process of arriving at salient conclusions based on relatively small amounts of knowledge and/or information. 78 Experts can have rapid insight into a situation by using intuition to recognize patterns and similarities, achieve commonsense understanding, and sense the salient information combined with deliberative rationality. 10 Intuitive recognition of similarities and commonalities between patients are often the first diagnostic clue or early warning, which must then be followed up with critical evaluation of evidence among the competing conditions. This situation calls for intuitive judgment that can distinguish “expert human judgment from the decisions” made by a novice 79 (p. 23).

Shaw 80 equates intuition with direct perception. Direct perception is dependent upon being able to detect complex patterns and relationships that one has learned through experience are important. Recognizing these patterns and relationships generally occurs rapidly and is complex, making it difficult to articulate or describe. Perceptual skills, like those of the expert nurse, are essential to recognizing current and changing clinical conditions. Perception requires attentiveness and the development of a sense of what is salient. Often in nursing and medicine, means and ends are fused, as is the case for a “good enough” birth experience and a peaceful death.

Applying Practice Evidence

Research continues to find that using evidence-based guidelines in practice, informed through research evidence, improves patients’ outcomes. 81–83 Research-based guidelines are intended to provide guidance for specific areas of health care delivery. 84 The clinician—both the novice and expert—is expected to use the best available evidence for the most efficacious therapies and interventions in particular instances, to ensure the highest-quality care, especially when deviations from the evidence-based norm may heighten risks to patient safety. Otherwise, if nursing and medicine were exact sciences, or consisted only of techne, then a 1:1 relationship could be established between results of aggregated evidence-based research and the best path for all patients.

Evaluating Evidence

Before research can be used in practice, it must be evaluated. There are many complexities and nuances in evaluating the research evidence for clinical practice. Evaluation of research behind evidence-based medicine requires critical thinking and good clinical judgment. Sometimes the research findings are mixed or even conflicting. As such, the validity, reliability, and generalizability of available research are fundamental to evaluating whether evidence can be applied in practice. To do so, clinicians must select the best scientific evidence relevant to particular patients—a complex process that involves intuition to apply the evidence. Critical thinking is required for evaluating the best available scientific evidence for the treatment and care of a particular patient.

Good clinical judgment is required to select the most relevant research evidence. The best clinical judgment, that is, reasoning across time about the particular patient through changes in the patient’s concerns and condition and/or the clinician’s understanding, is also required. This type of judgment requires clinicians to make careful observations and evaluations of the patient over time, as well as know the patient’s concerns and social circumstances. To evolve to this level of judgment, additional education beyond clinical preparation is often required.

Sources of Evidence

Evidence that can be used in clinical practice has different sources and can be derived from research, patient’s preferences, and work-related experience. 85 , 86 Nurses have been found to obtain evidence from experienced colleagues believed to have clinical expertise and research-based knowledge 87 as well as other sources.

For many years now, randomized controlled trials (RCTs) have often been considered the best standard for evaluating clinical practice. Yet, unless the common threats to the validity (e.g., representativeness of the study population) and reliability (e.g., consistency in interventions and responses of study participants) of RCTs are addressed, the meaningfulness and generalizability of the study outcomes are very limited. Relevant patient populations may be excluded, such as women, children, minorities, the elderly, and patients with multiple chronic illnesses. The dropout rate of the trial may confound the results. And it is easier to get positive results published than it is to get negative results published. Thus, RCTs are generalizable (i.e., applicable) only to the population studied—which may not reflect the needs of the patient under the clinician’s care. In instances such as these, clinicians need to also consider applied research using prospective or retrospective populations with case control to guide decisionmaking, yet this too requires critical thinking and good clinical judgment.

Another source of available evidence may come from the gold standard of aggregated systematic evaluation of clinical trial outcomes for the therapy and clinical condition in question, be generated by basic and clinical science relevant to the patient’s particular pathophysiology or care need situation, or stem from personal clinical experience. The clinician then takes all of the available evidence and considers the particular patient’s known clinical responses to past therapies, their clinical condition and history, the progression or stages of the patient’s illness and recovery, and available resources.

In clinical practice, the particular is examined in relation to the established generalizations of science. With readily available summaries of scientific evidence (e.g., systematic reviews and practice guidelines) available to nurses and physicians, one might wonder whether deep background understanding is still advantageous. Might it not be expendable, since it is likely to be out of date given the current scientific evidence? But this assumption is a false opposition and false choice because without a deep background understanding, the clinician does not know how to best find and evaluate scientific evidence for the particular case in hand. The clinician’s sense of salience in any given situation depends on past clinical experience and current scientific evidence.

Evidence-Based Practice

The concept of evidence-based practice is dependent upon synthesizing evidence from the variety of sources and applying it appropriately to the care needs of populations and individuals. This implies that evidence-based practice, indicative of expertise in practice, appropriately applies evidence to the specific situations and unique needs of patients. 88 , 89 Unfortunately, even though providing evidence-based care is an essential component of health care quality, it is well known that evidence-based practices are not used consistently.

Conceptually, evidence used in practice advances clinical knowledge, and that knowledge supports independent clinical decisions in the best interest of the patient. 90 , 91 Decisions must prudently consider the factors not necessarily addressed in the guideline, such as the patient’s lifestyle, drug sensitivities and allergies, and comorbidities. Nurses who want to improve the quality and safety of care can do so through improving the consistency of data and information interpretation inherent in evidence-based practice.

Initially, before evidence-based practice can begin, there needs to be an accurate clinical judgment of patient responses and needs. In the course of providing care, with careful consideration of patient safety and quality care, clinicians must give attention to the patient’s condition, their responses to health care interventions, and potential adverse reactions or events that could harm the patient. Nonetheless, there is wide variation in the ability of nurses to accurately interpret patient responses 92 and their risks. 93 Even though variance in interpretation is expected, nurses are obligated to continually improve their skills to ensure that patients receive quality care safely. 94 Patients are vulnerable to the actions and experience of their clinicians, which are inextricably linked to the quality of care patients have access to and subsequently receive.

The judgment of the patient’s condition determines subsequent interventions and patient outcomes. Attaining accurate and consistent interpretations of patient data and information is difficult because each piece can have different meanings, and interpretations are influenced by previous experiences. 95 Nurses use knowledge from clinical experience 96 , 97 and—although infrequently—research. 98–100

Once a problem has been identified, using a process that utilizes critical thinking to recognize the problem, the clinician then searches for and evaluates the research evidence 101 and evaluates potential discrepancies. The process of using evidence in practice involves “a problem-solving approach that incorporates the best available scientific evidence, clinicians’ expertise, and patient’s preferences and values” 102 (p. 28). Yet many nurses do not perceive that they have the education, tools, or resources to use evidence appropriately in practice. 103

Reported barriers to using research in practice have included difficulty in understanding the applicability and the complexity of research findings, failure of researchers to put findings into the clinical context, lack of skills in how to use research in practice, 104 , 105 amount of time required to access information and determine practice implications, 105–107 lack of organizational support to make changes and/or use in practice, 104 , 97 , 105 , 107 and lack of confidence in one’s ability to critically evaluate clinical evidence. 108

When Evidence Is Missing

In many clinical situations, there may be no clear guidelines and few or even no relevant clinical trials to guide decisionmaking. In these cases, the latest basic science about cellular and genomic functioning may be the most relevant science, or, by default, guesstimation. Consequently, good patient care requires more than a straightforward, unequivocal application of scientific evidence. The clinician must be able to draw on a good understanding of basic sciences, as well as guidelines derived from aggregated data and information from research investigations.

Practical knowledge is shaped by one’s practice discipline and the science and technology relevant to the situation at hand. But scientific, formal, discipline-specific knowledge are not sufficient for good clinical practice, whether the discipline be law, medicine, nursing, teaching, or social work. Practitioners still have to learn how to discern generalizable scientific knowledge, know how to use scientific knowledge in practical situations, discern what scientific evidence/knowledge is relevant, assess how the particular patient’s situation differs from the general scientific understanding, and recognize the complexity of care delivery—a process that is complex, ongoing, and changing, as new evidence can overturn old.

Practice communities, like individual practitioners, may also be mistaken, as is illustrated by variability in practice styles and practice outcomes across hospitals and regions in the United States. This variability in practice is why practitioners must learn to critically evaluate their practice and continually improve it over time. The goal is to create a living self-improving tradition.

Within health care, students, scientists, and practitioners are challenged to learn and use different modes of thinking when they are conflated under one term or rubric, using the best-suited thinking strategies for taking into consideration the purposes and the ends of the reasoning. Learning to be an effective, safe nurse or physician requires not only technical expertise, but also the ability to form helping relationships and engage in practical ethical and clinical reasoning. 50 Good ethical comportment requires that both the clinician and the scientist take into account the notions of good inherent in clinical and scientific practices. The notions of good clinical practice must include the relevant significance and the human concerns involved in decisionmaking in particular situations, centered on clinical grasp and clinical forethought.

The Three Apprenticeships of Professional Education

We have much to learn in comparing the pedagogies of formation across the professions, such as is being done currently by the Carnegie Foundation for the Advancement of Teaching. The Carnegie Foundation’s broad research program on the educational preparation of the profession focuses on three essential apprenticeships:

To capture the full range of crucial dimensions in professional education, we developed the idea of a three-fold apprenticeship: (1) intellectual training to learn the academic knowledge base and the capacity to think in ways important to the profession; (2) a skill-based apprenticeship of practice; and (3) an apprenticeship to the ethical standards, social roles, and responsibilities of the profession, through which the novice is introduced to the meaning of an integrated practice of all dimensions of the profession, grounded in the profession’s fundamental purposes. 109

This framework has allowed the investigators to describe tensions and shortfalls as well as strengths of widespread teaching practices, especially at articulation points among these dimensions of professional training.

Research has demonstrated that these three apprenticeships are taught best when they are integrated so that the intellectual training includes skilled know-how, clinical judgment, and ethical comportment. In the study of nursing, exemplary classroom and clinical teachers were found who do integrate the three apprenticeships in all of their teaching, as exemplified by the following anonymous student’s comments:

With that as well, I enjoyed the class just because I do have clinical experience in my background and I enjoyed it because it took those practical applications and the knowledge from pathophysiology and pharmacology, and all the other classes, and it tied it into the actual aspects of like what is going to happen at work. For example, I work in the emergency room and question: Why am I doing this procedure for this particular patient? Beforehand, when I was just a tech and I wasn’t going to school, I’d be doing it because I was told to be doing it—or I’d be doing CPR because, you know, the doc said, start CPR. I really enjoy the Care and Illness because now I know the process, the pathophysiological process of why I’m doing it and the clinical reasons of why they’re making the decisions, and the prioritization that goes on behind it. I think that’s the biggest point. Clinical experience is good, but not everybody has it. Yet when these students transition from school and clinicals to their job as a nurse, they will understand what’s going on and why.

The three apprenticeships are equally relevant and intertwined. In the Carnegie National Study of Nursing Education and the companion study on medical education as well as in cross-professional comparisons, teaching that gives an integrated access to professional practice is being examined. Once the three apprenticeships are separated, it is difficult to reintegrate them. The investigators are encouraged by teaching strategies that integrate the latest scientific knowledge and relevant clinical evidence with clinical reasoning about particular patients in unfolding rather than static cases, while keeping the patient and family experience and concerns relevant to clinical concerns and reasoning.

Clinical judgment or phronesis is required to evaluate and integrate techne and scientific evidence.

Within nursing, professional practice is wise and effective usually to the extent that the professional creates relational and communication contexts where clients/patients can be open and trusting. Effectiveness depends upon mutual influence between patient and practitioner, student and learner. This is another way in which clinical knowledge is dialogical and socially distributed. The following articulation of practical reasoning in nursing illustrates the social, dialogical nature of clinical reasoning and addresses the centrality of perception and understanding to good clinical reasoning, judgment and intervention.

Clinical Grasp *

Clinical grasp describes clinical inquiry in action. Clinical grasp begins with perception and includes problem identification and clinical judgment across time about the particular transitions of particular patients. Garrett Chan 20 described the clinician’s attempt at finding an “optimal grasp” or vantage point of understanding. Four aspects of clinical grasp, which are described in the following paragraphs, include (1) making qualitative distinctions, (2) engaging in detective work, (3) recognizing changing relevance, and (4) developing clinical knowledge in specific patient populations.

Making Qualitative Distinctions

Qualitative distinctions refer to those distinctions that can be made only in a particular contextual or historical situation. The context and sequence of events are essential for making qualitative distinctions; therefore, the clinician must pay attention to transitions in the situation and judgment. Many qualitative distinctions can be made only by observing differences through touch, sound, or sight, such as the qualities of a wound, skin turgor, color, capillary refill, or the engagement and energy level of the patient. Another example is assessing whether the patient was more fatigued after ambulating to the bathroom or from lack of sleep. Likewise the quality of the clinician’s touch is distinct as in offering reassurance, putting pressure on a bleeding wound, and so on. 110

Engaging in Detective Work, Modus Operandi Thinking, and Clinical Puzzle Solving

Clinical situations are open ended and underdetermined. Modus operandi thinking keeps track of the particular patient, the way the illness unfolds, the meanings of the patient’s responses as they have occurred in the particular time sequence. Modus operandi thinking requires keeping track of what has been tried and what has or has not worked with the patient. In this kind of reasoning-in-transition, gains and losses of understanding are noticed and adjustments in the problem approach are made.

We found that teachers in a medical surgical unit at the University of Washington deliberately teach their students to engage in “detective work.” Students are given the daily clinical assignment of “sleuthing” for undetected drug incompatibilities, questionable drug dosages, and unnoticed signs and symptoms. For example, one student noted that an unusual dosage of a heart medication was being given to a patient who did not have heart disease. The student first asked her teacher about the unusually high dosage. The teacher, in turn, asked the student whether she had asked the nurse or the patient about the dosage. Upon the student’s questioning, the nurse did not know why the patient was receiving the high dosage and assumed the drug was for heart disease. The patient’s staff nurse had not questioned the order. When the student asked the patient, the student found that the medication was being given for tremors and that the patient and the doctor had titrated the dosage for control of the tremors. This deliberate approach to teaching detective work, or modus operandi thinking, has characteristics of “critical reflection,” but stays situated and engaged, ferreting out the immediate history and unfolding of events.

Recognizing Changing Clinical Relevance

The meanings of signs and symptoms are changed by sequencing and history. The patient’s mental status, color, or pain level may continue to deteriorate or get better. The direction, implication, and consequences for the changes alter the relevance of the particular facts in the situation. The changing relevance entailed in a patient transitioning from primarily curative care to primarily palliative care is a dramatic example, where symptoms literally take on new meanings and require new treatments.

Developing Clinical Knowledge in Specific Patient Populations

Extensive experience with a specific patient population or patients with particular injuries or diseases allows the clinician to develop comparisons, distinctions, and nuanced differences within the population. The comparisons between many specific patients create a matrix of comparisons for clinicians, as well as a tacit, background set of expectations that create population- and patient-specific detective work if a patient does not meet the usual, predictable transitions in recovery. What is in the background and foreground of the clinician’s attention shifts as predictable changes in the patient’s condition occur, such as is seen in recovering from heart surgery or progressing through the predictable stages of labor and delivery. Over time, the clinician develops a deep background understanding that allows for expert diagnostic and intervention skills.

Clinical Forethought

Clinical forethought is intertwined with clinical grasp, but it is much more deliberate and even routinized than clinical grasp. Clinical forethought is a pervasive habit of thought and action in nursing practice, and also in medicine, as clinicians think about disease and recovery trajectories and the implications of these changes for treatment. Clinical forethought plays a role in clinical grasp because it structures the practical logic of clinicians. At least four habits of thought and action are evident in what we are calling clinical forethought: (1) future think, (2) clinical forethought about specific patient populations, (3) anticipation of risks for particular patients, and (4) seeing the unexpected.

Future think

Future think is the broadest category of this logic of practice. Anticipating likely immediate futures helps the clinician make good plans and decisions about preparing the environment so that responding rapidly to changes in the patient is possible. Without a sense of salience about anticipated signs and symptoms and preparing the environment, essential clinical judgments and timely interventions would be impossible in the typically fast pace of acute and intensive patient care. Future think governs the style and content of the nurse’s attentiveness to the patient. Whether in a fast-paced care environment or a slower-paced rehabilitation setting, thinking and acting with anticipated futures guide clinical thinking and judgment. Future think captures the way judgment is suspended in a predictive net of anticipation and preparing oneself and the environment for a range of potential events.

Clinical forethought about specific diagnoses and injuries

This habit of thought and action is so second nature to the experienced nurse that the new or inexperienced nurse may have difficulty finding out about what seems to other colleagues as “obvious” preparation for particular patients and situations. Clinical forethought involves much local specific knowledge about who is a good resource and how to marshal support services and equipment for particular patients.

Examples of preparing for specific patient populations are pervasive, such as anticipating the need for a pacemaker during surgery and having the equipment assembled ready for use to save essential time. Another example includes forecasting an accident victim’s potential injuries, and recognizing that intubation might be needed.

Anticipation of crises, risks, and vulnerabilities for particular patients

This aspect of clinical forethought is central to knowing the particular patient, family, or community. Nurses situate the patient’s problems almost like a topography of possibilities. This vital clinical knowledge needs to be communicated to other caregivers and across care borders. Clinical teaching could be improved by enriching curricula with narrative examples from actual practice, and by helping students recognize commonly occurring clinical situations in the simulation and clinical setting. For example, if a patient is hemodynamically unstable, then managing life-sustaining physiologic functions will be a main orienting goal. If the patient is agitated and uncomfortable, then attending to comfort needs in relation to hemodynamics will be a priority. Providing comfort measures turns out to be a central background practice for making clinical judgments and contains within it much judgment and experiential learning.

When clinical teaching is too removed from typical contingencies and strong clinical situations in practice, students will lack practice in active thinking-in-action in ambiguous clinical situations. In the following example, an anonymous student recounted her experiences of meeting a patient:

I was used to different equipment and didn’t know how things went, didn’t know their routine, really. You can explain all you want in class, this is how it’s going to be, but when you get there … . Kim was my first instructor and my patient that she assigned me to—I walked into the room and he had every tube imaginable. And so I was a little overwhelmed. It’s not necessarily even that he was that critical … . She asked what tubes here have you seen? Well, I know peripheral lines. You taught me PICC [peripherally inserted central catheter] lines, and we just had that, but I don’t really feel comfortable doing it by myself, without you watching to make sure that I’m flushing it right and how to assess it. He had a chest tube and I had seen chest tubes, but never really knew the depth of what you had to assess and how you make sure that it’s all kosher and whatever. So she went through the chest tube and explained, it’s just bubbling a little bit and that’s okay. The site, check the site. The site looked okay and that she’d say if it wasn’t okay, this is what it might look like … . He had a feeding tube. I had done feeding tubes but that was like a long time ago in my LPN experiences schooling. So I hadn’t really done too much with the feeding stuff either … . He had a [nasogastric] tube, and knew pretty much about that and I think at the time it was clamped. So there were no issues with the suction or whatever. He had a Foley catheter. He had a feeding tube, a chest tube. I can’t even remember but there were a lot.

As noted earlier, a central characteristic of a practice discipline is that a self-improving practice requires ongoing experiential learning. One way nurse educators can enhance clinical inquiry is by increasing pedagogies of experiential learning. Current pedagogies for experiential learning in nursing include extensive preclinical study, care planning, and shared postclinical debriefings where students share their experiential learning with their classmates. Experiential learning requires open learning climates where students can discuss and examine transitions in understanding, including their false starts or misconceptions in actual clinical situations. Nursing educators typically develop open and interactive clinical learning communities, so that students feel committed to helping their classmates learn from experiences that may have been difficult or even unsafe. One anonymous nurse educator described how students extend their experiential learning to their classmates during a postclinical conference:

So for example, the patient had difficulty breathing and the student wanted to give the meds instead of addressing the difficulty of breathing. Well, while we were sharing information about their patients, what they did that day, I didn’t tell the student to say this, but she said, ‘I just want to tell you what I did today in clinical so you don’t do the same thing, and here’s what happened.’ Everybody’s listening very attentively and they were asking her some questions. But she shared that. She didn’t have to. I didn’t tell her, you must share that in postconference or anything like that, but she just went ahead and shared that, I guess, to reinforce what she had learned that day but also to benefit her fellow students in case that thing comes up with them.

The teacher’s response to this student’s honesty and generosity exemplifies her own approach to developing an open community of learning. Focusing only on performance and on “being correct” prevents learning from breakdown or error and can dampen students’ curiosity and courage to learn experientially.

Seeing the unexpected

One of the keys to becoming an expert practitioner lies in how the person holds past experiential learning and background habitual skills and practices. This is a skill of foregrounding attention accurately and effectively in response to the nature of situational demands. Bourdieu 29 calls the recognition of the situation central to practical reasoning. If nothing is routinized as a habitual response pattern, then practitioners will not function effectively in emergencies. Unexpected occurrences may be overlooked. However, if expectations are held rigidly, then subtle changes from the usual will be missed, and habitual, rote responses will inappropriately rule. The clinician must be flexible in shifting between what is in background and foreground. This is accomplished by staying curious and open. The clinical “certainty” associated with perceptual grasp is distinct from the kind of “certainty” achievable in scientific experiments and through measurements. Recognition of similar or paradigmatic clinical situations is similar to “face recognition” or recognition of “family resemblances.” This concept is subject to faulty memory, false associative memories, and mistaken identities; therefore, such perceptual grasp is the beginning of curiosity and inquiry and not the end. Assessment and validation are required. In rapidly moving clinical situations, perceptual grasp is the starting point for clarification, confirmation, and action. Having the clinician say out loud how he or she is understanding the situation gives an opportunity for confirmation and disconfirmation from other clinicians present. 111 The relationship between foreground and background of attention needs to be fluid, so that missed expectations allow the nurse to see the unexpected. For example, when the background rhythm of a cardiac monitor changes, the nurse notices, and what had been background tacit awareness becomes the foreground of attention. 
A hallmark of expertise is the ability to notice the unexpected. 20 With experience, tacit expectations of usual patient trajectories form, enabling the nurse to notice subtle failed expectations and to attend to early signs of unexpected changes in the patient's condition. Clinical expectations gained from caring for similar patient populations become a tacit clinical forethought that enables the experienced clinician to notice missed expectations. Alterations from implicit or explicit expectations set the stage for experiential learning, depending on the openness of the learner.

Learning to provide safe and quality health care requires technical expertise, the ability to think critically, experience, and clinical judgment. The high-performance expectation of nurses is dependent upon the nurses’ continual learning, professional accountability, independent and interdependent decisionmaking, and creative problem-solving abilities.

This section of the paper was condensed and paraphrased from Benner, Hooper-Kyriakidis, and Stannard. 23 Patricia Hooper-Kyriakidis wrote the section on clinical grasp, and Patricia Benner wrote the section on clinical forethought.

Citation: Benner P, Hughes RG, Sutphen M. Clinical Reasoning, Decisionmaking, and Action: Thinking Critically and Clinically. In: Hughes RG, editor. Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Rockville (MD): Agency for Healthcare Research and Quality (US); 2008 Apr. Chapter 6.

Critical thinking definition


Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

Critical thinking is the active and skillful analysis, assessment, synthesis, and evaluation of information gathered from observation, knowledge, reflection, or conversation, used as a guide to belief and action. This is why it is so often emphasized in education and academia.

Some even view it as the backbone of modern thought.

However, it is a skill, and skills must be trained and practiced to reach their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help you plan your paper and make it more concise, though its role is not obvious at first. Here are some of the questions you should ask yourself when bringing critical thinking to your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Critical thinking applies not only to the outline of your paper; it also raises the question: how can we use critical thinking to solve problems within the paper's topic itself?

Say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You will first have to define critical thinking for your audience, and then use critical thinking questions and related terms to familiarize them with your methods and the thinking process behind them.

Are there any services that can help me use more critical thinking?

We understand that it's difficult to learn how to use critical thinking more effectively in just one article, but our service is here to help.

We are a team specializing in writing essays and other assignments for college students and anyone else who needs a helping hand. We cover a wide range of topics, offer quality work, always deliver on time, and aim to leave our customers completely satisfied with what they ordered.

The ordering process is fully online, and it goes as follows:

  • Select the topic and the deadline of your essay.
  • Provide us with any details, requirements, statements that should be emphasized, or particular parts of the essay writing process you struggle with.
  • Leave the email address where your completed order will be sent.
  • Select your preferred payment type, sit back, and relax!

With lots of experience on the market, professionally degreed essay writers , online 24/7 customer support, and incredibly low prices, you won't find a service offering a better deal than ours.


Key elements of Critical Thinking

What is critical thinking?

Critical thinking is the process of actively, and skilfully, analysing, evaluating, and synthesizing information to make reasoned and well-informed decisions or judgments.

How does critical thinking work?

Critical thinking involves the ability to objectively assess arguments, evidence, and ideas, identifying strengths and weaknesses, and arriving at logical and rational conclusions.

What are the key elements of critical thinking?

  • Analysis: Carefully examining information and breaking it down into its components or parts to understand its meaning and implications.
  • Evaluation: Assessing the quality, relevance, and reliability of information, sources, or arguments to determine their credibility and validity.
  • Inference: Drawing logical conclusions based on available evidence and sound reasoning.
  • Deduction: Making specific conclusions based on general principles or premises.
  • Induction: Forming theories based on specific observations or evidence.
  • Problem-solving: Applying critical thinking skills to identify and solve problems effectively.
  • Scepticism: Questioning assumptions, biases, and preconceptions, and being open to alternative viewpoints and perspectives.
  • Decision-making: Using critical thinking to make informed decisions based on a careful evaluation of the available information.
  • Communication: Expressing ideas and arguments clearly and logically, supporting them with evidence and reasoning.

Why is critical thinking an essential skill in the workplace?

Critical thinking is an essential skill in various aspects of life, including education, professional settings, and everyday situations. It enables individuals to make well-reasoned judgments, avoid fallacies and biases, and navigate complex issues with a balanced and thoughtful approach.

Developing critical thinking skills can lead to better problem-solving, enhanced creativity, and improved decision-making abilities.

Develop critical thinking skills is a micro-credential available at the Australian Qualifications Institute , an RTO specialising in business, leadership, and HR that offers micro-credentials in these areas. Build your skills using micro-credentials and, if you choose, build those skills toward a Nationally Recognised Diploma . Get in touch with the Australian Qualifications Institute at [email protected] for personalised counselling on broadening your framework of skills with micro-credentials.


American University

The impact of critical policy evaluation professional development on education policy practitioner's knowledge

Shifts in the sociopolitical environment further spotlighted the need for reforming our nation’s educational inequity problems (Hernández, 2020). As policy change or policy reforms target inequities, the goal is to improve student outcomes and access to opportunity. Policy evaluation in education systems is an important and often underutilized tool in reforming public problems (Golden, 2020). Policy evaluation illuminates the efficacy of policy reforms, clarifying what is working and why it works (Golden, 2020). Today, policymakers are using policy reforms as a tool to combat inequities, without supporting policy practitioners, analysts, and researchers with the capacity, mindsets, and infrastructure for the effective evaluation of the impacts of policy reforms, thus limiting public accountability and effective policy learning opportunities (Golden, 2020). This limits the progress of equity-focused policy reforms in public schools because society never realizes the full possibilities for impacting student outcomes and dismantling structures of injustice (Golden, 2020). As equity gaps persist in our public education systems, there is a need for a professional development training and policy evaluation tool that: (1) Expands policy practitioners’ evaluative thinking skills and policy evaluation best practices. (2) Builds on the tenets of evaluation culture to grow evaluation capacity. (3) Analyzes equity-based reforms through an outcomes-based lens. This study utilizes a mixed-methods approach to understand how policy practitioners engage with the methodology and tools of the professional development and the critical outcomes-based policy evaluation tool, and the impact that learning and using the evaluation tool had on their evaluative thinking. The research explored the following questions: (1) In what ways do critical outcomes-based policy evaluation tools and methods impact policy practitioners’ knowledge?
(2) How do practitioners engage with a critical outcomes-based policy evaluation tool? (3) How do leadership conditions and structures support practitioners in engaging in critical outcomes-based policy evaluation? The study found that the professional development component improved practitioners’ knowledge, skills, and mindsets around evaluative thinking and evaluation culture. Pretest-to-posttest scores highlighted the efficacy of the training and professional development component with practitioners. The impact of the tool itself was inconclusive because of the lack of full implementation among the participants. However, the findings revealed that policy practitioners engaged with the evaluation tool pragmatically, implementing the parts and sections that aligned with their current needs and work streams and finding this favorable given the demands of their roles. Leadership conditions were found to be a significant factor in the tool’s successful implementation and use. With partisan politics ever-present and shaping the policy priorities of some evaluation organizations and agencies, the support and acceptance of leadership favorable to these practices will promote the use of this type of evaluation. The recommendations for moving this work forward are increasing the focus of the intervention on organizational leadership in policy organizations, providing policy practitioners with more support in political navigation, and developing additional versions of the policy evaluation tool that support feasible and practical use.



Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

  • 2.1 Dewey’s Three Main Examples
  • 2.2 Dewey’s Other Examples
  • 2.3 Further Examples
  • 2.4 Non-Examples
  • 3. The Definition of Critical Thinking
  • 4. Its Value
  • 5. The Process of Thinking Critically
  • 6. Components of the Process
  • 7. Contributory Dispositions and Abilities
  • 8.1 Initiating Dispositions
  • 8.2 Internal Dispositions
  • 9. Critical Thinking Abilities
  • 10. Required Knowledge
  • 11. Educational Methods
  • 12.1 The Generalizability of Critical Thinking
  • 12.2 Bias in Critical Thinking Theory and Pedagogy
  • 12.3 Relationship of Critical Thinking to Other Types of Thinking
  • Other Internet Resources
  • Related Entries

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment. Political and business leaders endorse its importance.

For details on this history, see the Supplement on History .

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o'clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68-69; 1933: 91-92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot's position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69-70; 1933: 92-93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
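Dewey’s inference that heated air expanded can be made quantitative with Charles’s law, V₁/T₁ = V₂/T₂ at constant pressure. The sketch below is an illustration added here, not part of Dewey’s text; the two temperatures are assumed values for room air and for air warmed by a tumbler fresh from hot suds.

```python
# Hypothetical temperatures chosen for illustration only.
T_COLD = 293.15  # room-temperature air trapped in the tumbler, K (20 C)
T_HOT = 333.15   # the same air warmed by the hot glass, K (60 C)

# Charles's law at constant pressure: V2 / V1 = T2 / T1.
expansion_ratio = T_HOT / T_COLD
print(f"volume ratio: {expansion_ratio:.3f}")  # about a 14% expansion
```

Even this modest expansion is enough to force air out past the seal of soapy water, producing the bubbles Dewey observed; on cooling, the contraction reverses the flow, just as his ice-cup test confirmed.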

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
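The 33-foot figure in Dewey’s account follows from hydrostatics: atmospheric pressure can support a water column of height h = P/(ρg), so suction fails beyond that height. The following check is an added illustration using standard sea-level values, not a computation from the source.

```python
# Standard sea-level figures (assumed for this illustration).
P_ATM = 101_325.0   # atmospheric pressure, Pa
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

# Maximum column height a suction pump can raise: h = P / (rho * g).
h_metres = P_ATM / (RHO_WATER * G)
h_feet = h_metres / 0.3048
print(f"max lift: {h_metres:.2f} m (~{h_feet:.1f} ft)")
```

The result lands at roughly 34 feet, matching the observed sea-level maximum; at higher elevations P_ATM is smaller, which is exactly the dependence the scientist in Dewey’s example isolates.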

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond line from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash: A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate: Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992).
Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009), others on the resulting judgment (Facione 1990a), and still others on the subsequent emotive response (Siegel 1988).

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions, in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. 
Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing: One notices something in one’s immediate environment (sudden cooling of temperature in Weather, bubbles forming outside a glass and then going inside in Bubbles, a moving blur in the distance in Blur, a rash in Rash). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder, no suction without air pressure in Suction pump).
  • Feeling: One feels puzzled or uncertain about something (how to get to an appointment on time in Transit, why the diamonds vary in frequency in Diamond). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit, diamonds closer when needed as a warning in Diamond).
  • Wondering: One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles, how suction pumps work in Suction pump, what caused the rash in Rash).
  • Imagining: One thinks of possible answers (bus or subway or elevated in Transit, flagpole or ornament or wireless communication aid or direction indicator in Ferryboat, allergic reaction or heat rash in Rash).
  • Inferring: One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder, earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit, burglary in Disorder, discontinue blood pressure medication and new cream in Rash).
  • Knowledge: One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit, of the requirements for a flagpole in Ferryboat, of Boyle’s law in Bubbles, of allergic reactions in Rash).
  • Experimenting: One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat, putting an ice cube on top of a tumbler taken from hot water in Bubbles, measuring the height to which a suction pump will draw water at different elevations in Suction pump, noticing the frequency of diamonds when movement to or from a diamond lane is allowed in Diamond).
  • Consulting: One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments: One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate. It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging: One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding: One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Critical thinking dispositions can usefully be divided into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started) (Facione 1990a: 25). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness: One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry: Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence: Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage: Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness: A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking.
  • Willingness to suspend judgment: Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason: Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth: If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5 . The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit , Ferryboat and Disorder ), others from something observed (as in Weather and Rash ). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). 
Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash . Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate . Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate . The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills : Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), and Black (2012).

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). 
It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work.

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment .

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? Abrami et al. (2015) found that in the experimental and quasi-experimental studies that they analyzed, dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods .

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), and Bailin et al. (1999b).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not, however, extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker . One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat , requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate , requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Casserly, Megan, 2012, “The 10 Skills That Will Get You Hired in 2013”, Forbes , Dec. 10, 2012. Available at https://www.forbes.com/sites/meghancasserly/2012/12/10/the-10-skills-that-will-get-you-a-job-in-2013/#79e7ff4e633d ; accessed 2017 11 06.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; accessed 2017 09 26.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; accessed 2018 04 09.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; accessed 2018 04 14.
  • Dumke, Glenn S., 1980, Chancellor’s Executive Order 338 , Long Beach, CA: California State University, Chancellor’s Office. Available at https://www.calstate.edu/eo/EO-338.pdf ; accessed 2017 11 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; accessed 2017 12 02.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test—College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://drive.google.com/file/d/0BzUoP_pmwy1gdEpCR05PeW9qUzA/view ; accessed 2017 12 01.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P, Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • Obama, Barack, 2014, State of the Union Address , January 28, 2014. [ Obama 2014 available online ]
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Information available at http://www.ocr.org.uk/qualifications/as-a-level-gce-critical-thinking-h052-h452/ ; accessed 2017 10 12.
  • OECD [Organization for Economic Cooperation and Development] Centre for Educational Research and Innovation, 2018, Fostering and Assessing Students’ Creative and Critical Thinking Skills in Higher Education , Paris: OECD. Available at http://www.oecd.org/education/ceri/Fostering-and-assessing-students-creative-and-critical-thinking-skills-in-higher-education.pdf ; accessed 2018 04 22.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; accessed 2017 11 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; accessed 2017 11 29.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2011, Curriculum for the Compulsory School, Preschool Class and the Recreation Centre , Stockholm: Ordförrådet AB. Available at http://malmo.se/download/18.29c3b78a132728ecb52800034181/pdf2687.pdf ; accessed 2017 11 16.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp 195–237.
  • Stanovich Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Warren, Karen J., 1988, “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763
  • Association for Informal Logic and Critical Thinking (AILACT)
  • Center for Teaching Thinking (CTT)
  • Critical Thinking Across the European Higher Education Curricula (CRITHINKEDU)
  • Critical Thinking Definition, Instruction, and Assessment: A Rigorous Approach (criticalTHINKING.net)
  • Critical Thinking Research (RAIL)
  • Foundation for Critical Thinking
  • Insight Assessment
  • Partnership for 21st Century Learning (P21)
  • The Critical Thinking Consortium
  • The Nature of Critical Thinking: An Outline of Critical Thinking Dispositions and Abilities , by Robert H. Ennis

abilities | bias, implicit | children, philosophy for | civic education | decision-making capacity | Dewey, John | dispositions | education, philosophy of | epistemology: virtue | logic: informal

Copyright © 2018 by David Hitchcock <hitchckd@mcmaster.ca>


medRxiv

OpenSAFELY: Effectiveness of COVID-19 vaccination in children and adolescents

Colm D Andrews (correspondence), Edward P K Parker, Andrea L Schaffer, Amelia CA Green, Helen J Curtis, Alex J Walker, Lucy Bridges, Christopher Wood, Christopher Bates, Jonathan Cockburn, Amir Mehrkar, Brian MacKenna, Sebastian CJ Bacon, Ben Goldacre, Jonathan AC Sterne, William J Hulme

Background Children and adolescents in England were offered BNT162b2 as part of the national COVID-19 vaccine roll out from September 2021. We assessed the safety and effectiveness of first and second dose BNT162b2 COVID-19 vaccination in children and adolescents in England.

Methods With the approval of NHS England, we conducted an observational study in the OpenSAFELY-TPP database, including a) adolescents aged 12-15 years, and b) children aged 5-11 years and comparing individuals receiving i) first vaccination with unvaccinated controls and ii) second vaccination to single-vaccinated controls. We matched vaccinated individuals with controls on age, sex, region, and other important characteristics. Outcomes were positive SARS-CoV-2 test (adolescents only); COVID-19 A&E attendance; COVID-19 hospitalisation; COVID-19 critical care admission; COVID-19 death, with non-COVID-19 death and fractures as negative control outcomes and A&E attendance, unplanned hospitalisation, pericarditis, and myocarditis as safety outcomes.
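The matched-cohort design described above pairs each vaccinated individual with a control sharing the same age, sex, region, and other characteristics. As a minimal illustration only (the names, records, and greedy pairing strategy below are invented for this sketch, not taken from the OpenSAFELY code), 1:1 exact matching on a few variables can be done like this:

```python
from collections import defaultdict

# Hypothetical records: (id, age, sex, region, vaccinated) — illustrative only
people = [
    (1, 13, "F", "London", True),
    (2, 13, "F", "London", False),
    (3, 14, "M", "North", True),
    (4, 14, "M", "North", False),
    (5, 12, "F", "South", True),  # vaccinated, but no matching control exists
]

# Pool unvaccinated controls by the exact-matching key
controls = defaultdict(list)
for pid, age, sex, region, vaccinated in people:
    if not vaccinated:
        controls[(age, sex, region)].append(pid)

# Greedy 1:1 exact matching: each vaccinated person takes one unused control
pairs = []
for pid, age, sex, region, vaccinated in people:
    if vaccinated and controls[(age, sex, region)]:
        pairs.append((pid, controls[(age, sex, region)].pop()))

print(pairs)  # → [(1, 2), (3, 4)]; person 5 stays unmatched
```

Unmatched individuals (like person 5 above) are simply excluded from the comparison, which is one reason matched analyses trade sample size for comparability.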

Results Amongst 820,926 previously unvaccinated adolescents, the incidence rate ratio (IRR) for positive SARS-CoV-2 test comparing vaccination with no vaccination was 0.74 (95% CI 0.72-0.75), although the 20-week risks were similar. The IRRs were 0.60 (0.37-0.97) for COVID-19 A&E attendance, 0.58 (0.38-0.89) for COVID-19 hospitalisation, 0.99 (0.93-1.06) for fractures, 0.89 (0.87-0.91) for A&E attendances and 0.88 (0.81-0.95) for unplanned hospitalisation. Amongst 441,858 adolescents who had received a first vaccination, IRRs comparing second dose with first dose only were 0.67 (0.65-0.69) for positive SARS-CoV-2 test, 1.00 (0.20-4.96) for COVID-19 A&E attendance, 0.60 (0.26-1.37) for COVID-19 hospitalisation, 0.94 (0.84-1.05) for fractures, 0.93 (0.89-0.98) for A&E attendance and 0.99 (0.86-1.13) for unplanned hospitalisation. Amongst 283,422 previously unvaccinated children and 132,462 children who had received a first vaccine dose, COVID-19-related outcomes were too rare to allow IRRs to be estimated precisely. A&E attendance and unplanned hospitalisation were slightly higher after first vaccination (IRRs versus no vaccination 1.05 (1.01-1.10) and 1.10 (0.95-1.26) respectively) but slightly lower after second vaccination (IRRs versus first dose 0.95 (0.86-1.05) and 0.78 (0.56-1.08) respectively). There were no COVID-19-related deaths in any group. Fewer than seven (exact number redacted) COVID-19-related critical care admissions occurred in the adolescent first dose versus unvaccinated cohort. Among both adolescents and children, myocarditis and pericarditis were documented only in the vaccinated groups, with rates of 27 and 10 cases/million after first and second doses respectively.
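The IRRs reported above compare event rates between cohorts. As a rough sketch of the arithmetic (using made-up counts and person-time, not the study's data, and a simple Wald interval rather than the study's actual estimation method), an IRR with a 95% confidence interval can be computed as:

```python
import math

def irr_with_ci(events_a, persontime_a, events_b, persontime_b):
    """Incidence rate ratio (group A vs group B) with a 95% Wald CI
    on the log scale, assuming Poisson-distributed event counts."""
    irr = (events_a / persontime_a) / (events_b / persontime_b)
    # Standard error of log(IRR) for two independent Poisson counts
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    z = 1.96  # ~97.5th percentile of the standard normal
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical counts — chosen for illustration only
irr, lo, hi = irr_with_ci(58, 100_000, 100, 100_000)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Note how sparse events widen the interval: with only a handful of events in either group (as for the COVID-19 A&E and critical-care outcomes here), the 1/events terms dominate the standard error and the CI becomes very wide.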

Conclusion BNT162b2 vaccination in adolescents reduced COVID-19 A&E attendance and hospitalisation, although these outcomes were rare. Protection against positive SARS-CoV-2 tests was transient.

Competing Interest Statement

BG has received research funding from the Laura and John Arnold Foundation, the NHS National Institute for Health Research (NIHR), the NIHR School of Primary Care Research, NHS England, the NIHR Oxford Biomedical Research Centre, the Mohn-Westlake Foundation, NIHR Applied Research Collaboration Oxford and Thames Valley, the Wellcome Trust, the Good Thinking Foundation, Health Data Research UK, the Health Foundation, the World Health Organisation, UKRI MRC, Asthma UK, the British Lung Foundation, and the Longitudinal Health and Wellbeing strand of the National Core Studies programme; he is a Non-Executive Director at NHS Digital; he also receives personal income from speaking and writing for lay audiences on the misuse of science. BMK is also employed by NHS England working on medicines policy and clinical lead for primary care medicines data. IJD has received unrestricted research grants and holds shares in GlaxoSmithKline (GSK).

Funding Statement

The OpenSAFELY Platform is supported by grants from the Wellcome Trust (222097/Z/20/Z); MRC (MR/V015757/1, MC_PC-20059, MR/W016729/1); NIHR (NIHR135559, COV-LT2-0073), and Health Data Research UK (HDRUK2021.000, 2021.0157). In addition, this research used data assets made available as part of the Data and Connectivity National Core Study, led by Health Data Research UK in partnership with the Office for National Statistics and funded by UK Research and Innovation (grant ref MC_PC_20058). BG has also received funding from: the Bennett Foundation, the Wellcome Trust, NIHR Oxford Biomedical Research Centre, NIHR Applied Research Collaboration Oxford and Thames Valley, the Mohn-Westlake Foundation; all Bennett Institute staff are supported by BG's grants on this work. The views expressed are those of the authors and not necessarily those of the NIHR, NHS England, UK Health Security Agency (UKHSA) or the Department of Health and Social Care.

Funders had no role in the study design; the collection, analysis, and interpretation of data; the writing of the report; or the decision to submit the article for publication.

Author Declarations

I confirm all relevant ethical guidelines have been followed, and any necessary IRB and/or ethics committee approvals have been obtained.

The details of the IRB/oversight body that provided approval or exemption for the research described are given below:

This study was approved by the Health Research Authority (REC reference 20/LO/0651) and by the London School of Hygiene and Tropical Medicine Ethics Board (reference 21863).

I confirm that all necessary patient/participant consent has been obtained and the appropriate institutional forms have been archived, and that any patient/participant/sample identifiers included were not known to anyone (e.g., hospital staff, patients or participants themselves) outside the research group so cannot be used to identify individuals.

I understand that all clinical trials and any other prospective interventional studies must be registered with an ICMJE-approved registry, such as ClinicalTrials.gov. I confirm that any such study reported in the manuscript has been registered and the trial registration ID is provided (note: if posting a prospective study registered retrospectively, please provide a statement in the trial ID field explaining why the study was not registered in advance).

I have followed all appropriate research reporting guidelines, such as any relevant EQUATOR Network research reporting checklist(s) and other pertinent material, if applicable.

Data Availability

All data were linked, stored and analysed securely using the OpenSAFELY platform, https://www.opensafely.org/ , as part of the NHS England OpenSAFELY COVID-19 service. Data include pseudonymised data such as coded diagnoses, medications and physiological parameters. No free text data was included. All code is shared openly for review and re-use under MIT open license [ https://github.com/opensafely/vaccine-effectiveness-in-kids ]. Detailed pseudonymised patient data is potentially re-identifiable and therefore not shared. Primary care records managed by the GP software provider, TPP were linked to ONS death data and the Index of Multiple Deprivation through OpenSAFELY.

View the discussion thread.

Supplementary Material

Thank you for your interest in spreading the word about medRxiv.

NOTE: Your email address is requested solely to identify you as the sender of this article.



Subject Area

  • Primary Care Research


COMMENTS

  1. What Is Critical Thinking?

    Critical thinking is the ability to effectively analyze information and form a judgment. To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources. Critical thinking skills help you to: Identify credible sources. Evaluate and respond to arguments.

  2. Critical Thinking

    Critical Thinking. Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms ...

  3. PDF Evaluating Critical Thinking Skills: Two Conceptualizations

    An adapted form of Paul and Elder's (n.d.) list of criteria, including clarity, accuracy, precision, relevance, depth, breadth, and logic, was used as it focuses on specific criteria. A more detailed description of the model used at the beginning of the analysis is shown in Table 1. It has seven criteria.

  4. (PDF) Defining and Teaching Evaluative Thinking ...

    Defining and Teaching Evaluative Thinking: Insights From Research on Critical Thinking. May 2015. American Journal of Evaluation 36 (3) DOI: 10.1177/1098214015581706. Authors: Jane Buckley. Thomas ...

  5. Frontiers

    Enhancing students' critical thinking (CT) skills is an essential goal of higher education. This article presents a systematic approach to conceptualizing and measuring CT. CT generally comprises the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion. We further posit that CT also involves ...

  6. Critical Thinking and Problem-Solving

    What is Critical Thinking? ... inference, and evaluation" (Angelo, 1995, p. 6). "Critical thinking is thinking that assesses itself" (Center for Critical Thinking, 1996b). "Critical thinking is the ability to think about one's thinking in such a way as 1. To recognize its strengths and weaknesses and, as a result, 2. ... Characteristics of ...

  7. Defining and Teaching Evaluative Thinking:

    To that end, we propose that ET is essentially critical thinking applied to contexts of evaluation. We argue that ECB, and the field of evaluation more generally, would benefit from an explicit and transparent appropriation of well-established concepts and teaching strategies derived from the long history of work on critical thinking.

  8. Assessment of Critical Thinking

    8.1 Introduction to the Assessment Concept. The term "to assess" has various meanings, such as to judge, evaluate, estimate, gauge, or determine. Assessment is therefore a diagnostic inventory of certain characteristics of a section of observable reality on the basis of defined criteria. In a pedagogical context, assessments aim to make ...

  9. PDF Chapter 5 The Role of Evidence Evaluation in Critical Thinking

    The Role of Evidence Evaluation in Critical Thinking: Fostering Epistemic Vigilance. Ravit Golan Duncan, Veronica L. Cavera, and Clark A. Chinn. 5.1 Introduction: Promoting Reasoning in Epistemically Unfriendly Contexts. The current times, with a global pandemic, have brought into focus the dangers of ...

  10. Critical Thinking

    Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well. Collecting, analyzing and evaluating information is an important skill in life, and a highly ...

  11. Critical thinking

    Critical thinking is the analysis of available facts, evidence, observations, and arguments in order to form a judgement by the application of rational, skeptical, and unbiased analyses and evaluation. The application of critical thinking includes self-directed, self-disciplined, self-monitored, and self-corrective habits of the mind, thus a critical thinker is a person who practices the ...

  12. Critical Thinking and Evaluating Information

    Critical thinking is logical and reflective thinking focused on deciding what to believe or do. Critical thinking involves questioning and evaluating information. Critical and creative thinking both contribute to our ability to solve problems in a variety of contexts. Evaluating information is a complex, but essential, process.

  13. Critical Thinking

    Critical thinking refers to the process of actively analyzing, assessing, synthesizing, evaluating and reflecting on information gathered from observation, experience, or communication. It is thinking in a clear, logical, reasoned, and reflective manner to solve problems or make decisions. Basically, critical thinking is taking a hard look at ...

  14. Assessment of Critical Thinking

    2.1 Observing Learners in the Process of Critical Thinking. The desire for empirical assessment of competence in CT has spawned a variety of different lines of argument and assessment procedures based on them, depending on intent, tradition, and associated conceptual understanding (Jahn, 2012a). Depending on what is understood by CT and what function the assessment is supposed to have, there ...

  15. What is critical thinking?

    Critical thinking is a kind of thinking in which you question, analyse, interpret, evaluate and make a judgement about what you read, hear, say, or write. The term critical comes from the Greek word kritikos meaning "able to judge or discern". Good critical thinking is about making reliable judgements based on reliable information.

  16. What is Critical Thinking?

    Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. Paul and Scriven go on to suggest that ...

  17. Evaluative thinking

    A disciplined approach. Evaluative thinking is a disciplined approach to inquiry and reflective practice that helps us make sound judgements using good evidence, as a matter of habit. Evaluating a strategic direction or project in a school draws on similar thinking processes and mental disciplines as assessing student performance or recruiting ...

  18. 9 characteristics of critical thinking

    These essential characteristics of critical thinking can be used as a toolkit for applying specific thinking processes to any given situation. 1. Curious. Curiosity is one of the most significant characteristics of critical thinking. Research has shown that a state of curiosity drives us to continually seek new information. This inquisitiveness ...

  19. Clinical Reasoning, Decisionmaking, and Action: Thinking Critically and

    The American Philosophical Association (APA) defined critical thinking as purposeful, self-regulatory judgment that uses cognitive tools such as interpretation, analysis, evaluation, inference, and explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations on which judgment is based. 2 A more ...

  20. Using Critical Thinking in Essays and other Assignments

    Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement. It requires an active and skillful approach: the evaluation, assessment, and synthesis of information obtained from observation, knowledge, reflection, acumen, or conversation, as a guide to belief and action ...

  21. What Are Critical Thinking Skills and Why Are They Important?

    It makes you a well-rounded individual, one who has looked at all of their options and possible solutions before making a choice. According to the University of the People in California, having critical thinking skills is important because they are [ 1 ]: Universal. Crucial for the economy. Essential for improving language and presentation skills.

  22. Key elements of Critical Thinking

    Problem-solving: Applying critical thinking skills to identify and solve problems effectively. Scepticism: Questioning assumptions, biases, and preconceptions, and being open to alternative viewpoints and perspectives. Decision-making: Using critical thinking to make informed decisions based on a careful evaluation of the available information.

  23. The impact of critical policy evaluation professional development on

    As equity gaps persist in our public education systems, there is a need for a professional development training and policy evaluation tool that: (1) Expands policy practitioners' evaluative thinking skills and policy evaluation best practices. (2) Builds on the tenets of evaluation culture to grow evaluation capacity.


  25. OpenSAFELY: Effectiveness of COVID-19 vaccination in children and

    Methods With the approval of NHS England, we conducted an observational study in the OpenSAFELY-TPP database, including a) adolescents aged 12-15 years, and b) children aged 5-11 years and comparing individuals receiving i) first vaccination with unvaccinated controls and ii) second vaccination to single-vaccinated controls. We matched vaccinated individuals with controls on age, sex, region ...