
CAT Virtual Training Workshop

The Center for Assessment & Improvement of Learning (CAIL) and the Center for Innovation in Teaching & Learning (CITL) are excited to offer a critical-thinking and real-world problem-solving workshop at Tennessee Tech University on July 22nd and 23rd to kick off our Authentic Assessment Academic Learning Community (ALC). This virtual workshop, held in two half-day sessions, is built around the Critical-thinking Assessment Test (CAT). The CAT was developed at TNTech with support from the National Science Foundation and is used across the country to support institutional assessment and academic research. During the first half-day session, we will introduce the CAT, examine the test, and see how our TNTech students respond to critical-thinking questions. During the second half-day session, we will introduce the critical-thinking frameworks used in the CAT, practice developing interdisciplinary critical-thinking activities, and brainstorm and develop content-specific critical-thinking activities (CAT Apps) for your courses.


Institutional Assessment

Critical Thinking

The California Critical Thinking Skills Test (CCTST) is used as the current senior exit exam. The test objectively and reliably measures core reasoning skills for reflective decision making, including analysis, inference, evaluation, induction, deduction, interpretation, explanation, and numeracy. It is administered in the fall and spring semesters of each academic year at the college level, and the sample includes all graduating senior cohorts.

  • Considerations when Interpreting CCTST Scores
  • 2023-24 CCTST
  • 2022-23 CCTST
  • 2021-22 CCTST
  • 2020-21 CCTST
  • 2019-20 CCTST
  • 2018-19 CCTST
  • 2017-18 CCTST
  • 2016-17 CCTST
  • 2015-16 CCTST
  • 2014-15 CCTST
  • 2018-2023 CCTST by College
  • 2018-2023 CCTST by Major

J Microbiol Biol Educ. 2015 Dec; 16(2)

Targeting Critical Thinking Skills in a First-Year Undergraduate Research Course †

Associated data

Appendix 2: Student responses to “Please describe the single most important assignment, strategy, or other course aspect that made a meaningful difference to your higher-order thinking abilities”

TH!NK is a new initiative at NC State University focused on enhancing students’ higher-order cognitive skills. As part of this initiative, I explicitly emphasized critical and creative thinking in an existing bacteriophage discovery first-year research course. In addition to the typical activities associated with undergraduate research, such as review of primary literature and writing research papers, another strategy employed to enhance students’ critical thinking skills was the use of discipline-specific, real-world scenarios. This paper outlines a general “formula” for writing scenarios, as well as several specific scenarios created for the described course. I also present how embedding aspects of the scenarios in reviews of the primary literature enriched the activity. I assessed student gains in critical thinking skills using a pre-/posttest model of the Critical Thinking Assessment Test (CAT), developed by Tennessee Technological University. I observed a positive gain trend in most of the individual skills assessed by the CAT, with a statistically significant, large overall effect on critical thinking skills in the test group. I also show that research papers written by students who participated in the scenarios demonstrated a higher level of critical thinking skills than papers written by similar students who did not participate in the scenario activities. The scenario strategy described here can be modified for use in biology and other STEM disciplines, as well as in diverse disciplines in the social sciences and humanities.

INTRODUCTION

Robert Ennis (4) defines critical thinking as reflective and reasonable thinking that is focused on deciding what to believe or do. He argues that critical thinking is a practical activity that includes creative actions such as raising questions, formulating hypotheses, generating alternatives, and making plans about gathering information. Unfortunately, there is ample evidence that many adults, including college students, consistently fall prey to flawed and biased thinking (7, 10, 9). The good news is that critical and creative thinking skills are not limited to the few geniuses who are “born with it”; these higher-order thinking skills can be developed through practice, feedback, and reflection (11, 12).

The American Association for the Advancement of Science (AAAS) and the National Science Foundation (NSF) have issued a call to action in Vision and Change in Undergraduate Biology Education (1), recommending that all institutions of higher education focus on five core concepts for biological literacy and six core competencies. Interestingly, the definition of the first core competency, the “ability to apply the process of science,” reflects the creative process in a vast array of disciplines. Much of the description, while written with a biological slant, is not unique to the biological sciences, nor even to the natural sciences. Creative work in most disciplines requires the “creator” to raise vital questions, to gather relevant information, to generate multiple ideas (multiple possible hypotheses, multiple solutions, multiple interpretations, etc.), to interpret and evaluate information/data, and to draw appropriate conclusions.

Vision and Change recommends giving all students the opportunity to develop these competencies through authentic research experiences. Certainly, it would be ideal to provide every student with an authentic research experience, but there are real barriers to doing so. At many institutions, it is not feasible to provide every student majoring in the life sciences with an undergraduate research experience, let alone all of the non-major students who will be faced with biology-related decisions, such as whether to vaccinate their children, throughout their lives. Institutions of higher education should be striving to build inquiry into general biology lab courses, but not every school requires a lab for non-majors. Furthermore, even laboratory activities that attempt to be inquiry-based do not always emphasize the higher-order thinking competencies, especially when large numbers of students must be accommodated. While we should promote authentic research for as many students as we can, we must examine ways to enhance the thinking skills developed during the experience, as well as develop strategies to enhance critical thinking across the curriculum.

An observation I have made in student research papers and lab reports is that although many of my students reach feasible conclusions, they almost always provide only one interpretation of their data and draw a single conclusion from it, even when there are alternative explanations to account for the data. On a related note, before embarking on this study, I had never witnessed a first-year student in my class independently question an interpretation of data or conclusion reached in a scientific journal article, even when I purposefully selected a paper with an overblown conclusion. These skills are critical for scientists and, in fact, critical for anybody who ever needs to make a decision based on evidence. Because of the lack of demonstrated use of these skills in my own students, I set out to help them engage in the habits of mind that would lead them to think more critically. One way I did this was through the use of targeted discipline-specific scenarios.

The use of critical thinking scenarios is flexible both in terms of disciplinary content and in terms of course structure and class size. I utilized the strategy described in this paper in a first-year undergraduate research course focused on bacteriophage biology. However, the strategy could easily be implemented in a lecture course, a seminar course, or any other course structure or academic club. I used the activity as a springboard for discussion, but it could be used as a graded activity. Depending on the complexity of the scenario, this strategy could also be used at virtually any academic level.

Student learning outcomes

The course had many desired outcomes. The student learning outcomes described and assessed in this paper are:

  A. Students will recognize inappropriate inferences and will question overstated conclusions in the works of others.
  B. Students will avoid making inappropriate inferences from data.
  C. Students will provide multiple explanations/interpretations of data.
  D. Students will determine ways to assess the best explanation or interpretation of data.

Use of human subjects

This study was reviewed and granted exempt status by the North Carolina State University Institutional Review Board (IRB).

Student demographics

The test group students were enrolled in the SEA-PHAGES (8) first-year research course in fall 2014. This test group participated in critical thinking scenarios, two journal article discussions in which elements of the scenarios were embedded in the discussion prompts, and a final research paper; the test group was also assessed pre and post with the non-discipline-specific Critical Thinking Assessment Test (CAT). The control group students were enrolled in the SEA-PHAGES first-year research course in fall 2013. This control group participated in two journal article discussions in which elements of the scenarios were not embedded in the discussion prompts, and a final research paper; the control group was not assessed with the CAT.

The test group was composed of fourteen first-year, first-semester undergraduate students: five females and nine males. Eleven of the fourteen students in the course participated in the University Honors Program (UHP); on average, UHP students enter the university with a higher degree of academic achievement than our typical first-year student and have demonstrated intellectual curiosity in essays during the selection process. The minimum high school grade-point average (GPA) of entering UHP students is 3.75 unweighted or 4.5 weighted; the minimum Scholastic Aptitude Test (SAT) score is 1,300 (critical reading and critical math only), and the minimum ACT composite score is 30. For the fall 2014 incoming UHP class, the average unweighted GPA was 3.80 and the average SAT score (critical math/reading) was 1,376.

Control group

The control group was also composed of fourteen first-semester, first-year students enrolled in the same course the previous year, twelve females and two males. In this cohort, nine out of fourteen students participated in the UHP. The final research papers of this group were compared with those of the test group.

Critical thinking scenarios

Implementation.

Over the course of the semester, students in the test group engaged in five critical thinking scenarios. Some scenarios were written before the start of the course and planned in the syllabus, and some were created spontaneously during class to guide students when they were not fully thinking through experimental plans or data interpretation.

I organized the class activity similarly to a classic “think-pair-share.” In each case, the instructor provided the scenario, and students were given 10 to 15 minutes in class to write their response independently. Students then formed small groups of three to four and discussed their responses. The class then came back together to share their responses, and the instructor provided feedback to the class as a whole. Students were then permitted to flip their papers over and add to their response on the reverse side. Papers were not graded but were collected to ensure thoughtful participation.

Although we did the activity fully within the class period, it could be modified such that students completed the individual write-up, or even the paired/small group discussion, outside of class time. The important feature is that the open discussion and feedback from the faculty member occur in class. These modifications would allow the activity to be used in courses with higher enrollment or where more content had to be covered during classroom time.

The structure I used for the in-class scenarios is outlined below.

Lay out scenario, provide image, or present data.

  • Q1. “What does the author want you to infer?” or “Does the evidence strongly support the conclusion?”
  • Q2. “Provide alternative explanations.”
  • Q3. “What additional information would you need to draw a conclusion?”
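Because the scenario "formula" is just a fixed setup followed by the same three prompts, it can be captured as a small reusable template. The sketch below is illustrative only; the class, field names, and example text are hypothetical, not artifacts from the course.

```python
# Illustrative sketch: encoding the three-question scenario "formula" above
# as a reusable template. All names and example text here are hypothetical.
from dataclasses import dataclass

PROMPTS = [
    "Q1. What does the author want you to infer? / "
    "Does the evidence strongly support the conclusion?",
    "Q2. Provide alternative explanations.",
    "Q3. What additional information would you need to draw a conclusion?",
]

@dataclass
class Scenario:
    title: str
    setup: str  # scenario text, image description, or data summary

    def prompt_sheet(self) -> str:
        """Render the scenario setup followed by the three standard prompts."""
        return "\n".join([self.title, self.setup, *PROMPTS])

# Hypothetical usage, echoing the MMR/autism scenario described below.
print(Scenario(
    title="Scenario 1: MMR vaccine and autism",
    setup="A chart shows autism diagnoses rising after the MMR vaccine's introduction.",
).prompt_sheet())
```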

I based scenarios on overstated conclusions I found in primary literature related to the course content, current events relevant to the course or to student life, or potential outcomes of experiments that the students designed.

The first scenario used in class, with aggregated student responses, is outlined in Figure 1 (16) and the text below. Additional scenarios used, without student responses, are provided in Appendix 1.

Figure 1. Data for the MMR/autism critical thinking scenario. MMR = measles, mumps, rubella.

Scenario 1: MMR (measles, mumps, rubella) vaccine and autism

Student response: The author wants me to infer that the MMR vaccine causes autism.

Student response: The increase in autism shows only occurrence, not incidences per X number in the population. Perhaps the increase in cases is explainable by population growth.

Student response: Could there have been gradual changes in the frequency with which autism was diagnosed or reported?

Student response: If MMR was given to almost all children by some point in time, shouldn’t new cases of autism have leveled off once that had occurred?

Student response: Even if neither population growth nor diagnosis/reporting can account for the increase in the incidence of autism, introduction of the MMR vaccine is definitely not the only change that occurred in our society during that time frame. The possibilities to investigate are endless. Was there a gradual increase in paternal age? Was there an increase in the use of anti-flammable chemicals in baby and toddler pajamas? Did pregnant mothers gradually eat less fish? Was the use of a pesticide in apples increased over time? Was there a new type of material used in food packaging? The list goes on and on.

Student response: Research the population growth and overlay a graph for the years represented in the graph. Create a chart that instead showed number of cases per 1,000 people.

Student response: Research whether there were any changes in the definition or diagnoses of autism during the given time period.

Student response: Even if a correlation can be made, causation is not proven. Experimental evidence in laboratory animals, or something like that, would need to be investigated to show causation.
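The normalization proposed in the responses above (cases per 1,000 people rather than raw counts) is simple arithmetic. The sketch below uses invented numbers, not data from Figure 1, purely to show how population growth alone can produce a rising raw count while the per-capita rate stays flat.

```python
# Invented numbers for illustration only; not data from Figure 1.
cases_year1, population_year1 = 5_000, 2_500_000
cases_year2, population_year2 = 8_000, 4_000_000

rate_year1 = cases_year1 / population_year1 * 1_000  # 2.0 cases per 1,000
rate_year2 = cases_year2 / population_year2 * 1_000  # 2.0 cases per 1,000

# Raw counts rose 60%, yet the per-capita rate is unchanged, so population
# growth alone would account for the apparent increase.
print(rate_year1, rate_year2)
```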

Primary literature reviews

We previously described a method to systematically introduce primary scientific literature to first-year undergraduate students (3). In that paper, we provided a template for a written summary of journal articles that students brought to class to facilitate their active participation in classroom discussion. In the present work, I followed a similar strategy for introducing students to primary literature, but I revised the instructions to explicitly place additional emphasis on critical thinking goals. The additional prompts that students responded to were:

  • Did the conclusions follow logically from the data? Provide an example.
  • Did the authors consider alternate conclusions of the data? Provide an example.
  • Are there any interpretations that you thought about that the author did not consider?
  • Given the findings and approach taken in this work, does it lead you to any questions you would like to address with respect to our research project? List as many research questions as you can think of.
  • Select one research question and provide a possible methodology to pursue (briefly).
  • Explain why you selected the above research question from among your alternatives.

In essence, the additional prompts in the journal article write-ups mimic the structure of the critical thinking scenarios and provide further opportunities for students to practice and receive feedback in the desired student learning outcomes that this study focuses on.

Students in the control group received the instructions described in Carson and Miller (3) without the additional prompts above.

Research papers

All students (test and control groups) wrote a final research paper in the format of a scientific journal article. The discussion section of the research papers provided an opportunity for students to demonstrate student learning outcomes B, C, and D in a real-world situation.

We included three measures of student learning, two of which are direct assessments of student learning. The Critical Thinking Assessment Test (CAT) is useful in that it allows for a pre/post assessment of higher-order thinking gains within the semester. Assessment at the start of a course (pre) can be difficult to implement in a discipline-specific way because, at the beginning of the semester, a student does not always have the knowledge to achieve higher-order thinking in the discipline. We complemented the use of the CAT with a discipline-specific activity that requires higher-order thinking at the end of the semester, scored using a rubric. Another feature of the CAT is that it can show whether skills transferred beyond disciplinary content.

Critical thinking assessment test

The Critical Thinking Assessment Test (CAT) was developed at Tennessee Technological University with funding from the National Science Foundation (13). The problem scenarios used in the CAT instrument are very general and do not require specialized discipline-specific knowledge. Therefore, using the CAT allows for assessment of higher-order thinking without being confounded by the presence or lack of disciplinary knowledge.

The CAT instrument has been assessed for validity and reliability (15). It contains 15 open-ended, non-discipline-specific questions that assess higher-order thinking skills in the realms of evaluating and interpreting information, problem solving, creative thinking, and effective communication. The specific skill measured by each question is included in the results section in Table 1.

Table 1. Mean scores on the pre/post Critical Thinking Assessment Test (CAT), N = 14.

The CAT was employed to directly measure gains in higher-order thinking skills in a pre/post assessment approach where students took the exam in both the first week and the last week of the course. Both pre and posttests were coded and scored after the end of the semester by faculty at Tennessee Tech in order to minimize any unconscious biases. This assessment was used for the test group only.

Common rubric

I utilized a rubric to assess mastery of critical thinking skills in the final research paper for both the test group and control group. Students were not provided this rubric, although the instructions to students indicated that the discussion section must provide complete and thoughtful interpretation of data. The two rubric items relevant to this study were:

Question A: Does the student generate multiple potential ideas about the issue at hand?

  • Explores only a singular idea
  • Uses a few elements but with limited exploration
  • Experiments with multiple elements and variables but shows difficulty in addressing their appropriateness
  • Experiments with multiple elements and variables and successfully chooses an appropriate idea

Question B: How does the student make judgments about benefits and drawbacks of various ideas about the issue at hand?

  • Shows limited or no awareness of the benefits and drawbacks of ideas or defends them with unrelated criteria
  • Recognizes relevant benefits and drawbacks of ideas
  • Weighs the value of relevant benefits and drawbacks of ideas
  • Weighs the value of relevant benefits and drawbacks and selects appropriate ideas

Student reflections

In the last week of the semester, students in the test group responded to the following question prompt: Please describe the single most important assignment, strategy, or other course aspect that made a meaningful difference to your higher-order thinking abilities.

Direct assessment of critical thinking skills gains in test group

I directly measured gains in students’ critical thinking skills using the CAT in a pre/post testing model. Table 1 indicates the skill assessed by each CAT question, the maximum point value of each question, the mean score for each question on the pre and post assessments, the probability of difference, and the effect size. Because questions have different maximum point values, results are also displayed as a percentage of total point value in Figure 2.
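The paper does not name the statistical test behind the "probability of difference" column, so the following is only a plausible sketch of a per-question pre/post comparison, using made-up scores for one hypothetical 4-point CAT item; a paired t-test is one reasonable choice for matched pre/post scores from the same students.

```python
# Sketch with made-up scores: paired pre/post comparison for one CAT item.
# The actual test used for Table 1 is not specified in the paper.
import numpy as np
from scipy import stats

MAX_POINTS = 4  # hypothetical maximum point value for this item
pre  = np.array([2, 3, 1, 2, 3, 2, 1, 2, 3, 2, 2, 1, 3, 2])  # N = 14, invented
post = np.array([3, 3, 2, 3, 4, 2, 2, 3, 3, 3, 2, 2, 4, 3])  # N = 14, invented

t_stat, p_value = stats.ttest_rel(post, pre)  # same students, pre vs. post
print(f"pre  {pre.mean() / MAX_POINTS:.0%} of max points")
print(f"post {post.mean() / MAX_POINTS:.0%} of max points, p = {p_value:.3f}")
```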

Figure 2. Critical Thinking Assessment Test pre and post scores by question, as percentages of total points. N = 14.

I observed gains in the mean scores of 12 out of 15 items on the CAT from the beginning of the semester to the end of the semester. Even with the small number of students in the course (14), there was a statistically significant large effect on the skills measured by the CAT overall, and statistically significant large effects on the specific items of “summarizing a pattern of results without making inappropriate inferences” and “providing alternative explanations for spurious associations.”

Notably, students in this study performed well on many of the questions on the pretest. The average student score was greater than 85% on four of the questions and greater than 70% on seven of the questions in the pretest. Because of this, a ceiling effect may have occurred: in several areas, no statistically significant gain in skills could be measured because students already performed at the highest level measurable by the CAT instrument at the start of the semester.

Admittedly, student reflections do not directly measure ability, and individuals do not always reliably identify which experiences truly enhanced learning. However, gathering student reflections served two valuable purposes. First, it encouraged students to engage in metacognitive behavior, which was promoted over the course of the semester in various ways. Second, together with the direct measures of ability, student responses shed light on their perception of what worked in the classroom. In the last week of the semester, students were asked to respond to the following question prompt: Please describe the single most important assignment, strategy, or other course aspect that made a meaningful difference to your higher-order thinking abilities. Complete and unmodified student responses are displayed in Appendix 2.

Two students specifically identified the critical thinking scenarios as the most impactful activity that enhanced their higher-order thinking skills, four students identified the primary literature reviews, and three students identified the research papers. Five students identified other aspects of the course, such as experimental design, keeping of lab notebooks, or general reference to class discussion. There was considerable overlap among the specified activities. For example, critical thinking scenarios were utilized in guiding students through the experimental design process as described in the “Recombinant Bacteriophage Lysin Scenarios,” and scenarios were also embedded within the journal article write-ups and discussions. Based on the quality of the responses, it seems likely that students considered their answers thoughtfully.

Comparison with other high-impact learning interventions

Because we know that high-impact practices associated with undergraduate research are likely to affect higher-order thinking skills, we wanted to see whether the use of scenarios increased the effect. We compared our observed gains with other high-impact learning interventions such as the CREATE (Consider, Read, Elucidate hypothesis, Analyze and interpret data, Think of the next Experiment) strategy (6) and another undergraduate research initiative that focused on critical thinking (5). The overall pre and post scores for this study and these two other high-impact courses in the biological sciences that also assessed critical thinking using the CAT are compared in Figure 3. The overall effect sizes (mean difference divided by pooled group standard deviation) of each study are as follows: this study (+0.67 = large effect), Gasper and Gardner (+0.3 = small/moderate effect) (personal communication, S. Gardner, 26 February 2015), and Gottesman and Hoskins (+0.97 = large effect) (6). Effect sizes were categorized as follows: 0.1 to 0.3 = small effect; 0.3 to 0.5 = moderate effect; greater than 0.5 = large effect. The CREATE project had a very large effect, but based on pretesting, it is clear that the students in that initiative had a much lower baseline of critical thinking skills than the students in the initiative described in the present paper. In the present study, ceiling effects occurred on several of the questions where students scored very highly on average in the baseline testing, leaving no opportunity to measure improvement. Comparison with the Gasper and Gardner authentic research experience is therefore more meaningful, since students in both studies participated in an authentic research experience and the baseline (pretest) critical thinking abilities of the two groups of students are comparable. Overall, our students showed a larger gain in critical thinking skills than the students in the Gasper and Gardner study. Because of this difference, I hypothesize that when discipline-specific scenarios are included in an authentic research experience, students’ critical thinking skills are enhanced even further.
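The effect-size definition given above (mean difference divided by the pooled group standard deviation, i.e., Cohen's d) and the quoted category cutoffs can be written out directly. This is a minimal sketch under the assumption of two equal-sized score arrays; the scores themselves are invented.

```python
# Minimal sketch of the effect-size definition used in the text:
# mean difference divided by pooled group standard deviation (Cohen's d).
import numpy as np

def cohens_d(post: np.ndarray, pre: np.ndarray) -> float:
    # Pooled SD for two equal-sized groups: square root of the average variance.
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
    return (post.mean() - pre.mean()) / pooled_sd

def effect_label(d: float) -> str:
    # Cutoffs quoted in the text: 0.1-0.3 small, 0.3-0.5 moderate, >0.5 large.
    if d > 0.5:
        return "large"
    if d > 0.3:
        return "moderate"
    return "small"

# Invented scores, purely to exercise the functions.
pre  = np.array([2.0, 3.0, 1.0, 2.0, 3.0, 2.0, 1.0])
post = np.array([3.0, 3.0, 2.0, 3.0, 4.0, 2.0, 2.0])
d = cohens_d(post, pre)
print(f"d = {d:.2f} ({effect_label(d)} effect)")
```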

Figure 3. Comparison of gains in critical thinking skills between this course and other high-impact courses.

Comparison of student research papers between the test group and the control group

In both fall 2013 and fall 2014, students in the SEA-PHAGES first-year research course wrote comprehensive research papers that were in the standard format of a scientific journal article in life science disciplines. Students were instructed that the discussion section must provide complete and thoughtful interpretation of data but were not provided with the critical thinking rubric utilized for this study.

The skills relevant to this study that were assessed by the rubric, with further explanations of the questions, are set out below. The ratings for each question are described in the materials and methods. Although this is a Likert-like scale, I utilized specific descriptions of the skill level in order to avoid subjective scaling.

  • Question A: Does the student generate multiple potential ideas about the issue at hand? This was assessed by whether the student explored more than one possible explanation or interpretation of data before drawing a conclusion, where appropriate.
  • Question B: How does the student make judgments about benefits and drawbacks of various ideas about the issue at hand? This was assessed by whether the student systematically and logically weighed the benefits and drawbacks of explanations/interpretations and reached an appropriate conclusion or suggested information that would be needed in order to reach a final conclusion. Note that if a student scored at the lowest skill level on Question A, s/he could not score above the lowest level on this question.

The test group students were strikingly more likely than the control group to explore more than one potential explanation or interpretation of data (Fig. 4). The control group students generally explored only a singular idea or used a few elements with limited exploration, while the test group students, who participated in the critical thinking scenarios, generally explored multiple explanations/interpretations of data and recognized relevant benefits and drawbacks of ideas, thus demonstrating student learning outcomes C and D.

Figure 4. Mean rubric scores with standard error, control group versus test group. N = 14 for both groups; p = 0.0064 and p = 0.016 (two-tailed) for the two rubric questions, respectively.
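As a sketch of the between-group comparison behind Figure 4: each paper receives a rubric level (encoded here as 1 to 4), and the test and control groups are compared with a two-tailed two-sample test. The paper reports two-tailed p-values but does not name the test, so the choice of Welch's t-test below is an assumption, and the rubric levels are invented.

```python
# Sketch with invented rubric levels (1-4) illustrating the between-group
# comparison behind Figure 4. The actual test used is not named in the paper.
import numpy as np
from scipy import stats

control = np.array([1, 1, 2, 1, 2, 1, 1, 2, 1, 1, 2, 1, 1, 2])  # N = 14, invented
test    = np.array([3, 2, 3, 4, 3, 2, 3, 3, 4, 2, 3, 3, 2, 3])  # N = 14, invented

# Welch's two-sample t-test; scipy returns a two-tailed p-value by default.
t_stat, p_value = stats.ttest_ind(test, control, equal_var=False)
print(f"control mean {control.mean():.2f}, test mean {test.mean():.2f}, "
      f"p = {p_value:.4f}")
```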

Generally, few gains in higher-order thinking skills are achieved in courses in the absence of a specific, targeted intervention; in the majority of college courses, there is no change from pre to post on the Critical Thinking Assessment Test (13). In this study, a statistically significant large effect on students’ overall critical thinking skills was measured by the CAT in a single semester. It is likely that high-impact activities, including participating in and discussing multiple critical thinking scenarios across the semester together with analyzing and discussing primary literature, were the key factors leading to the increase. Research papers also provided opportunities for students to practice and receive feedback on these skills.

In the introduction to this paper, I outlined the four student learning outcomes that I wanted students to achieve through the use of the critical thinking scenarios. Questions 1 to 9 in the CAT directly measure the skills that students practiced in the critical thinking scenarios, and Questions 5 and 6 directly measure skills practiced in the literature write-ups. Table 2 maps the student learning outcomes to the assessment questions on the CAT.

Table 2. Student learning outcomes mapped to CAT items 1 to 9. Bold question numbers (5 and 7) indicate items with no observed increase.

The skills described in learning outcomes A and B are overlapping and are assessed by four questions on the CAT. Students demonstrated a statistically significant large gain on the specific skill of “summarizing a pattern of results without making inappropriate inferences” (Q1), and though not statistically significant, I observed increasing trends in the closely related skills of “evaluating how strongly correlational-type data supports a hypothesis” (Q2) and “determining whether an invited inference is supported by specific information” (Q8). I observed no change in “evaluating whether spurious information strongly supports a hypothesis” (Q5), likely due to a ceiling effect, since students performed at a very high level on the pretest. I believe these gains were due to the practice and feedback students received through the scenarios and the literature write-ups and discussions. Perhaps also impactful was the fact that these activities gave students permission to question authority. As one student explained: “The article reviews also encouraged me to critically evaluate another person’s thinking. I realized the blind trust I gave to authors, and how little I stopped to question their methods and logic” (Appendix 2).

Students also met learning outcome C. Students demonstrated a statistically significant large gain in “providing alternative explanations for spurious associations” (Q6). They also displayed a positive trend in “providing alternative explanations for a pattern of results that has many possible causes” (Q3) and “providing relevant alternative interpretations for a specific set of results” (Q9). Every scenario asked students to provide multiple possible explanations for data or evidence, and students identified where authors gave multiple possible interpretations when discussing primary literature. One student stated, “The Critical Thinking Experiment Scenarios made me consider alternate explanations for patterns, trends, and experimental results. I had to think about how to explain certain observations and how to make sure my assumptions and conclusions were right. This helped me when I was writing my midterm and final lab reports” (Appendix 2).

Learning outcome D was the most difficult for students to achieve. Disappointingly, but not entirely surprisingly, there was no improvement on Question 7, “identify additional information needed to evaluate a hypothesis.” While students practiced this skill in scenarios throughout the semester, they still made the error of leaving out an important experimental control toward the end of the semester, and only recognized it after the instructor created a specific scenario (see the recombinant bacteriophage lysin scenarios in the results section) and led significant classroom discussion. Question 4 assesses the same skill, and students scored comparatively low on that item as well, although there was a small apparent gain. Overall, this was the least developed skill at the beginning of the semester, and it remained a challenge for students even though they had as many opportunities to practice it in the class as the other skills. Not surprisingly, this skill falls at the highest level of Bloom’s taxonomy: create (2). This outcome points to an increased need to provide students, both in college and in primary and secondary school, with more practice and feedback in creative thinking skills.

The CAT data demonstrated that students who participated in the scenario activities made large, significant gains that were transferable to contexts outside the realm of biology. Admittedly, analysis of primary literature and feedback on research papers also likely contributed to the gains. However, in previous semesters, prior to implementation of the scenarios, students were far less likely to question authors’ claims in the primary literature. Students who participated in the scenarios were also far more likely than the control group to give multiple explanations for results in research papers, indicating critical thinking gains in a domain-specific manner. Additionally, the difference in critical thinking gains of students in this study compared with the Gasper and Gardner study (5) points to the use of real-world scenarios as adding significant value to an authentic research experience.

One concern regarding this work is the small number of students involved in the study. Unfortunately, it is not possible to increase the number of students in this course due to a change in teaching assignment. While it would be ideal to gather data on additional students, the large, statistically significant effect sizes on some items of the CAT, as well as on the rubric, suggest substantial gains. There were also increasing trends on many other CAT items for which we could not show statistical significance. It is possible that with a larger number of subjects these gains would have been shown to be significant; the small N may be underestimating the effects of the strategies used in the course.

Since the use of scenarios is more readily transferable to different course structures than either discussion of primary literature or multiple research papers, I would like to determine how much of an effect the scenarios have independently. Future plans include engaging faculty in diverse courses (e.g., lab, no lab, multiple disciplines, different course enrollment sizes, different student demographics) in developing and implementing discipline-specific scenarios to determine how well this strategy works in various types of courses and in isolation from the primary literature reviews, experimental design activities, and written lab reports used in this course. This follow-up study will include a much larger number of students. As an institution, we are also investigating means, including the use of a common rubric, to provide students with more opportunities to practice and receive feedback on their creative thinking skills.

SUPPLEMENTAL MATERIALS

Appendix 1: Additional critical thinking scenarios used during the course

ACKNOWLEDGMENTS

Bayer CropScience Bee Care Center provided financial support for this course. Tennessee Technological University provided a training workshop for designing discipline-specific, real-world scenarios (https://www.tntech.edu/cat/training/). A potential conflict of interest associated with this paper is that the author, Dr. Susan Carson, was the developer and instructor of the course. However, scoring of the Critical Thinking Assessment Test, the instrument used for direct measurement of student learning outcomes, was done at Tennessee Technological University by faculty who did not know which tests were pretests vs. posttests.

† Supplemental materials available at http://jmbe.asm.org

Center for Assessment & Improvement of Learning

  • Reports & Publications
  • Technical Information
  • Machine Scoring of Student Responses on the CAT
  • Assessing Critical Thinking Skills in STEM and Beyond (In M. Iskander (ed.), Innovations in E-learning, Instruction Technology, Assessment, and Engineering Education, 79-82. 2007, Springer)
  • Project CAT: Assessing Critical Thinking Skills (In D. Deeds & B. Callen (eds.), Proceedings of the National STEM Assessment Conference. NSF, 2007)
  • NSF Final Report (Project CAT: Assessing Critical Thinking Skills)
  • CAT National Dissemination: Assessment and Improvement of Learning (In Inventions and Impact 2: Building Excellence in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education. 2008, NSF/CCLI & AAAS.)
  • Faculty Driven Assessment of Critical Thinking (In Technological Developments in Networking, Education and Automation, 2010, Springer. Proceedings of the 2009 International Joint Conferences on Computer, Information, and System Sciences, and Engineering).
  • Engaging Faculty in the Assessment and Improvement of Students' Critical Thinking Using the Critical Thinking Assessment Test (In Change: The Magazine of Higher Learning, 43:2, 44-49, 2011).
  • Getting Faculty Involved in Assessing and Improving Students' Critical Thinking (In A Collection of Papers on Self-Study and Institutional Improvement, 2011, Higher Learning Commission - North Central Association).
  • Identifying courses that improve students' critical thinking skills using the CAT instrument: A case study (In Proceedings of the 10th Annual International Joint Conferences in Computer, Information, System Sciences, & Engineering)
  • Changing How Faculty Assess Student Learning: Using the CAT as a Model for Designing Course Assessments . (In INQUIRY: Critical Thinking Across the Disciplines, 30 (3), 38-48, 2015)
  • National Dissemination of the CAT Instrument: Lessons Learned and Implications (In Proceedings of the AAAS/NSF Envisioning the Future of Undergraduate STEM Education: Research and Practice Symposium, 2016.)
  • Moving Beyond Assessment to Improving Students' Critical Thinking Skill: A Model for Implementing Change (In Journal of the Scholarship of Teaching and Learning, 16(4), 44-61, 2016)

Harris, Kevin R. Assessing Course Impacts on Critical Thinking: The Relationship between Self-Reported Learning Gains and Performance , Tennessee Technological University, Ann Arbor, 2015. ProQuest, https://search.proquest.com/docview/1691127261 .

Lisic, Elizabeth S. Creating Change: Implementing the Critical Thinking Assessment Test (CAT) as Faculty Development to Improve Instruction , Tennessee Technological University, Ann Arbor, 2015. ProQuest, https://search.proquest.com/docview/1728162696 . 

Presentations

  • 2019 SACS/COC Annual Meeting
  • 2018 AACU General Education and Assessment: Foundations for Democracy
  • 2018 WASC Academic Resource Conference
  • 2017 SACS/COC Annual Meeting
  • 2016 Drexel Conference on Teaching & Learning
  • 2015 National Academy of Science Conference on Assessing Hard-to-Measure Cognitive, Intrapersonal and Interpersonal Competencies
  • 2015 ISSOTL Conference
  • 2015 Drexel University Annual Conference on Teaching & Learning Assessment
  • 2015 Texas A&M Assessment Conference Concurrent Session
  • 2015 Texas A&M Assessment Conference Poster Session
  • 2014 SACS/COC Annual Meeting
  • 2014 WASC Academic Resource Conference
  • 2014 Texas A&M Assessment Conference
  • 2013 ISSOTL Conference
  • 2013 Higher Learning Commission Annual Conference
  • 2013 Texas A&M Assessment Conference
  • 2012 SACS/COC Annual Meeting
  • 2012 Reinvention Center Conference
  • 2012 WASC Academic Resource Conference
  • 2012 Higher Learning Commission Annual Conference
  • 2011 SACS/COC Annual Meeting
  • 2011 Higher Learning Commission Annual Conference
  • 2011 WASC Academic Resource Conference
  • 2010 SACS/COC Annual Meeting
  • 2010 WASC Academic Resource Conference
  • 2010 AACU Annual Meeting
  • 2009 ABET Best Assessment Processes Symposium XI

Links to User Experiences and Perspectives on the CAT

  • Arkansas State University - Faculty Driven Assessment
  • Montana State University - Utilizing Competing Narratives
  • Idaho State University - Assessing the Impact of Debates
  • San Francisco State University - Professional Development
  • Miyazaki International College - Active Learning
  • Texas A&M Assessment Center - Taking Assessment University-Wide

Other Useful Links

  • IDEAL Problem Solver

Publications by CAT users

  • Bielefeldt, A. R., Paterson, K. G., & Swan, C. W. (2010). Measuring the value added from service learning in project-based engineering education. International Journal of Engineering Education, 26(3), 535-546.
  • Collins, D., Davis, D., & Garbarino, J. (2017, February). Learning-at-the-bench after-school program (LAB-ASP): Impacts of research experiences for students from low-resource backgrounds. Poster presented at the National Institute for Mathematical and Biological Synthesis (NIMBioS), Knoxville, TN.
  • Gasper, B., Minchella, D., Weaver, G., Csonka, L., & Gardner, S. (2012). Adapting to Osmotic Stress and the Process of Science. Science, 335(6076), 1590-1591.
  • Grant, M., & Smith, M. (2018). Quantifying assessment of undergraduate critical thinking. Journal of College Teaching & Learning, 15(1), 27-38.
  • Grove, K., Dekens, P. S., & Dempsey, D. P. (2012, December). Sustaining professional development gains after the NSF-CCLI grant ends. In AGU Fall Meeting Abstracts (Vol. 1, p. 0728).
  • Partlow-Lefevre, S. T. (2012). Arguing for Debate: Introducing Key Components for Assessment of Intercollegiate Debate Programs. Contemporary Argumentation & Debate, 33.
  • Shannon, L., & Bennett, J. (2012). A Case Study: Applying Critical Thinking Skills to Computer Science and Technology. Information Systems Educational Journal, 10(4), 41-48.
  • Frisch, J., Jackson, P., & Murray, M. (2013). WikiED: Using web 2.0 tools to teach content and critical thinking. Journal of College Science Teaching, 43(1), 70-80.
  • Gasper, B., & Gardner, S. (2013). Engaging Students in Authentic Microbiology Research in an Introductory Biology Laboratory Course is Correlated with Gains in Student Understanding of the Nature of Authentic Research and Critical Thinking. Journal of Microbiology & Biology Education, 14(1).
  • Gill, T. G., & Ritzhaupt, A. D. (2013). Systematically evaluating the effectiveness of an information systems capstone course: Implications for practice. Journal of Information Technology Education: Research, 12(1), 69-94.
  • Goldsmith, R. E. (2013). Encouraging Critical Thinking Skills among College Students. The Exchange, 2(2).
  • Gottesman, A., & Hoskins, S. (2013). CREATE Cornerstone: Introduction to Scientific Thinking, a new course for STEM-interested freshmen, demystifies scientific thinking through analysis of scientific literature. CBE-Life Sciences Education, 12(1), 59-72.
  • Heft, I., & Scharff, L. (2017). Aligning best practices to develop targeted critical thinking skills and habits. Journal of the Scholarship of Teaching and Learning, 17(3), 48-67. https://doi.org/10.14434/v17i3.22600
  • Hudson, T., & Sipes, S. M. (2014). Developing Critical Thinking Skills in a Mixed-Signal Test and Product Engineering Course. Paper presented at the 2014 ASEE Annual Conference, Indianapolis, Indiana. https://peer.asee.org/20287
  • Primm, T., Rowe, M., Gillespie, M., Rose, L., & Shannon, L. (2013). Extraordinary Claims: An Innovative Approach to Engage Student Interest and Enhance Critical Thinking Skills in General Education Science Courses. Tested Studies for Laboratory Teaching (ABLE Proceedings, volume 33).
  • Jones, S., Harrington, K., & Scharff, L. (2013). An integrated effort to develop and assess critical thinking skills. Paper presented at the NCA Higher Learning Commission Annual Conference, Chicago. http://hlcommission.org/
  • Wertz, R., Saragih, A., Fosmire, M., Purzer, S., & Van Epps, A. (2013). An evaluation of the Critical Engineering Literacy Test (CELT) instrument through item analysis and comparison to the Critical Assessment Test (CAT). Conference paper, Illinois/Indiana ASEE Section Conference (Angola, IN).
  • Weatherton, Y., Mattingly, S., Kruzic, A., Frost, H., & Rahman, Z. (2014, June). Critical Thinking in the Curriculum: Making Better Decisions. In 121st ASEE Annual Conference and Exposition, June 15-18, 2014.
  • Perry, D. K., Retallick, M. S., & Paulsen, T. H. (2014). A critical thinking benchmark for a department of agricultural education and studies. Journal of Agricultural Education, 55(5), 207.
  • Schneider, K., Bickel, A., & Morrison-Shetlar, A. (2014). Planning and Implementing a Comprehensive Student-Centered Research Program for First-Year STEM Undergraduates. Journal of College Science Teaching, 44(3), 37-43.
  • Alvarez, C., Taylor, K., & Rauseo, N. (2015). Creating Thoughtful Salespeople: Experiential Learning to Improve Critical Thinking Skills in Traditional and Online Sales Education. Marketing Education Review, 25(3), 233-243.
  • Carson, S. (2015). Targeting critical thinking skills in a first-year undergraduate research course. Journal of Microbiology & Biology Education, 16(2), 148-156.
  • Gavin, K., & Perry, D. K. (2015, May). Utilizing Competing Narratives to Increase Critical Thinking Abilities. In American Association for Agricultural Education Annual Conference, May 19-22, 2015 (p. 326).
  • Kaupp, J., Simper, N., & Frank, B. (2015). Triangulated authentic assessment in the HEQCO Learning Outcomes Assessment Consortium. Proceedings of the Canadian Engineering Education Association.
  • Perry, D. K., Paulsen, T. H., & Retallick, M. S. (2015). The impact of a capstone farm management course on critical thinking abilities. Journal of Agricultural Education, 56(2), 13-26.
  • Rowe, M., Gillespie, M., Harris, K., Koether, S., Shannon, L., & Rose, L. (2015). Redesigning a general education science course to promote critical thinking. CBE-Life Sciences Education, 14(3).
  • Basha, S., Drane, D., & Light, G. (2016). Adapting the Critical Thinking Assessment Test for Palestinian Universities. Journal of Education and Learning, 5(2), 60.
  • Ransdell, M. (2016). Design process rubrics: Identifying and enhancing critical thinking in creative problem solving. In Proceedings of the Interior Design Educators Council Conference.
  • Thompson, M., & Jung, I. (2016). Making the Global Local: Twenty Years at Miyazaki International College, Japan. In Liberal Arts Education and Colleges in East Asia (pp. 63-73). Springer Singapore.
  • Mork, C. M., & Howard, A. M. An Investigation into Active Learning at MIC: A Beginning and the Way Forward. 比較文化 (Comparative Culture), The Journal of Miyazaki International College, 20, 67-86.
  • Al-Mazroa, S., Retallick, M., Miller, G., & Skaar, B. (2017). Assessment of Critical Thinking Skills in Undergraduate Animal Science Students and Curriculum. ProQuest Dissertations and Theses.
  • Gao, M. (2015). Using Inquiry Teaching to Promote Student Critical Thinking and Content Knowledge in a Nonmajors Biology Course. Education Journal, 4(4), 182.
  • Styers, M. L., Van Zandt, P. A., & Hayden, K. L. (2018). Active learning in flipped life science courses promotes development of critical thinking skills. CBE-Life Sciences Education, 17(3). https://doi.org/10.1187/cbe.16-11-0332
  • Chase, A., Clancy, H., Lachance, R., Mathison, B., Chiu, M., & Weaver, G. (2016). Improving critical thinking via authenticity: The CASPiE research experience in a military academy chemistry course. Chemistry Education Research and Practice, 18(1). DOI: 10.1039/C6RP00171H
  • Woodley, S. K., Freeman, P. E., & Ricketts, T. D. (2019). Combining novel research and community engaged learning in an undergraduate physiology laboratory course. Advances in Physiology Education, 43, 110-120. https://doi.org/10.1152/advan.00177.2018


