
Definition of Critical Thinking:

Description:

Critical thinking refers to the intellectual process of analyzing, evaluating, and interpreting information and arguments in a systematic and objective manner. It involves the careful examination of facts, evidence, and reasoning to form rational and well-informed judgments.

Components:

Critical thinking includes several essential components:

  • Analysis: The ability to break down complex information into its constituent parts and examine them systematically.
  • Evaluation: The capacity to assess the credibility, accuracy, and reliability of information and arguments.
  • Inference: The skill to draw logical and reasoned conclusions based on available evidence.
  • Interpretation: The aptitude to comprehend and explain the meaning and significance of information and evidence.
  • Explanation: The capability to clarify and justify one’s own thought processes and reasoning, explicitly stating the underlying assumptions and principles.
  • Self-regulation: The discipline to monitor one’s own thinking, recognizing and challenging biases, prejudices, and assumptions.
  • Open-mindedness: The willingness to consider alternative viewpoints, perspectives, and hypotheses without prejudice or preconceived notions.

Importance:

Critical thinking plays a vital role in various aspects of life, including education, personal and professional relationships, problem-solving, decision-making, and understanding complex issues. It enables individuals to think independently, make informed judgments, evaluate the reliability of information, and develop well-reasoned arguments.

Developing and applying critical thinking skills can lead to numerous benefits, such as:

  • Improved problem-solving abilities and decision-making skills.
  • Enhanced communication and argumentation skills.
  • Strengthened comprehension and interpretation of information.
  • Increased objectivity and rationality in thinking.
  • Heightened creativity and innovation.
  • Reduced vulnerability to manipulation and misinformation.
  • Greater self-awareness and personal growth.

APS

  • Teaching Tips

A Brief Guide for Teaching and Assessing Critical Thinking in Psychology

In my first year of college teaching, a student approached me one day after class and politely asked, “What did you mean by the word ‘evidence’?” I tried to hide my shock at what I took to be a very naive question. Upon further reflection, however, I realized that this was actually a good question, for which the usual approaches to teaching psychology provided too few answers. During the next several years, I developed lessons and techniques to help psychology students learn how to evaluate the strengths and weaknesses of scientific and nonscientific kinds of evidence and to help them draw sound conclusions. It seemed to me that learning about the quality of evidence and drawing appropriate conclusions from scientific research were central to teaching critical thinking (CT) in psychology.

In this article, I have attempted to provide guidelines to psychology instructors on how to teach CT, describing techniques I developed over 20 years of teaching. More importantly, the techniques and approach described below are supported by scientific research. Classroom examples illustrate the use of the guidelines and how assessment can be integrated into CT skill instruction.

Overview of the Guidelines

Confusion about the definition of CT has been a major obstacle to teaching and assessing it (Halonen, 1995; Williams, 1999). To deal with this problem, we have defined CT as reflective thinking involved in the evaluation of evidence relevant to a claim so that a sound or good conclusion can be drawn from the evidence (Bensley, 1998). One virtue of this definition is that it can be applied to many thinking tasks in psychology. The claims and conclusions psychological scientists make include hypotheses, theoretical statements, interpretations of research findings, and diagnoses of mental disorders. Evidence can be the results of an experiment, case study, naturalistic observation study, or psychological test. Less formally, evidence can be anecdotes, introspective reports, commonsense beliefs, or statements of authority. Evaluating evidence and drawing appropriate conclusions, along with other skills such as distinguishing arguments from nonarguments and finding assumptions, are collectively called argument analysis skills. Many CT experts take argument analysis skills to be fundamental CT skills (e.g., Ennis, 1987; Halpern, 1998). Psychology students need argument analysis skills to evaluate psychological claims in their work and in everyday discourse.

Some instructors expect that their students will improve CT skills like argument analysis skills simply by immersing them in challenging course work. Others expect improvement because they use a textbook with special CT questions or modules, give lectures that critically review the literature, or have students complete written assignments. While these and other traditional techniques may help, a growing body of research suggests they are not sufficient to efficiently produce measurable changes in CT skills. Our research on acquisition of argument analysis skills in psychology (Bensley, Crowe, Bernhardt, Buckner, & Allman, in press) and on critical reading skills (Bensley & Haynes, 1995; Spero & Bensley, 2009) suggests that more explicit, direct instruction of CT skills is necessary. These results concur with the results of an earlier review of CT programs by Chance (1986) and a recent meta-analysis by Abrami et al. (2008).

Based on these and other findings, the following guidelines describe an approach to explicit instruction in which instructors can directly infuse CT skills and assessment into their courses. With infusion, instructors can use relevant content to teach CT rules and concepts along with the subject matter. Directly infusing CT skills into course work involves targeting specific CT skills; making CT rules, criteria, and methods explicit; providing guided practice in the form of exercises focused on assessing skills; and giving feedback on practice and assessments. These components are similar to ones found in effective, direct instruction approaches (Walberg, 2006). They also resemble approaches to teaching CT proposed by Angelo (1995), Beyer (1997), and Halpern (1998). Importantly, this approach has been successful in teaching CT skills in psychology (e.g., Bensley et al., in press; Bensley & Haynes, 1995; Nieto & Saiz, 2008; Penningroth, Despain, & Gray, 2007). Directly infusing CT skill instruction can also enrich content instruction without sacrificing learning of subject matter (Solon, 2007). The following seven guidelines, illustrated by CT lessons and assessments, explicate this process.

Seven Guidelines for Teaching and Assessing Critical Thinking

1. Motivate your students to think critically

Critical thinking takes effort. Without proper motivation, students are less inclined to engage in it. Therefore, it is good to arouse interest right away and foster commitment to improving CT throughout a course. One motivational strategy is to explain why CT is important to effective, professional behavior. Often, telling a compelling story that illustrates the consequences of failing to think critically can motivate students. For example, the tragic death of 10-year-old Candace Newmaker at the hands of her therapists practicing attachment therapy illustrates the perils of using a therapy that has not been supported by good empirical evidence (Lilienfeld, 2007).

Instructors can also pique interest by taking a class poll posing an interesting question on which students are likely to have an opinion. For example, asking students how many think that the full moon can lead to increases in abnormal behavior can be used to introduce the difference between empirical fact and opinion or commonsense belief. After asking students how psychologists answer such questions, instructors might go over the meta-analysis of Rotton and Kelly (1985). Their review found that almost all of the 37 studies they reviewed showed no association between the phase of the moon and abnormal behavior, with only a few, usually poorly controlled, studies supporting it. The effect size over all studies was very small (.01). Instructors can use this to illustrate how psychologists draw a conclusion based on the quality and quantity of research studies as opposed to what many people commonly believe. For other interesting thinking errors and misconceptions related to psychology, see Bensley (1998; 2002; 2008), Halpern (2003), Ruscio (2006), Stanovich (2007), and Sternberg (2007).

Attitudes and dispositions can also affect motivation to think critically. If students lack certain CT dispositions such as open-mindedness, fair-mindedness, and skepticism, they will be less likely to think critically even if they have CT skills (Halpern, 1998). Instructors might point out that even great scientists noted for their powers of reasoning sometimes fail to think critically when they are not disposed to use their skills. For example, Alfred Russel Wallace, who used his considerable CT skills to help develop the concept of natural selection, also believed in spiritualistic contact with the dead. Despite considerable evidence that mediums claiming to contact the dead were really faking such contact, Wallace continued to believe in it (Bensley, 2006). Likewise, the great American psychologist William James, whose reasoning skills helped him develop the seeds of important contemporary theories, believed in spiritualism despite evidence to the contrary.

2. Clearly state the CT goals and objectives for your class

Once students are motivated, the instructor should focus them on what skills they will work on during the course. The APA task force on learning goals and objectives for psychology listed CT as one of 10 major goals for students (Halonen et al., 2002). Under critical thinking, the task force further specified outcomes such as evaluating the quality of information, identifying and evaluating the source and credibility of information, and recognizing and defending against thinking errors and fallacies. Instructors should publish goals like these in the CT course objectives in their syllabi and, more specifically, as assignment objectives in their assignments. Given the pragmatic penchant of students for studying what is needed to succeed in a course, this should help motivate and focus them.

To make instruction efficient, course objectives and lesson objectives should explicitly target CT skills to be improved. Objectives should specify the behavior that will change in a way that can be measured. A course objective might read, “After taking this course, you will be able to analyze arguments found in psychological and everyday discussions.” When the goal of a lesson is to practice and improve specific microskills that make up argument analysis, an assignment objective might read, “After successfully completing this assignment, you will be able to identify different kinds of evidence in a psychological discussion.” Or another might read, “After successfully completing this assignment, you will be able to distinguish arguments from nonarguments.” Students might demonstrate they have reached these objectives by showing the behavior of correctly labeling the kinds of evidence presented in a passage or by indicating whether an argument or merely a claim has been made. By stating objectives in the form of assessable behaviors, the instructor can test these as assessment hypotheses.

Sometimes when the goal is to teach students how to decide which CT skills are appropriate in a situation, the instructor may not want to identify specific skills. Instead, a lesson objective might read, “After successfully completing this assignment, you will be able to decide which skills and knowledge are appropriate for critically analyzing a discussion in psychology.”

3. Find opportunities to infuse CT that fit content and skill requirements of your course

To improve their CT skills, students must be given opportunities to practice them. Different courses present different opportunities for infusion and practice. Stand-alone CT courses usually provide the most opportunities to infuse CT. For example, the Frostburg State University Psychology Department has a senior seminar called “Thinking like a Psychologist” in which students complete lessons giving them practice in argument analysis, critical reading, critically evaluating information on the Internet, distinguishing science from pseudoscience, applying their knowledge and CT skills in simulations of psychological practice, and other activities.

In more typical subject-oriented courses, instructors must find specific content and types of tasks conducive to explicit CT skill instruction. For example, research methods courses present several opportunities to teach argument analysis skills. Instructors can have students critically evaluate the quality of evidence provided by studies using different research methods and designs they find in PsycINFO and Internet sources. This, in turn, could help students write better critical evaluations of research for research reports.

A cognitive psychology teacher might assign a critical evaluation of the evidence on an interesting question discussed in textbook literature reviews. For example, students might evaluate the evidence relevant to the question of whether people have flashbulb memories, such as accurately remembering the 9-11 attack. This provides the opportunity to teach them that many of the studies, although informative, are quasi-experimental and cannot show causation. Or, students might analyze the arguments in a TV program such as the fascinating Nova program Kidnapped by Aliens, on people who recall having been abducted by aliens.

4. Use guided practice, explicitly modeling and scaffolding CT

Guided practice involves modeling and supporting the practice of target skills, and providing feedback on progress towards skill attainment. Research has shown that guided practice helps students more efficiently acquire thinking skills than unguided and discovery approaches (Mayer, 2004).

Instructors can model the use of CT rules, criteria, and procedures for evaluating evidence and drawing conclusions in many ways. They could provide worked examples of problems, writing samples displaying good CT, or real-world examples of good and bad thinking found in the media. They might also think out loud as they evaluate arguments in class to model the process of thinking.

To help students learn to use complex rules in thinking, instructors should initially scaffold student thinking. Scaffolding involves providing product guidelines, rules, and other frameworks to support the process of thinking. Table 1 shows guidelines like those found in Bensley (1998) describing nonscientific kinds of evidence that can support student efforts to evaluate evidence in everyday psychological discussions. Likewise, Table 2 provides guidelines like those found in Bensley (1998) and Wade and Tavris (2005) describing various kinds of scientific research methods and designs that differ in the quality of evidence they provide for psychological arguments.

In the cognitive lesson on flashbulb memory described earlier, students use the framework in Table 2 to evaluate the kinds of evidence in the literature review. Table 1 can help them evaluate the kinds of evidence found in the Nova video Kidnapped by Aliens. Specifically, they could use it to contrast scientific authority with less credible authority. The video includes statements by scientific authorities like Elizabeth Loftus, based on her extensive research, contrasted with the nonscientific authority of Bud Hopkins, an artist turned hypnotherapist and author of popular books on alien abduction. Loftus argues that the memories of alien abduction in the children interviewed by Hopkins were reconstructed around the suggestive interview questions he posed. Therefore, Hopkins's conclusion that the children and other people in the video were recalling actual abduction experiences was based on anecdotes, unreliable self-reports, and other weak evidence.

Modeling, scaffolding, and guided practice are especially useful in helping students first acquire CT skills. After sufficient practice, however, instructors should fade these and have students do more challenging assignments without these supports to promote transfer.

5. Align assessment with practice of specific CT skills

Test questions and other assessments of performance should be similar to practice questions and problems in the skills targeted but differ in content. For example, we have developed a series of practice and quiz questions about the kinds of evidence found in Table 1 used in everyday situations but which differ in subject matter from practice to quiz. Likewise, other questions employ research evidence examples corresponding to Table 2. Questions ask students to identify kinds of evidence, evaluate the quality of the evidence, distinguish arguments from nonarguments, and find assumptions in the examples with practice examples differing in content from assessment items.

6. Provide feedback and encourage students to reflect on it

Instructors should focus feedback on the degree of attainment of CT skill objectives in the lesson or assessment. The purpose of feedback is to help students learn how to correct faulty thinking so that in the future they monitor their thinking and avoid such problems. This should increase their metacognition or awareness and control of their thinking, an important goal of CT instruction (Halpern, 1998).

Students must use their feedback for it to improve their CT skills. In the CT exercises and critical reading assignments, students receive feedback in the form of corrected responses and written feedback on open-ended questions. They should be advised that paying attention to feedback on earlier work and assessments should improve their performance on later assessments.

7. Reflect on feedback and assessment results to improve CT instruction

Instructors should use the feedback they provide to students and the results of ongoing assessments to ‘close the loop,’ that is, use these outcomes to address deficiencies in performance and improve instruction. In actual practice, teaching and assessment strategies rarely work optimally the first time. Instructors must be willing to tinker with these to make needed improvements. Reflection on reliable and valid assessment results provides a scientific means to systematically improve instruction and assessment.

Instructors may find the direct infusion approach as summarized in the seven guidelines to be efficient, especially in helping students acquire basic CT skills, as research has shown. They may especially appreciate how it allows them to take a scientific approach to the improvement of instruction. Although the direct infusion approach seems to efficiently promote acquisition of CT skills, more research is needed to find out whether students transfer their skills outside of the classroom or whether this approach needs adjustment to promote transfer.

Table 1. Strengths and Weaknesses of Nonscientific Sources and Kinds of Evidence

Table 2. Strengths and Weaknesses of Scientific Research Methods/Designs Used as Sources of Evidence

Abrami, P.C., Bernard, R.M., Borokhovski, E., Wade, A., Surkes, M.A., Tamim, R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research, 78(4), 1102–1134.

Angelo, T.A. (1995). Classroom assessment for critical thinking. Teaching of Psychology, 22(1), 6–7.

Bensley, D.A. (1998). Critical thinking in psychology: A unified skills approach. Pacific Grove, CA: Brooks/Cole.

Bensley, D.A. (2002). Science and pseudoscience: A critical thinking primer. In M. Shermer (Ed.), The Skeptic encyclopedia of pseudoscience. (pp. 195–203). Santa Barbara, CA: ABC–CLIO.

Bensley, D.A. (2006). Why great thinkers sometimes fail to think critically. Skeptical Inquirer, 30, 47–52.

Bensley, D.A. (2008). Can you learn to think more like a psychologist? The Psychologist, 21, 128–129.

Bensley, D.A., Crowe, D., Bernhardt, P., Buckner, C., & Allman, A. (in press). Teaching and assessing critical thinking skills for argument analysis in psychology. Teaching of Psychology.

Bensley, D.A., & Haynes, C. (1995). The acquisition of general purpose strategic knowledge for argumentation. Teaching of Psychology, 22, 41–45.

Beyer, B.K. (1997). Improving student thinking: A comprehensive approach. Boston: Allyn & Bacon.

Chance, P. (1986). Thinking in the classroom: A review of programs. New York: Teachers College Press.

Ennis, R.H. (1987). A taxonomy of critical thinking dispositions and abilities. In J.B. Baron & R.J. Sternberg (Eds.), Teaching thinking skills: Theory and practice (pp. 9–26). New York: Freeman.

Halonen, J.S. (1995). Demystifying critical thinking. Teaching of Psychology, 22, 75–81.

Halonen, J.S., Appleby, D.C., Brewer, C.L., Buskist, W., Gillem, A.R., Halpern, D.F., et al. (APA Task Force on Undergraduate Major Competencies). (2002). Undergraduate psychology major learning goals and outcomes: A report. Washington, DC: American Psychological Association. Retrieved August 27, 2008, from http://www.apa.org/ed/pcue/reports.html.

Halpern, D.F. (1998). Teaching critical thinking for transfer across domains: Dispositions, skills, structure training, and metacognitive monitoring. American Psychologist, 53, 449–455.

Halpern, D.F. (2003). Thought and knowledge: An introduction to critical thinking (3rd ed.). Mahwah, NJ: Erlbaum.

Lilienfeld, S.O. (2007). Psychological treatments that cause harm. Perspectives on Psychological Science, 2, 53–70.

Mayer, R.E. (2004). Should there be a three-strikes rule against pure discovery learning? The case for guided methods of instruction. American Psychologist, 59, 14–19.

Nieto, A.M., & Saiz, C. (2008). Evaluation of Halpern’s “structural component” for improving critical thinking. The Spanish Journal of Psychology, 11(1), 266–274.

Penningroth, S.L., Despain, L.H., & Gray, M.J. (2007). A course designed to improve psychological critical thinking. Teaching of Psychology, 34, 153–157.

Rotton, J., & Kelly, I. (1985). Much ado about the full moon: A meta-analysis of lunar-lunacy research. Psychological Bulletin, 97, 286–306.

Ruscio, J. (2006). Critical thinking in psychology: Separating sense from nonsense. Belmont, CA: Wadsworth.

Solon, T. (2007). Generic critical thinking infusion and course content learning in introductory psychology. Journal of Instructional Psychology, 34(2), 972–987.

Stanovich, K.E. (2007). How to think straight about psychology (8th ed.). Boston: Pearson.

Sternberg, R.J. (2007). Critical thinking in psychology: It really is critical. In R.J. Sternberg, H.L. Roediger, & D.F. Halpern (Eds.), Critical thinking in psychology (pp. 289–296). Cambridge, UK: Cambridge University Press.

Wade, C., & Tavris, C. (2005). Invitation to psychology (3rd ed.). Upper Saddle River, NJ: Prentice Hall.

Walberg, H.J. (2006). Improving educational productivity: A review of extant research. In R. F. Subotnik & H. J. Walberg (Eds.), The scientific basis of educational productivity (pp. 103–159). Greenwich, CT: Information Age.

Williams, R.L. (1999). Operational definitions and assessment of higher-order cognitive constructs. Educational Psychology Review, 11, 411–427.


About the Author

D. Alan Bensley is Professor of Psychology at Frostburg State University. He received his Master’s and PhD degrees in cognitive psychology from Rutgers University. His main teaching and research interests concern the improvement of critical thinking and other cognitive skills. He coordinates assessment for his department and is developing a battery of instruments to assess critical thinking in psychology. He can be reached by email at [email protected].

Association for Psychological Science, December 2010 — Vol. 23, No. 10


Critically exploring psychology


2 Critical thinking in psychology

The central task for psychology is to try to explain human behaviour and experience; that is, to explain all the things that people do, think, feel and say. However, psychology is not restricted to human behaviour; it includes non-human animals in its field of study too. But human and non-human behaviour and experience are amazingly diverse and frequently complex. Consequently, researchers in psychology have developed a wide range of different methods to help them understand this vast topic. Indeed, of all the human and social sciences, psychology probably uses a wider variety of research methods than any other discipline.

Within the field of psychology, it is important to engage in some critical evaluation of the methodology and methods that can be used. At the very least, psychologists need to identify some strengths and limitations of their research.

In other words,

  • how do I know what I know?
  • what approach can be used to get there?
  • what’s the best approach to get there?

Strong research comes out of sustained critical reflexive evaluation of what you are doing. A piece of research in which the researcher does not show awareness of competing epistemological/methodological perspectives (something you will look at in more detail in Section 3), and where they are insufficiently critical of their research, is unlikely to be considered a strong piece of research. It is particularly important for researchers to engage in critical discussion about their epistemological and methodological commitments. Before this, though, you will consider in more detail what critical thinking is.


Guy P. Harrison

Why the Clock Counts with Critical Thinking

Timing matters when it comes to accepting extraordinary claims.

Posted September 24, 2023 | Reviewed by Devon Frye

  • Embracing a strong belief in the right thing at the wrong time is a deceptive victory.
  • Strange possibilities should not be ruled out.
  • The core power and most exciting aspect of science is not what we know now, but what we might learn next.
  • The best we can do is strive to be correct according to the best evidence available now.


It may seem counterintuitive, but being correct in the long run is not the only consideration when it comes to extraordinary claims. It’s a detail often missed, but when one decides to accept or believe something matters—even if the belief eventually proves true.

This is important to recognize because embracing a strong belief in the right thing at the wrong time is a deceptive victory. It can encourage overconfidence in unreliable hunches and obscure flawed and dangerous thinking processes, all of which are likely to create problems throughout life.

Consider the factor of timing regarding UFOs. Anyone who “knows” today that some of them are extraterrestrial visitors has had their mind probed and abducted by an irrational belief because there is nothing close to credible confirmation for it. But what if aliens were to land on the rooftop of the United Nations building tomorrow and confess that they have been buzzing us for decades?

UFO believers would say, “told you so,” and deservedly so. But their prior position still would have been the result of extraordinarily poor thinking skills. And those skills won’t improve without a personal reckoning that includes acknowledging the significance of timing and a new commitment to thinking before believing.

It would be no different if Bigfoot were captured or a quirk of quantum physics proved the claims of homeopathy. Feelings of vindication aside, the unjustified embrace of an extremely dubious position that later turns out to be correct is not much more impressive than that of a broken clock being precisely accurate twice per day. A supervolcano might choke out civilization next year, but it wouldn’t mean the guy on a street corner yelling, “The end is near,” knew what he was talking about.

Some will argue that being proven right over time is enough, regardless of how unjustified the conclusion or belief once was. But this ignores the dangers of habitual sloppy thinking. If skepticism and quality of evidence are unimportant for one claim, then what is the standard for others? If one believes the Apollo Moon landings were faked, why not trust a chiropractor to treat a serious health issue? If reflexology is valid, why not Assyrian haruspicy, too? Where does it end? Sadly, of course, there is no end for some who seem to live almost entirely in a state of cognitive chaos.

To help premature believers, advocates of critical thinking might add the role of timing to their list of essential talking points. I consistently emphasize to others that the safer and more efficient way to mentally navigate the world is to consistently side with the best knowledge currently available—and be prepared to change course the moment new evidence demands it. I also make a point to concede that a given extraordinary and unlikely claim could be true, but quickly add that it doesn’t matter if currently there are no good reasons to believe it.

I understand that this burden of waiting for sufficient evidence can be inconvenient or uncomfortable, but it is crucial when it comes to important and unusual claims. There are exceptions, of course. Sometimes the stakes are high, there is legitimate urgency, and a hunch is all you have. For example, if I’m walking in a dark alley and someone in the shadows appears to be waving a knife and seems to be whispering something about my wallet, I’m running and not hanging around for scientific confirmation. In most cases, however, we have the luxury of waiting to see if good evidence ever arrives.

Drawing attention to this timing component of critical thinking is not a blanket rejection of fringe ideas. It is important to consider unlikely things and maintain appropriate humility before strange possibilities. The core power and most exciting aspect of science is not what we know now, but what we might learn next. A nagging intuition, compelling flash of insight, or gut feeling can be a fruitful starting point toward spectacular discovery.

But the hunch itself is not enough, and certainly should not be the endpoint. For example, my love of science fiction and the compressed version of the Drake Equation that lives in my head biases me with a strong inclination to think that we are not alone in a universe with this much time, space, matter, and energy. But until SETI holds the greatest press conference in history, it would be an appalling breach of reason if I were to take any stance other than “I don’t know.” The critical-thinking clock is clear on this. It’s too early to be sure.


An important technical point is that waiting for sufficient evidence is not an absolute denial of the claim. Neither is it a sign of being closed-minded, the standard cheap shot lobbed at critical thinkers. I suppose it can feel like a contradiction, but good thinking demands that we consider anything and doubt everything.

The late astronomer Carl Sagan mentioned this in his book The Demon-Haunted World: “As I’ve tried to stress, at the heart of science is an essential balance between two seemingly contradictory attitudes—an openness to new ideas, no matter how bizarre or counterintuitive, and the most ruthlessly skeptical scrutiny of all ideas, old and new. This is how deep truths are winnowed from deep nonsense.”

I have learned from experience that openly noting the possibility of improbable things can aid communication between believer and skeptic. I readily admit that giant primates and interstellar visitors are not impossible, only that declaring them to be real phenomena right now is a problem. It demonstrates the same kind of muddled judgment that leads people into dangerous medical quackery, financial scams, predatory organizations, and destructive political loyalties.

The best we can do is strive to be correct according to the best evidence available now. Mind the clock and keep steering toward the best current version of reality. Take positions that are most reasonable today. We can always change our minds tomorrow if the aliens land and say hello.


Guy P. Harrison is the author of Think: Why You Should Question Everything.


Heliyon, 7(12), December 2021

Psychology students' attitudes towards research: the role of critical thinking, epistemic orientation, and satisfaction with research courses

Miguel Landa-Blanco

a Universidad Internacional Iberoamericana, Mexico

b Universidad Nacional Autónoma de Honduras, Honduras

Antonio Cortés-Ramos

c Universidad de Málaga, Spain

Associated Data

Data will be made available on request.

The current study aimed to determine how attitudes towards research are related to epistemic orientation, critical thinking, and satisfaction with research courses in psychology university students. Control variables included respondents' gender, current academic degree (undergraduate or postgraduate), number of research methods courses completed, number of research projects completed, and academic score. A quantitative, cross-sectional design was used, with a non-probabilistic sample size of 137 students. Correlational findings suggest that students with high scores in critical thinking domains and in empiric and rational dispositions tend to achieve higher academic grades. Rationality and reflexive skepticism were related to the number of research projects completed by the student, while an intuitive disposition was inversely related to academic scores and the number of research courses completed. Results from a hierarchical linear regression model suggest that attitudes towards research are significantly and positively affected by students' satisfaction with research courses, empiric epistemic orientation, and critical openness. On the other hand, an intuitive epistemic orientation has significant detrimental effects on attitudes towards research. Rational epistemic orientation and skeptic reflexiveness yielded non-significant coefficients. Overall, the model containing all independent variables accounted for 47.4% of the variance in attitudinal scores; this constitutes a large effect size. Results are discussed in light of previous research and their implications for the teaching of psychology in higher education.

Keywords: Scientific attitudes, Critical thinking, Epistemology, Student research.

1. Introduction

Attitudes are defined as a cognitive preference and behavioral predisposition towards an object, thus resulting in a favorable or unfavorable evaluation regarding a specific stimulus ( Eagly and Chaiken, 1993 ). Attitudes play an important role in predicting behavior ( Glasman and Albarracín, 2006 ), and consequently are a recurrent topic in educational and psychological studies. The present article will focus specifically on psychology students' attitudes towards research.

Research skills play an important role in higher education (Lambie et al., 2014) and the psychological sciences (Veilleux and Chapman, 2017). In higher education, specific competencies within psychology include the epistemic comprehension of science, critical scientific thinking, as well as the capability to design, execute and understand research (American Psychological Association, 2011). However, on many occasions, psychology students dislike research methods courses (Ciarocco et al., 2012). This might be because students perceive a disconnection between research course content and its applicability to their professional field. A semantic analysis found that university students tend to consider psychology a science, but less so than the natural sciences. Moreover, the terms "psychology" and "science" were semantically linked by concepts related to research (Richardson and Lacroix, 2021). Additionally, undergraduate psychology students tend to be more interested in practitioner activities than in scientific/research activities (Holmes, 2014).

Students report several factors that dissuade them from doing research; these include the perception that research activities are time-consuming and difficulties associated with the lack of mentorship and funding (AlGhamdi et al., 2014; Siemens et al., 2010). Instructors of research methods classes often report that students have negative attitudes towards and disinterest in such courses (Gurung and Stoa, 2020). In part, attitudes towards research can be explained by variables such as research anxiety, the perceived importance and usefulness students attribute to research, and believing that research has an unbiased nature (Gredig and Bartelsen-Raemy, 2018). In this last regard, it is important to consider students' epistemic orientation.

Epistemic orientation refers to the individuals' preferences on how to gain and use knowledge ( Silva Palma et al., 2018 ). One taxonomy of epistemic orientations identifies three main preferences ( Royce, 1975 ; Silva Palma et al., 2018 ; Wilkinson and Migotsky, 1994 ): intuitive, rational, and empirical. The intuitive orientation assumes that knowledge is subjective and might be attained through metaphors and symbolisms. On the other hand, a rational orientation uses logic to evaluate arguments as true or false. An empiric orientation assumes that knowledge can only be attained through structured observations and experimentation. Science is greatly based on a combination of rational and empiric orientations.

Critical thinking is the process in which a person elaborates conclusions based on evidence (Wallmann and Hoover, 2012), focusing on argumentation and reasoning. This requires synthetic and introspective skills, skepticism, openness to new arguments or evidence, evaluating different options and their ramifications, dialogical thinking, self-questioning, self-monitoring, and self-criticism (Garrett and Cutting, 2017; Reznitskaya and Sternberg, 2012; Sosu, 2013; Sternberg, 1987). Critical thinking is an essential element of scientific thinking (Shargel and Twiss, 2019), and an essential skill in the academic formation of psychologists. Consequently, students are, ideally, trained to admit the role of randomness, evaluate the methodological quality of arguments, understand the differences between correlation and causality, acknowledge the complex and multicausal nature of events, and understand the importance of falsification (Lawson, 1999; Lawson et al., 2015). Therefore, it is evident that there is a link between critical thinking and research within the psychological sciences (Meltzoff and Cooper, 2018).

Recent studies have found that research and statistics courses may enhance students' knowledge of the topic without increasing their interest ( Sizemore and Lewandowski, 2009 ). Specifically, teachers play an important role in developing students' research competencies, including its attitudinal component ( Udompong et al., 2014 ). Students' satisfaction with university courses is related to teaching quality and expertise ( Green et al., 2015 ). As such, it is vital to determine the role satisfaction with research courses plays in students' attitudes towards research.

The National Autonomous University of Honduras (UNAH) offers psychology programs in undergraduate (BA) and postgraduate (master's) degrees. The undergraduate program consists of 45 courses, of which 4 are mandatory-sequential Research Methods classes ( UNAH, 2019 ). By the end of the degree, students are expected to be competent in elaborating research proposals, literature reviews, the basic design of quantitative and qualitative instruments, applying descriptive and basic inferential statistics, and writing technical reports. On the other hand, the postgraduate degree has 18 compulsory courses, of which 4 are mandatory research classes ( UNAH, 2021 ). Their content is thesis-oriented, as it is a graduation requirement for the postgraduate programs of the UNAH.

Considering this, the purpose of our exploratory study was to test the following hypothesis: attitudes towards research are related to epistemic orientation, critical thinking, and satisfaction with research courses in psychology university students of Honduras, while controlling for respondents' gender, current academic degree (undergraduate or postgraduate), number of research methods courses completed, number of research projects completed, and academic score.

2. Method

2.1. Participants

The current study included students in the final year of their bachelor's degree and students enrolled in a master's degree psychology program at a public university in Honduras. The sample was collected online through a non-probabilistic approach using volunteer and snowball sampling. Due to the COVID-19 pandemic, all university courses were held exclusively online. Considering this, invitations to participate in the study were sent via email to all 603 undergraduate students taking final-year classes and internships. Similarly, email invitations were sent to all 62 master's degree students. However, due to low response rates, students who completed the survey were also asked to forward the email invitation to fellow students.

This resulted in a final sample size of 137 participants, of which 75.91% (n = 104) were undergraduate students, accounting for 17.24% of the population of undergraduate students. On the other hand, 24.09% (n = 33) were enrolled in a master's degree, representing 53.22% of the postgraduate population. Most respondents (n = 113; 82.48%) were female, while male students only accounted for 17.52% of the total sample (n = 24). The gender distribution in the sample is coherent with the population's demographic characteristics, in which 76.24% are female students, and 23.76% are male (National Autonomous University of Honduras, 2021).

The mean academic score was 83.75% (SD = 7.11); this represents the weighted average of all academic courses completed by the students. Students had completed an average of 5.08 research courses (SD = 1.96) and participated in an average of 4.18 research projects (SD = 2.86). The mean age of respondents was 28.20 years (SD = 7.61). Specifically, undergraduate students had a mean age of 26 years (SD = 5.46), while master's degree students had a mean age of 35.12 years (SD = 9.23).

2.2. Variables and measures

2.2.1. Attitudes towards research

Data was collected using the Attitudes Towards Research Scale-Revised (EACIN-R) ( Aldana de Becerra et al., 2020 ), a revised version of the original EACIN ( Aldana de Becerra et al., 2016 ). It consists of 28 items, with a five-point Likert-type response set, with scores varying from 1 (completely disagree) to 5 (completely agree), with higher scores indicating more favorable attitudes towards research. Some items included in the EACIN-R are: "All professionals should know how to do research", "I do not believe research should be taught at universities" and "I am interested in doing research activities". As measured by Cronbach's Alpha, the internal consistency for this sample was 0.89, 95% CI [0.86; 0.91].
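The internal consistency estimate reported above can be reproduced from an items-by-respondents score matrix. The snippet below is a minimal sketch of Cronbach's alpha in Python; the simulated responses and names are hypothetical stand-ins, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of items
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 137 respondents answering 28 five-point Likert items
rng = np.random.default_rng(0)
simulated = rng.integers(1, 6, size=(137, 28))
print(round(cronbach_alpha(simulated), 2))  # random data gives alpha near 0; real responses are needed for a meaningful estimate
```

A bootstrap over respondents would yield the kind of 95% confidence interval reported for the EACIN-R.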

2.2.2. Epistemic orientation

The Epistemic Orientation Short Scale (EOSS) consists of 11 items with a five-point Likert-type response set, with scores varying from 1 (completely disagree) to 5 (completely agree), with higher scores indicating a more prevalent epistemic orientation. The EOSS measures the following subscales: rationalism ( α = 0.71), intuitivism ( α = 0.77), and empiricism ( α = 0.72) ( Silva Palma et al., 2018 ). The current study determined the internal consistency coefficients for each dimension: rationalism, α = 0.83, 95% CI [0.78; 0.87]; intuitivism, α = 0.65, 95% CI [0.54; 0.73]; empiricism α = 0.64, 95% CI [0.52; 0.73]. Some items from the EOSS include: "My opinions are commonly based on feelings and intuitions" (intuitivism), "I tend to make decisions based on reasons I can clearly explain" (rationalism), and "I tend to make decisions based on my experiences and practical situations" (empiricism).

2.2.3. Critical thinking

The Critical Thinking Disposition Scale (CTDS) is an 11-item instrument with a five-point Likert-type response set, with scores varying from 1 (completely disagree) to 5 (completely agree), with higher scores indicating higher self-reported critical thinking disposition. The CTDS has a bi-dimensional structure consisting of two factors: critical openness and reflective skepticism. Previous research reported an overall Cronbach's alpha of 0.81 ( Sosu, 2013 ), similar to the one found in the current study, α = 0.86, 95% CI [0.82; 0.89]. Some items included in the CTDS are: "I sometimes find a good argument that challenges some of my firmly held beliefs" (Critical Openness) and "I usually check the credibility of the source of information before making judgments" (reflective skepticism).

2.2.4. Satisfaction with University Research Courses

The authors of the current study elaborated the Satisfaction with University Research Courses Scale (SURCS). Items were built by the authors and later sent to three Research Methods university professors who revised the wording and validity of every item. The experts rated each question on a 5-point scale according to their importance, pertinence, and wording; items with low scores were rephrased according to the experts' opinions. The final version of the SURCS consists of 12 Likert-type items with a five-point response set, with scores varying from 1 (completely disagree) to 5 (completely agree). Higher scores indicate higher satisfaction with university research courses. The items reflect course content-related satisfaction, teacher satisfaction, perceived importance of the Research Methods courses, and personal satisfaction with such courses.

The instrument had an overall internal reliability of 0.91, 95% CI [0.89; 0.93], and the average inter-item correlation was 0.48, 95% CI [0.41; 0.54]. Table 1 details the reliability for each item included in the SURCS. Some of the items included in the SURCS are: "I enjoyed taking the Research Methods courses", "I believe my teachers of Research Methods courses had plenty of experience as researchers", "I believe the content of the Research Methods courses is relevant".

Table 1

Item reliability for the satisfaction with University Research Courses Scale.

Note. Item 5 was inversely recoded.

2.2.5. Demographic and educational questionnaire

Additional demographic and educational data were collected through a questionnaire that gathered information regarding respondents' gender (0 = male, 1 = female), age, current academic degree (0 = undergraduate, 1 = postgraduate), number of research methods courses completed, number of research projects completed, and self-reported academic grade. On this last point, students were asked to enter the academic grade as reported in their official university online certification. The academic grade is a score that ranges between 0 and 100.
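As a small illustration of the coding scheme described above, the categorical variables could be dummy-coded as follows; the data frame and column labels are hypothetical placeholders, not the study's actual export.

```python
import pandas as pd

# Hypothetical raw responses; column labels are placeholders, not the study's export
df = pd.DataFrame({
    "gender": ["female", "male", "female"],
    "degree": ["undergraduate", "postgraduate", "undergraduate"],
    "grade": [83.0, 91.5, 78.2],
})

# Dummy-code the categorical variables as described: 0 = male, 1 = female;
# 0 = undergraduate, 1 = postgraduate
df["gender"] = df["gender"].map({"male": 0, "female": 1})
df["degree"] = df["degree"].map({"undergraduate": 0, "postgraduate": 1})

# Self-reported academic grade stays on its original 0-100 scale
assert df["grade"].between(0, 100).all()
print(df)
```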

2.3. Data analysis

Items were averaged to determine the total for each scale. An exploratory correlational analysis, using Pearson's r, was used to assess inter-variable dynamics. Comparisons between undergraduate and postgraduate students were made using Student's t-test; a power analysis with its corresponding confidence intervals was also conducted. Later, a hierarchical linear regression model was used to explain the scores students achieved on the Attitudes Towards Research Scale-Revised (EACIN-R). The independent variables tested included: EOSS-rational, EOSS-intuitive, EOSS-empiric, CTDS-critical openness, CTDS-reflexive skepticism, and satisfaction with research courses, while controlling for gender, current academic degree, number of research methods courses completed, number of research projects completed, and academic grade. A post-hoc analysis was used to determine the achieved power of the regression model. An α = 0.05 was used as a significance threshold. Participants were required to answer all items; therefore, no missing data were included in the study. All statistical analyses were made using JASP (JASP Team, 2020).
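The analyses were run in JASP; for readers who script their statistics, a hierarchical regression with an R²-change (F-change) test can be sketched in Python with statsmodels. The column names and simulated data below are hypothetical stand-ins for the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data; column names are hypothetical, not the study's file
rng = np.random.default_rng(0)
n = 137
df = pd.DataFrame({
    "gender": rng.integers(0, 2, n),
    "degree": rng.integers(0, 2, n),
    "n_courses": rng.integers(1, 9, n),
    "n_projects": rng.integers(0, 11, n),
    "grade": rng.normal(84, 7, n),
    "eoss_rational": rng.normal(3.9, 0.7, n),
    "eoss_intuitive": rng.normal(3.4, 0.8, n),
    "eoss_empiric": rng.normal(4.1, 0.6, n),
    "ctds_openness": rng.normal(4.2, 0.5, n),
    "ctds_skepticism": rng.normal(4.3, 0.7, n),
    "satisfaction": rng.normal(4.0, 0.7, n),
})
df["attitudes"] = rng.normal(3.9, 0.5, n)

# Step 1: control variables only
base = smf.ols("attitudes ~ gender + degree + n_courses + n_projects + grade", data=df).fit()

# Step 2: add epistemic orientation, critical thinking, and course satisfaction
full = smf.ols(
    "attitudes ~ gender + degree + n_courses + n_projects + grade"
    " + eoss_rational + eoss_intuitive + eoss_empiric"
    " + ctds_openness + ctds_skepticism + satisfaction",
    data=df,
).fit()

print(base.rsquared, full.rsquared)   # R^2 at each step
print(full.compare_f_test(base))      # (F-change, p-value, df difference)
```

Standardizing the predictors (z-scores) before fitting the second model would yield coefficients comparable to the β values reported in Table 4.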

2.4. Ethical considerations

The study design and execution were approved by the Ethical Committee of the Universidad Internacional Iberoamericana (UNINI), under certificate N˚ CE-025. All potential participants were presented with an Informed Consent form that included the study's purpose, confidentiality agreement, voluntary participation clause, data management, etc. Agreeing to the Informed Consent was required to allow students to participate in the study.

3. Results

Results indicate that students had an average score of 3.87 (SD = 0.50) on the Attitudes Towards Research Scale-Revised. The mean of the Satisfaction with University Research Courses Scale was 4.04 (SD = 0.71). The most prevalent epistemic orientation was the EOSS-Empiric disposition (M = 4.09; SD = 0.65), followed by EOSS-Rational (M = 3.87; SD = 0.73), with EOSS-Intuitive as the least prevalent disposition (M = 3.40; SD = 0.75). Regarding critical thinking, CTDS-Reflexive Skepticism scores (M = 4.31; SD = 0.69) were higher than CTDS-Critical Openness scores (M = 4.19; SD = 0.54).

Satisfaction with research courses and attitudes towards research were significantly higher for postgraduate students than for undergraduate respondents. Such differences are not only statistically significant (p < 0.01) but also achieve medium effect sizes (d = -0.64). Empiric and rational epistemic orientations are similarly scored by undergraduate and postgraduate students (p > .05); however, intuitive orientation is significantly lower for postgraduate respondents (p = 0.04). Critical thinking disposition subscales do not vary significantly between undergraduate and postgraduate students (p > .05). Table 2 provides a detailed description of mean differences, significance, and effect size.

Table 2

Score comparisons between undergraduate and postgraduate students.

Note. df = 135.
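The undergraduate versus postgraduate comparisons summarized above combine Student's t-test with Cohen's d. A minimal sketch, using hypothetical score vectors rather than the study's data:

```python
import numpy as np
from scipy import stats

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d using the pooled standard deviation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Hypothetical EACIN-R scale means for a few respondents in each group
undergrad = np.array([3.6, 3.8, 3.7, 4.0, 3.5, 3.9])
postgrad = np.array([4.1, 4.3, 4.0, 4.2, 4.4, 4.1])

res = stats.ttest_ind(undergrad, postgrad)   # Student's t-test (equal variances assumed)
print(round(res.statistic, 3), round(res.pvalue, 3), round(cohens_d(undergrad, postgrad), 2))
```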

A correlational analysis determined that academic score is significantly and positively correlated (p < 0.05) with CTDS-Critical Openness, CTDS-Reflexive Skepticism, EOSS-Empiric, EOSS-Rational, satisfaction with research courses, and attitudes towards research. On the other hand, EOSS-Intuitive is inversely related to academic scores and the number of research courses completed. The number of research projects completed was significantly and positively associated with CTDS-Reflexive Skepticism, EOSS-Rational, satisfaction with research courses, and attitudes towards research. Additionally, both rational and empiric orientations correlate positively with critical thinking domains. Attitudes towards research also have positive relationships with EOSS-Rational and EOSS-Empiric, but are inversely related to EOSS-Intuitive; see Table 3.

Table 3

Correlational analysis between educational variables, critical thinking, and epistemic orientation.

Note. Correlation coefficients were calculated through Pearson's r. Significant p-values (<0.05) are presented in bold letters.
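The exploratory correlations summarized in Table 3 amount to Pearson's r with a p-value for every pair of variables. A sketch of that computation, with hypothetical column names and simulated data in place of the study's:

```python
import numpy as np
import pandas as pd
from scipy import stats

def pearson_matrix(data: pd.DataFrame) -> pd.DataFrame:
    """Pearson r and p-value for every pair of numeric columns."""
    cols = list(data.columns)
    rows = []
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            r, p = stats.pearsonr(data[a], data[b])
            rows.append({"pair": f"{a} x {b}", "r": round(r, 3), "p": round(p, 3)})
    return pd.DataFrame(rows)

# Simulated stand-ins for variables such as academic grade and critical openness
rng = np.random.default_rng(1)
demo = pd.DataFrame({
    "grade": rng.normal(84, 7, 137),
    "n_projects": rng.integers(0, 10, 137).astype(float),
    "ctds_openness": rng.normal(4.2, 0.5, 137),
})
print(pearson_matrix(demo))
```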

Furthermore, a hierarchical regression model was used to determine how attitudes towards research are explained by critical thinking, epistemic orientation, and satisfaction with research courses. The base model, containing control variables, had an r² of 0.197, F(5, 131) = 6.411, p < .001. The final model, containing all independent variables, had an r² of 0.474, F(11, 125) = 10.229, p < .001; this constitutes a large effect size (Cohen, 1992), f² = 0.901, with a high power >0.99. The changes between the base and final model are statistically significant, r²Δ = 0.277, FΔ = 3.818, p < 0.001.
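The reported effect size follows directly from the final R², since Cohen's f² = R² / (1 - R²). A quick arithmetic check against the values reported above:

```python
r2_base, r2_full = 0.197, 0.474           # reported R^2 for the control-only and full models
f2 = r2_full / (1 - r2_full)              # Cohen's f^2 for the full model
delta_r2 = r2_full - r2_base              # incremental variance explained by the added predictors
print(round(f2, 3), round(delta_r2, 3))   # 0.901 and 0.277, matching the reported values
```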

While controlling for academic degree, number of research courses completed, number of completed research projects, academic grade, and gender, the following independent variables had a significant effect on attitudes towards research: satisfaction with research courses (β = 0.256, p = 0.001), empiric epistemic orientation (β = 0.254, p = 0.003), intuitive epistemic orientation (β = -0.149, p = 0.039), and critical openness (β = 0.197, p = 0.049). Rational epistemic orientation (β = 0.088, p = 0.32) and skeptic reflexiveness (β = -0.043, p = 0.665) yielded non-significant coefficients (p > 0.05); see Table 4.

Table 4

Regression model explaining students' attitudes towards research.

Note. Significant p-values (<0.05) are presented in bold letters. All Variance Inflation Factor (VIF) scores range from 1.09 to 2.35, indicating no collinearity issues.

4. Discussion

The current research provides evidence suggesting that attitudes towards research are positively and significantly affected by students' satisfaction with research courses, empiric epistemic orientation, and critical openness. On the other hand, an intuitive epistemic orientation has significant detrimental effects on attitudes towards research. Students with high scores in critical thinking domains and in empiric and rational dispositions tend to achieve higher academic grades. Rationality and reflexive skepticism were related to the number of research projects completed by the student, while an intuitive disposition was inversely related to academic scores and the number of research courses completed.

Considering this, our study indicates that students' attitudes towards research could be improved by reinforcing the quality of research methods courses and by promoting empirical epistemic values and critical openness. On the first point, knowledge of research methods is a premise of scientific thinking; therefore, effective research training should promote scientific thinking skills while considering students' epistemic beliefs (Murtonen and Salmento, 2019). Teaching students how to evaluate the credibility and validity of information sources is a key component of promoting critical thinking (Carlson, 1995). Teachers should also promote inquiry-based activities in their classes, including having students create and answer their own questions, reciprocal peer questioning, and questions that require holistic-integrative responses (King, 1995). Such methods should enhance critical thinking and a rational epistemic orientation.

Defining questions and hypotheses, critical thinking, and epistemic understanding are vital to overcoming intuition-based decisions and non-scientific beliefs, leading to an evidence-based approach to problem-solving (Murtonen and Salmento, 2019). An empiric epistemic orientation has significant effects on attitudes towards research. Empiricism is largely driven by observational and experimental reports (American Psychological Association, 2020) and is an essential pillar of scientific research.

Our study provides evidence that an intuitive epistemic orientation has detrimental effects on students' attitudes towards scientific research. This finding is consistent with previous research conducted in a sample of psychotherapists, in which intuitive thinking was related to negative attitudes towards research, as well as to greater resistance to adopting evidence-based treatments in professional practice. Psychotherapists with higher intuitive thinking were more willing to endorse alternative therapies and misconceptions about health (Gaudiano et al., 2011).

Likewise, critical openness was found to be a significant predictor of students' attitudes towards research. Given that critical openness refers to the willingness to explore new or alternative arguments (Sosu, 2013), this result is not surprising. Prior research has also determined that scientists, in contrast to non-scientists, report significantly higher scores on openness (Sato, 2016). Contemplating and evaluating new or alternative arguments is a key component of scientific development, and as such, these skills should be promoted in higher education settings. Teachers play an important part in enhancing students' critical thinking skills by acting as facilitators, emphasizing the analytical processes involved in decision making, and promoting peer discussion, autonomous learning, and dialogical thinking (Reznitskaya and Sternberg, 2012; Sternberg, 1987).

Our findings indicate that the number of research courses completed by students does not influence their attitudes towards research. This finding is consistent with Sizemore and Lewandowski (2009), who concluded that completing research and statistics courses may enhance students' knowledge of the topic without necessarily increasing their interest. Therefore, to better understand students' attitudes towards research, the focus should rest not on the number of research courses students have completed, but rather on their satisfaction with those courses.

Satisfaction with research courses plays an important role in shaping students' attitudes towards research. Thus, such courses should be taught by instructors highly trained in both research and teaching, using updated, relevant, and applicable content that captures students' interest in research methods. This suggestion is in line with previous research indicating that teaching quality and expertise promote students' satisfaction with research courses (Green et al., 2015). Relatedly, teacher engagement has significant effects on student engagement (Cardwell, 2011).

Overall, teachers should explicitly state and demonstrate the relationship between scientific thinking and research skills, as well as their application beyond academic activities. Students should also have clarity about the research process and about what is expected of them as researchers. In this sense, quality feedback, adequate mentorship, peer support, and collaborative learning may foster favorable attitudes towards the research process (Balloo, 2019).

Future studies should consider using qualitative and mixed-methods designs to better understand students' epistemic beliefs, further exploring the meaning of psychology as a science. Additional studies could also focus specifically on postgraduate students and their attitudes towards, and experiences with, research activities such as thesis writing.

The present study is not without limitations. The non-probabilistic selection process and the limited sample size may restrict the representativeness of the results. The nature of the epistemic, scientific, and attitudinal variables also poses an issue because it requires respondents to have acquired a certain level of epistemic maturity (Murtonen and Salmento, 2019); such awareness and metacognitive capabilities might not be adequately developed in all students. Additionally, the relatively low reliability of the EOSS subscales of Intuitivism (α = 0.65) and Empiricism (α = 0.64) is a limitation to consider when interpreting our results. Future studies should also further investigate the psychometric properties of the SURCS. Finally, high scores on the EACIN-R indicate favorable attitudes towards research and low scores indicate unfavorable attitudes; however, the EACIN-R lacks a system for categorizing attitudinal scores through cut-off values (Aldana de Becerra et al., 2020). In this sense, more research is still needed to further validate the scale in university populations.

Declarations

Author contribution statement

Miguel Landa-Blanco and Antonio Cortés-Ramos: Conceived and designed the experiments; Performed the experiments; Analyzed and interpreted the data; Contributed reagents, materials, analysis tools or data; Wrote the paper.

Funding statement

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Data availability statement

Declaration of interests statement

The authors declare no conflict of interest.

Additional information

No additional information is available for this paper.

References

  • Aldana de Becerra G.M., Babativa Novoa D.A., Caraballo Martínez G.J., Rey Anacona C.A. Attitudes towards research scale: evaluation of its psychometric properties in a Colombian sample. CES Psicología. 2020;13(1):89–103. https://revistas.ces.edu.co/index.php/psicologia/article/view/4828/3121
  • Aldana de Becerra G.M., Caraballo Martínez G.J., Babativa Novoa D.A. Escala para medir actitudes hacia la investigación (EACIN): validación de contenido y confiabilidad [Scale for measuring attitudes towards research (EACIN): content validation and reliability]. Aletheia. 2016;8(2).
  • AlGhamdi K.M., Moussa N.A., AlEssa D.S., AlOthimeen N., Al-Saud A.S. Perceptions, attitudes and practices toward research among senior medical students. Saudi Pharmaceut. J. 2014;22(2):113–117.
  • American Psychological Association. Revised Competency Benchmark for Professional Psychology. Competency Initiatives in Professional Psychology; 2011. https://www.apa.org/ed/graduate/competency
  • American Psychological Association. Databases Methodology Field Values. 2020. https://www.apa.org/pubs/databases/training/method-values.pdf
  • Balloo K. Students' difficulties during research methods training acting as potential barriers to their development of scientific thinking. In: Murtonen M., Balloo K., editors. Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills. Palgrave Macmillan; 2019. pp. 107–137.
  • Cardwell M.E. Patterns of Relationships between Teacher Engagement and Student Engagement. St. John Fisher College; 2011. https://core.ac.uk/download/pdf/48615453.pdf
  • Carlson E.R. Evaluating the credibility of sources: a missing link in the teaching of critical thinking. Teach. Psychol. 1995;22(1):39–41.
  • Ciarocco N.J., Lewandowski G.W., Van Volkom M. The impact of a multifaceted approach to teaching research methods on students' attitudes. Teach. Psychol. 2012;40(1):20–25.
  • Cohen J. A power primer. Psychol. Bull. 1992;112(1):155–159.
  • Eagly A.H., Chaiken S. The Psychology of Attitudes. Harcourt Brace Jovanovich College Publishers; 1993. https://psycnet.apa.org/record/1992-98849-000
  • Garrett B.M., Cutting R.L. Magical beliefs and discriminating science from pseudoscience in undergraduate professional students. Heliyon. 2017;3(11).
  • Gaudiano B.A., Brown L.A., Miller I.W. Let your intuition be your guide? Individual differences in the evidence-based practice attitudes of psychotherapists. J. Eval. Clin. Pract. 2011;17(4):628–634.
  • Glasman L.R., Albarracín D. Forming attitudes that predict future behavior: a meta-analysis of the attitude-behavior relation. Psychol. Bull. 2006;132(5):778–822.
  • Gredig D., Bartelsen-Raemy A. Exploring social work students' attitudes toward research courses: predictors of interest in research-related courses among first year students enrolled in a bachelor's programme in Switzerland. Soc. Work. Educ. 2018;37(2):190–208.
  • Green H.J., Hood M., Neumann D.L. Predictors of student satisfaction with university psychology courses: a review. Psychol. Learn. Teach. 2015;14(2):131–146.
  • Gurung R.A.R., Stoa R. A national survey of teaching and learning research methods: important concepts and faculty and student perspectives. Teach. Psychol. 2020;47(2):111–120.
  • Holmes J.D. Undergraduate psychology's scientific identity dilemma: student and instructor interests and attitudes. Teach. Psychol. 2014;41(2):104–109.
  • JASP Team. JASP (Version 0.14) [computer software]. 2020. www.jasp-stats.org
  • King A. Designing the instructional process to enhance critical thinking across the curriculum: inquiring minds really do want to know: using questioning to teach critical thinking. Teach. Psychol. 1995;22(1):13–17.
  • Lambie G.W., Hayes B.G., Griffith C., Limberg D., Mullen P.R. An exploratory investigation of the research self-efficacy, interest in research, and research knowledge of Ph.D. in education students. Innovat. High. Educ. 2014;39(2):139–153.
  • Lawson T.J. Assessing psychological critical thinking as a learning outcome for psychology majors. Teach. Psychol. 1999;26(3):207–226.
  • Lawson T.J., Jordan-Fleming M.K., Bodle J.H. Measuring psychological critical thinking: an update. Teach. Psychol. 2015;42(3):248–253.
  • Meltzoff J., Cooper H. Critical Thinking about Research: Psychology and Related Fields. 2nd ed. American Psychological Association; 2018.
  • Murtonen M., Salmento H. Broadening the theory of scientific thinking for higher education. In: Murtonen M., Balloo K., editors. Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills. Palgrave Macmillan; 2019. pp. 3–29.
  • National Autonomous University of Honduras. Enrollment I PAC. 2021. https://estadistica.unah.edu.hn/dmsdocument/10739-matricula-i-pac-2021-preliminar
  • Reznitskaya A., Sternberg R.J. Teaching students to make wise judgments: the "teaching for wisdom" program. In: Linley P., Joseph S., editors. Positive Psychology in Practice. 2012. pp. 181–196.
  • Richardson L., Lacroix G. What do students think when asked about psychology as a science? Teach. Psychol. 2021;48(1):80–89.
  • Royce J. Epistemic styles, individuality, and world-view. In: Debons A., Cameron W., editors. Perspectives in Information Science. NATO Advanced Study Institutes Series (Series E: Applied Science). Springer; 1975. pp. 259–295.
  • Sato W. Scientists' personality, values, and well-being. SpringerPlus. 2016;5:613.
  • Shargel R., Twiss L. Evidenced-based thinking for scientific thinking. In: Murtonen M., Balloo K., editors. Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills. Palgrave Macmillan; 2019. pp. 79–103.
  • Siemens D.R., Punnen S., Wong J., Kanji N. A survey on the attitudes towards research in medical school. BMC Med. Educ. 2010;10(1):4.
  • Silva Palma E.M., Guedes Gondim S.M., Nova Aguiar C.V. Epistemic orientation short scale: development and validity evidence in a sample of psychotherapists. Paideia. 2018;28(69).
  • Sizemore O.J., Lewandowski G.W. Learning might not equal liking: research methods course changes knowledge but not attitudes. Teach. Psychol. 2009;36(2):90–95.
  • Sosu E.M. The development and psychometric validation of a critical thinking disposition scale. Think. Skills Creativ. 2013;9:107–119.
  • Sternberg R.J. Teaching critical thinking: eight easy ways to fail before you begin. Phi Delta Kappan. 1987;68(6):456–459. http://www.jstor.org/stable/20403395
  • Udompong L., Traiwichitkhun D., Wongwanich S. Causal model of research competency via scientific literacy of teacher and student. Proced. Soci. Behav. Sci. 2014;116:1581–1586.
  • UNAH. Outline of the Undergraduate Degree in Psychology. 2019. https://www.unah.edu.hn/assets/Admisiones/plan-de-estudios/Psicologia-2019.pdf
  • UNAH. Faculty of Social Sciences – Postgraduate Degrees. 2021. https://cienciassociales.unah.edu.hn/postgrados/
  • Veilleux J.C., Chapman K.M. Development of a research methods and statistics concept inventory. Teach. Psychol. 2017;44(3):203–211.
  • Wallmann H.W., Hoover D.L. Research and critical thinking: an important link for exercise science students transitioning to physical therapy. Int. J. Exerc. Sci. 2012;5(2):93–96. https://pubmed.ncbi.nlm.nih.gov/27182378
  • Wilkinson W.K., Migotsky C.P. A factor analytic study of epistemological style inventories. J. Psychol. 1994;128(5):499–516.
