
What Is Critical Thinking and Why Do We Need To Teach It?

Question the world and sort out fact from opinion.


The world is full of information (and misinformation) from books, TV, magazines, newspapers, online articles, social media, and more. Everyone has their own opinions, and these opinions are frequently presented as facts. Making informed choices is more important than ever, and that takes strong critical thinking skills. But what exactly is critical thinking? Why should we teach it to our students? Read on to find out.

What is critical thinking?

Critical Thinking Skills infographic detailing observation, analysis, inference, communication, and problem solving

Source: Indeed

Critical thinking is the ability to examine a subject and develop an informed opinion about it. It’s about asking questions, then looking closely at the answers to form conclusions that are backed by provable facts, not just “gut feelings” and opinion. These skills allow us to confidently navigate a world full of persuasive advertisements, opinions presented as facts, and confusing and contradictory information.

The Foundation for Critical Thinking says, “Critical thinking can be seen as having two components: 1) a set of information and belief-generating and processing skills, and 2) the habit, based on intellectual commitment, of using those skills to guide behavior.”

In other words, good critical thinkers know how to analyze and evaluate information, breaking it down to separate fact from opinion. After a thorough analysis, they feel confident forming their own opinions on a subject. And what’s more, critical thinkers use these skills regularly in their daily lives. Rather than jumping to conclusions or being guided by initial reactions, they’ve formed the habit of applying their critical thinking skills to all new information and topics.

Why is critical thinking so important?

“Education is not the learning of facts but the training of the mind to think.” –Albert Einstein

Imagine you’re shopping for a new car. It’s a big purchase, so you want to do your research thoroughly. There’s a lot of information out there, and it’s up to you to sort through it all.

  • You’ve seen TV commercials for a couple of car models that look really cool and have features you like, such as good gas mileage. Plus, your favorite celebrity drives one of them!
  • The manufacturer’s website has a lot of information, like cost, MPG, and other details. It also mentions that this car has been ranked “best in its class.”
  • Your neighbor down the street used to have this kind of car, but he tells you that he eventually got rid of it because he didn’t think it was comfortable to drive. Plus, he heard that brand of car isn’t as good as it used to be.
  • Three independent organizations have done test-drives and published their findings online. They all agree that the car has good gas mileage and a sleek design. But they each have their own concerns or complaints about the car, including one that found it might not be safe in high winds.

So much information! It’s tempting to just go with your gut and buy the car that looks the coolest (or is the cheapest, or says it has the best gas mileage). Ultimately, though, you know you need to slow down and take your time, or you could wind up making a mistake that costs you thousands of dollars. You need to think critically to make an informed choice.

What does critical thinking look like?

Infographic of 8 scientifically proven strategies for critical thinking

Source: TeachThought

Let’s continue with the car analogy, and apply some critical thinking to the situation.

  • Critical thinkers know they can’t trust TV commercials to help them make smart choices, since every commercial is designed to convince you that its car is the best option.
  • The manufacturer’s website will have some details that are proven facts, but other statements that are hard to prove or clearly just opinions. Which information is factual, and even more important, relevant to your choice?
  • A neighbor’s stories are anecdotal, so they may or may not be useful. They’re the opinions and experiences of just one person and might not be representative of the whole. Can you find other people with similar experiences that point to a pattern?
  • The independent studies could be trustworthy, although it depends on who conducted them and why. Closer analysis might show that the most positive study was conducted by a company hired by the car manufacturer itself. Who conducted each study, and why?

Did you notice all the questions that started to pop up? That’s what critical thinking is about: asking the right questions, and knowing how to find and evaluate the answers to those questions.

Good critical thinkers do this sort of analysis every day, on all sorts of subjects. They seek out proven facts and trusted sources, weigh the options, and then make a choice and form their own opinions. It’s a process that becomes automatic over time; experienced critical thinkers question everything thoughtfully, with purpose. This helps them feel confident that their informed opinions and choices are the right ones for them.

Key Critical Thinking Skills

There’s no official list, but many people use Bloom’s Taxonomy to help lay out the skills kids should develop as they grow up.

A diagram showing Bloom's Taxonomy (Critical Thinking Skills)

Source: Vanderbilt University

Bloom’s Taxonomy is laid out as a pyramid, with foundational skills at the bottom providing a base for more advanced skills higher up. The lowest phase, “Remember,” doesn’t require much critical thinking. These are skills like memorizing math facts, defining vocabulary words, or knowing the main characters and basic plot points of a story.

Higher skills on Bloom’s list incorporate more critical thinking.

True understanding is more than memorization or reciting facts. It’s the difference between a child reciting by rote “one times four is four, two times four is eight, three times four is twelve,” versus recognizing that multiplication is the same as adding a number to itself a certain number of times. When you understand a concept, you can explain how it works to someone else.
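That difference can be shown in a few lines of code. Here is a minimal Python sketch (the function name repeated_add is my own, for illustration) of the idea that multiplication is the same as adding a number to itself a certain number of times:

```python
def repeated_add(a, times):
    """Multiplication as repeated addition: add `a` to itself `times` times."""
    total = 0
    for _ in range(times):
        total += a
    return total

# "Three times four" really means 4 + 4 + 4:
assert repeated_add(4, 3) == 3 * 4 == 12
```

A student who can explain why the loop gives the same answer as 3 × 4 understands the concept rather than just reciting it.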

When you apply your knowledge, you take a concept you’ve already mastered and apply it to new situations. For instance, a student learning to read doesn’t need to memorize every word. Instead, they use their skills in sounding out letters to tackle each new word as they come across it.

When we analyze something, we don’t take it at face value. Analysis requires us to find facts that stand up to inquiry. We put aside personal feelings or beliefs, and instead identify and scrutinize primary sources for information. This is a complex skill, one we hone throughout our entire lives.

Evaluating means reflecting on analyzed information, selecting the most relevant and reliable facts to help us make choices or form opinions. True evaluation requires us to put aside our own biases and accept that there may be other valid points of view, even if we don’t necessarily agree with them.

Finally, critical thinkers are ready to create their own result. They can make a choice, form an opinion, cast a vote, write a thesis, debate a topic, and more. And they can do it with the confidence that comes from approaching the topic critically.

How do you teach critical thinking skills?

The best way to create a future generation of critical thinkers is to encourage them to ask lots of questions. Then, show them how to find the answers by choosing reliable primary sources. Require them to justify their opinions with provable facts, and help them identify bias in themselves and others. Try some of these resources to get started.

  • 5 Critical Thinking Skills Every Kid Needs To Learn (And How To Teach Them)
  • 100+ Critical Thinking Questions for Students To Ask About Anything
  • 10 Tips for Teaching Kids To Be Awesome Critical Thinkers
  • Free Critical Thinking Poster, Rubric, and Assessment Ideas

More Critical Thinking Resources

The answer to “What is critical thinking?” is a complex one. These resources can help you dig more deeply into the concept and hone your own skills.

  • The Foundation for Critical Thinking
  • Cultivating a Critical Thinking Mindset (PDF)
  • Asking the Right Questions: A Guide to Critical Thinking (Browne/Keeley, 2014)

Have more questions about what critical thinking is or how to teach it in your classroom? Join the WeAreTeachers HELPLINE group on Facebook to ask for advice and share ideas!



Critical Thinking: Why Is It So Hard to Teach?

Critical thinking is not a set of skills that can be deployed at any time, in any context. It is a type of thought that even 3-year-olds can engage in – and even trained scientists can fail in.

Willingham, D. T. (2008). Critical thinking: Why is it so hard to teach? Arts Education Policy Review, 109(4), 21–32. https://doi.org/10.3200/AEPR.109.4.21-32


Learning critical thinking skills can only take a student so far. Critical thinking depends on knowing relevant content very well and thinking about it, repeatedly. Here are five strategies, consistent with the research, to help bring critical thinking into the everyday classroom.

On this page:

  • Why is thinking critically so hard?
  • Thinking tends to focus on a problem’s “surface structure”
  • With deep knowledge, thinking can penetrate beyond surface structure
  • Looking for a deep structure helps, but it only takes you so far
  • Is thinking like a scientist easier?
  • Why scientific thinking depends on scientific knowledge

Virtually everyone would agree that a primary, yet insufficiently met, goal of schooling is to enable students to think critically. In layperson’s terms, critical thinking consists of seeing both sides of an issue, being open to new evidence that disconfirms your ideas, reasoning dispassionately, demanding that claims be backed by evidence, deducing and inferring conclusions from available facts, solving problems, and so forth. Then too, there are specific types of critical thinking that are characteristic of different subject matter: That’s what we mean when we refer to “thinking like a scientist” or “thinking like a historian.”

This proper and commonsensical goal has very often been translated into calls to teach “critical thinking skills” and “higher-order thinking skills” and into generic calls for teaching students to make better judgments, reason more logically, and so forth. In a recent survey of human resource officials1 and in testimony delivered just a few months ago before the Senate Finance Committee,2 business leaders have repeatedly exhorted schools to do a better job of teaching students to think critically. And they are not alone. Organizations and initiatives involved in education reform, such as the National Center on Education and the Economy, the American Diploma Project, and the Aspen Institute, have pointed out the need for students to think and/or reason critically. The College Board recently revamped the SAT to better assess students’ critical thinking, and ACT, Inc. offers a test of critical thinking for college students.

These calls are not new. In 1983, A Nation At Risk, a report by the National Commission on Excellence in Education, found that many 17-year-olds did not possess the “ ‘higher-order’ intellectual skills” this country needed. It claimed that nearly 40 percent could not draw inferences from written material and only one-fifth could write a persuasive essay.

Following the release of A Nation At Risk, programs designed to teach students to think critically across the curriculum became extremely popular. By 1990, most states had initiatives designed to encourage educators to teach critical thinking, and one of the most widely used programs, Tactics for Thinking, sold 70,000 teacher guides.3 But, for reasons I’ll explain, the programs were not very effective — and today we still lament students’ lack of critical thinking.

After more than 20 years of lamentation, exhortation, and little improvement, maybe it’s time to ask a fundamental question: Can critical thinking actually be taught? Decades of cognitive research point to a disappointing answer: not really. People who have sought to teach critical thinking have assumed that it is a skill, like riding a bicycle, and that, like other skills, once you learn it, you can apply it in any situation. Research from cognitive science shows that thinking is not that sort of skill. The processes of thinking are intertwined with the content of thought (that is, domain knowledge). Thus, if you remind a student to “look at an issue from multiple perspectives” often enough, he will learn that he ought to do so, but if he doesn’t know much about an issue, he can’t think about it from multiple perspectives. You can teach students maxims about how they ought to think, but without background knowledge and practice, they probably will not be able to implement the advice they memorize. Just as it makes no sense to try to teach factual content without giving students opportunities to practice using it, it also makes no sense to try to teach critical thinking devoid of factual content.

In this article, I will describe the nature of critical thinking, explain why it is so hard to do and to teach, and explore how students acquire a specific type of critical thinking: thinking scientifically. Along the way, we’ll see that critical thinking is not a set of skills that can be deployed at any time, in any context. It is a type of thought that even 3-year-olds can engage in — and even trained scientists can fail in. And it is very much dependent on domain knowledge and practice.

Educators have long noted that school attendance and even academic success are no guarantee that a student will graduate an effective thinker in all situations. There is an odd tendency for rigorous thinking to cling to particular examples or types of problems. Thus, a student may have learned to estimate the answer to a math problem before beginning calculations as a way of checking the accuracy of his answer, but in the chemistry lab, the same student calculates the components of a compound without noticing that his estimates sum to more than 100%. And a student who has learned to thoughtfully discuss the causes of the American Revolution from both the British and American perspectives doesn’t even think to question how the Germans viewed World War II. Why are students able to think critically in one situation, but not in another? The brief answer is: Thought processes are intertwined with what is being thought about. Let’s explore this in depth by looking at a particular kind of critical thinking that has been studied extensively: problem solving.

Imagine a seventh-grade math class immersed in word problems. How is it that students will be able to answer one problem, but not the next, even though mathematically both word problems are the same, that is, they rely on the same mathematical knowledge? Typically, the students are focusing on the scenario that the word problem describes (its surface structure) instead of on the mathematics required to solve it (its deep structure). So even though students have been taught how to solve a particular type of word problem, when the teacher or textbook changes the scenario, students still struggle to apply the solution because they don’t recognize that the problems are mathematically the same.

To understand why the surface structure of a problem is so distracting and, as a result, why it’s so hard to apply familiar solutions to problems that appear new, let’s first consider how you understand what’s being asked when you are given a problem. Anything you hear or read is automatically interpreted in light of what you already know about similar subjects. For example, suppose you read these two sentences: “After years of pressure from the film and television industry, the President has filed a formal complaint with China over what U.S. firms say is copyright infringement. These firms assert that the Chinese government sets stringent trade restrictions for U.S. entertainment products, even as it turns a blind eye to Chinese companies that copy American movies and television shows and sell them on the black market.” Background knowledge not only allows you to comprehend the sentences, it also has a powerful effect as you continue to read because it narrows the interpretations of new text that you will entertain. For example, if you later read the word “Bush,” it would not make you think of a small shrub, nor would you wonder whether it referred to the former President Bush, the rock band, or a term for rural hinterlands. If you read “piracy,” you would not think of eye-patched swabbies shouting “shiver me timbers!” The cognitive system gambles that incoming information will be related to what you’ve just been thinking about. Thus, it significantly narrows the scope of possible interpretations of words, sentences, and ideas. The benefit is that comprehension proceeds faster and more smoothly; the cost is that the deep structure of a problem is harder to recognize.

The narrowing of ideas that occurs while you read (or listen) means that you tend to focus on the surface structure, rather than on the underlying structure of the problem. For example, in one experiment,4 subjects saw a problem like this one:

Members of the West High School Band were hard at work practicing for the annual Homecoming Parade. First they tried marching in rows of 12, but Andrew was left by himself to bring up the rear. Then the director told the band members to march in columns of eight, but Andrew was still left to march alone. Even when the band marched in rows of three, Andrew was left out. Finally, in exasperation, Andrew told the band director that they should march in rows of five in order to have all the rows filled. He was right. Given that there were at least 45 musicians on the field but fewer than 200 musicians, how many students were there in the West High School Band?

Earlier in the experiment, subjects had read four problems along with detailed explanations of how to solve each one, ostensibly to rate them for the clarity of the writing. One of the four problems concerned the number of vegetables to buy for a garden, and it relied on the same type of solution necessary for the band problem: calculation of the least common multiple. Yet, few subjects — just 19 percent — saw that the band problem was similar and that they could use the garden problem solution. Why?

When a student reads a word problem, her mind interprets the problem in light of her prior knowledge, as happened when you read the two sentences about copyrights and China. The difficulty is that the knowledge that seems relevant relates to the surface structure — in this problem, the reader dredges up knowledge about bands, high school, musicians, and so forth. The student is unlikely to read the problem and think of it in terms of its deep structure — using the least common multiple. The surface structure of the problem is overt, but the deep structure of the problem is not. Thus, people fail to use the first problem to help them solve the second: In their minds, the first was about vegetables in a garden and the second was about rows of band marchers.
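The deep structure of the band problem can be written down directly. Here is a minimal Python sketch (the function name is my own) that ignores the marching-band scenario entirely and searches for a number with the stated remainders: rows of 12, 8, or 3 leave one marcher over, while rows of 5 come out even.

```python
def band_sizes(lo=45, hi=200):
    """Search the stated range for band sizes matching the problem's remainders."""
    return [n for n in range(lo, hi)
            if n % 12 == 1    # rows of 12 leave Andrew alone
            and n % 8 == 1    # rows of 8 leave Andrew alone
            and n % 3 == 1    # rows of 3 leave Andrew alone
            and n % 5 == 0]   # rows of 5 are all filled

print(band_sizes())  # -> [145]
```

Seeing that the garden problem reduces to this same search, despite a completely different surface story, is exactly the transfer the experiment measured.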

If knowledge of how to solve a problem never transferred to problems with new surface structures, schooling would be inefficient or even futile — but of course, such transfer does occur. When and why is complex,5 but two factors are especially relevant for educators: familiarity with a problem’s deep structure and the knowledge that one should look for a deep structure. I’ll address each in turn. When one is very familiar with a problem’s deep structure, knowledge about how to solve it transfers well. That familiarity can come from long-term, repeated experience with one problem, or with various manifestations of one type of problem (i.e., many problems that have different surface structures, but the same deep structure). After repeated exposure to either or both, the subject simply perceives the deep structure as part of the problem description. Here’s an example:

A treasure hunter is going to explore a cave up on a hill near a beach. He suspected there might be many paths inside the cave so he was afraid he might get lost. Obviously, he did not have a map of the cave; all he had with him were some common items such as a flashlight and a bag. What could he do to make sure he did not get lost trying to get back out of the cave later?

The solution is to carry some sand with you in the bag, and leave a trail as you go, so you can trace your path back when you’re ready to leave the cave. About 75% of American college students thought of this solution — but only 25% of Chinese students solved it.6 The experimenters suggested that Americans solved it because most grew up hearing the story of Hansel and Gretel, which includes the idea of leaving a trail as you travel to an unknown place in order to find your way back. The experimenters also gave subjects another puzzle based on a common Chinese folk tale, and the percentage of solvers from each culture reversed.

It takes a good deal of practice with a problem type before students know it well enough to immediately recognize its deep structure, irrespective of the surface structure, as Americans did for the Hansel and Gretel problem. American subjects didn’t think of the problem in terms of sand, caves, and treasure; they thought of it in terms of finding something with which to leave a trail. The deep structure of the problem is so well represented in their memory, that they immediately saw that structure when they read the problem.

Now let’s turn to the second factor that aids in transfer despite distracting differences in surface structure — knowing to look for a deep structure. Consider what would happen if I said to a student working on the band problem, “this one is similar to the garden problem.” The student would understand that the problems must share a deep structure and would try to figure out what it is. Students can do something similar without the hint. A student might think “I’m seeing this problem in a math class, so there must be a math formula that will solve this problem.” Then he could scan his memory (or textbook) for candidates, and see if one of them helps. This is an example of what psychologists call metacognition, or regulating one’s thoughts. In the introduction, I mentioned that you can teach students maxims about how they ought to think. Cognitive scientists refer to these maxims as metacognitive strategies. They are little chunks of knowledge — like “look for a problem’s deep structure” or “consider both sides of an issue” — that students can learn and then use to steer their thoughts in more productive directions.

Helping students become better at regulating their thoughts was one of the goals of the critical thinking programs that were popular 20 years ago. These programs are not very effective. Their modest benefit is likely due to teaching students to effectively use metacognitive strategies. Students learn to avoid biases that most of us are prey to when we think, such as settling on the first conclusion that seems reasonable, only seeking evidence that confirms one’s beliefs, ignoring countervailing evidence, overconfidence, and others.7 Thus, a student who has been encouraged many times to see both sides of an issue, for example, is probably more likely to spontaneously think “I should look at both sides of this issue” when working on a problem.

Unfortunately, metacognitive strategies can only take you so far. Although they suggest what you ought to do, they don’t provide the knowledge necessary to implement the strategy. For example, when experimenters told subjects working on the band problem that it was similar to the garden problem, more subjects solved the problem (35% compared to 19% without the hint), but most subjects, even when told what to do, weren’t able to do it. Likewise, you may know that you ought not accept the first reasonable-sounding solution to a problem, but that doesn’t mean you know how to come up with alternative solutions or weigh how reasonable each one is. That requires domain knowledge and practice in putting that knowledge to work.

Since critical thinking relies so heavily on domain knowledge, educators may wonder if thinking critically in a particular domain is easier to learn. The quick answer is yes, it’s a little easier. To understand why, let’s focus on one domain, science, and examine the development of scientific thinking.

Teaching science has been the focus of intensive study for decades, and the research can be usefully categorized into two strands. The first examines how children acquire scientific concepts; for example, how they come to forgo naive conceptions of motion and replace them with an understanding of physics. The second strand is what we would call thinking scientifically, that is, the mental procedures by which science is conducted: developing a model, deriving a hypothesis from the model, designing an experiment to test the hypothesis, gathering data from the experiment, interpreting the data in light of the model, and so forth.† Most researchers believe that scientific thinking is really a subset of reasoning that is not different in kind from other types of reasoning that children and adults do.8 What makes it scientific thinking is knowing when to engage in such reasoning, and having accumulated enough relevant knowledge and spent enough time practicing to do so.

Recognizing when to engage in scientific reasoning is so important because the evidence shows that being able to reason is not enough; children and adults use and fail to use the proper reasoning processes on problems that seem similar. For example, consider a type of reasoning about cause and effect that is very important in science: conditional probabilities. If two things go together, it’s possible that one causes the other. Suppose you start a new medicine and notice that you seem to be getting headaches more often than usual. You would infer that the medication influenced your chances of getting a headache. But it could also be that the medication increases your chances of getting a headache only in certain circumstances or conditions. In conditional probability, the relationship between two things (e.g., medication and headaches) is dependent on a third factor. For example, the medication might increase the probability of a headache only when you’ve had a cup of coffee. The relationship of the medication and headaches is conditional on the presence of coffee.
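The medication example can be made concrete with a toy calculation. The counts below are invented purely for illustration; the point is only that pooling over the third factor (coffee) hides a relationship that conditioning on it reveals:

```python
# Invented daily tallies: days observed and days with a headache,
# under each combination of (took medication, drank coffee).
counts = {
    (True,  True):  (40, 24),
    (True,  False): (40, 4),
    (False, True):  (40, 4),
    (False, False): (40, 4),
}

def p_headache(medication, coffee):
    """Estimate P(headache | medication, coffee) from the tallies."""
    days, headaches = counts[(medication, coffee)]
    return headaches / days

# Conditional on coffee, the medication matters a great deal...
assert p_headache(True, True) == 0.6 and p_headache(False, True) == 0.1
# ...but without coffee, it makes no difference at all.
assert p_headache(True, False) == p_headache(False, False) == 0.1
```

Pooled across coffee, medication days show more headaches overall, so a quick reading blames the medication alone; only conditioning on coffee shows where the effect actually lives.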

Understanding and using conditional probabilities is essential to scientific thinking because it is so important in reasoning about what causes what. But people’s success in thinking this way depends on the particulars of how the question is presented. Studies show that adults sometimes use conditional probabilities successfully,9 but fail to do so with many problems that call for it.10 Even trained scientists are open to pitfalls in reasoning about conditional probabilities (as well as other types of reasoning). Physicians are known to discount or misinterpret new patient data that conflict with a diagnosis they have in mind,11 and Ph.D.-level scientists are prey to faulty reasoning when faced with a problem embedded in an unfamiliar context.12

And yet, young children are sometimes able to reason about conditional probabilities. In one experiment,13 the researchers showed 3-year-olds a box and told them it was a “blicket detector” that would play music if a blicket were placed on top. The child then saw one of two sequences in which blocks are placed on the blicket detector. At the end of the sequence, the child was asked whether each block was a blicket. In other words, the child was to use conditional reasoning to infer which block caused the music to play.

Note that the relationship between each individual block (yellow cube and blue cylinder) and the music is the same in sequences 1 and 2. In either sequence, the child sees the yellow cube associated with music three times, and the blue cylinder associated with the absence of music once and the presence of music twice. What differs between the first and second sequence is the relationship between the blue and yellow blocks, and therefore, the conditional probability of each block being a blicket. Three-year-olds understood the importance of conditional probabilities. For sequence 1, they said the yellow cube was a blicket, but the blue cylinder was not; for sequence 2, they chose equally between the two blocks.
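The logic of the blicket task can be sketched as exhaustive hypothesis elimination. The code below assumes a deterministic detector (music plays iff at least one blicket is on it) and an illustrative reconstruction of sequence 1; the study's actual stimuli may have differed in detail.

```python
from itertools import product

# A deterministic sketch of the "blicket detector" inference: assume the
# detector plays music iff at least one blicket is on it, then enumerate
# which labelings of the two blocks survive an observed trial sequence.
# The trial sequence below is an illustrative reconstruction, not the
# study's exact stimuli.
BLOCKS = ("yellow_cube", "blue_cylinder")

def consistent_labelings(observations):
    """Return every {block: is_blicket} labeling consistent with the trials."""
    survivors = []
    for bits in product([False, True], repeat=len(BLOCKS)):
        labeling = dict(zip(BLOCKS, bits))
        if all(music == any(labeling[b] for b in on) for on, music in observations):
            survivors.append(labeling)
    return survivors

# Assumed sequence 1: cube alone -> music; both together -> music (twice);
# cylinder alone -> no music.
sequence_1 = [
    (("yellow_cube",), True),
    (("yellow_cube", "blue_cylinder"), True),
    (("yellow_cube", "blue_cylinder"), True),
    (("blue_cylinder",), False),
]
survivors = consistent_labelings(sequence_1)
# Exactly one labeling survives: the cube is a blicket, the cylinder is not.
```

Under these assumptions, only one labeling is consistent with sequence 1, matching the children's judgment that the yellow cube is a blicket and the blue cylinder is not.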

This body of studies has been summarized simply: Children are not as dumb as you might think, and adults (even trained scientists) are not as smart as you might think. What’s going on? One issue is that the common conception of critical thinking or scientific thinking (or historical thinking) as a set of skills is not accurate. Critical thinking does not have certain characteristics normally associated with skills — in particular, being able to use that skill at any time. If I told you that I learned to read music, for example, you would expect, correctly, that I could use my new skill (i.e., read music) whenever I wanted. But critical thinking is very different. As we saw in the discussion of conditional probabilities, people can engage in some types of critical thinking without training, but even with extensive training, they will sometimes fail to think critically. This understanding that critical thinking is not a skill is vital. It tells us that teaching students to think critically probably lies in small part in showing them new ways of thinking, and in large part in enabling them to deploy the right type of thinking at the right time.

Returning to our focus on science, we’re ready to address a key question: Can students be taught when to engage in scientific thinking? Sort of. It is easier than trying to teach general critical thinking, but not as easy as we would like. Recall that when we were discussing problem solving, we found that students can learn metacognitive strategies that help them look past the surface structure of a problem and identify its deep structure, thereby getting them a step closer to figuring out a solution. Essentially the same thing can happen with scientific thinking. Students can learn certain metacognitive strategies that will cue them to think scientifically. But, as with problem solving, the metacognitive strategies only tell the students what they should do — they do not provide the knowledge that students need to actually do it. The good news is that within a content area like science, students have more context cues to help them figure out which metacognitive strategy to use, and teachers have a clearer idea of what domain knowledge they must teach to enable students to do what the strategy calls for.

For example, two researchers 14 taught second-, third-, and fourth-graders the scientific concept behind controlling variables; that is, of keeping everything in two comparison conditions the same, except for the one variable that is the focus of investigation. The experimenters gave explicit instruction about this strategy for conducting experiments and then had students practice with a set of materials (e.g., springs) to answer a specific question (e.g., which of these factors determine how far a spring will stretch: length, coil diameter, wire diameter, or weight?). The experimenters found that students not only understood the concept of controlling variables, they were able to apply it seven months later with different materials and a different experimenter, although the older children showed more robust transfer than the younger children. In this case, the students recognized that they were designing an experiment and that cued them to recall the metacognitive strategy, “When I design experiments, I should try to control variables.” Of course, succeeding in controlling all of the relevant variables is another matter; that depends on knowing which variables may matter and how they could vary.
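The control-of-variables strategy the students learned can be sketched as a simple check: a comparison is informative only if exactly one factor differs between setups. The factor names and levels below are invented for illustration.

```python
# A sketch of the control-of-variables strategy taught in the spring study:
# generate two setups identical except for the factor under test, and check
# that a comparison isolates exactly one variable. Factor names and levels
# are invented for illustration.
def controlled_pair(baseline, factor, alternative):
    """Return two setups that differ only in `factor`."""
    varied = dict(baseline)
    varied[factor] = alternative
    return baseline, varied

def is_controlled(setup_a, setup_b):
    """True when exactly one variable differs (an unconfounded comparison)."""
    differing = [k for k in setup_a if setup_a[k] != setup_b[k]]
    return len(differing) == 1

baseline = {"length": "long", "coil_diameter": "wide",
            "wire_diameter": "thin", "weight": "heavy"}
a, b = controlled_pair(baseline, "length", "short")

# `a` vs `b` isolates spring length; a setup that also changed the weight
# would confound the two factors.
confounded = dict(b, weight="light")
```

Comparing `a` with `b` tests length alone; comparing `a` with `confounded` cannot attribute any difference in stretch to a single factor.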

Experts in teaching science recommend that scientific reasoning be taught in the context of rich subject matter knowledge. A committee of prominent science educators brought together by the National Research Council put it plainly: “Teaching content alone is not likely to lead to proficiency in science, nor is engaging in inquiry experiences devoid of meaningful science content.”

The committee drew this conclusion based on evidence that background knowledge is necessary to engage in scientific thinking. For example, knowing that one needs a control group in an experiment is important. Like having two comparison conditions, having a control group in addition to an experimental group helps you focus on the variable you want to study. But knowing that you need a control group is not the same as being able to create one. Since it’s not always possible to have two groups that are exactly alike, knowing which factors can vary between groups and which must not vary is one example of necessary background knowledge. In experiments measuring how quickly subjects can respond, for example, control groups must be matched for age, because age affects response speed, but they need not be perfectly matched for gender.

More formal experimental work verifies that background knowledge is necessary to reason scientifically. For example, consider devising a research hypothesis. One could generate multiple hypotheses for any given situation. Suppose you know that car A gets better gas mileage than car B and you’d like to know why. There are many differences between the cars, so which will you investigate first? Engine size? Tire pressure? A key determinant of the hypothesis you select is plausibility. You won’t choose to investigate a difference between cars A and B that you think is unlikely to contribute to gas mileage (e.g., paint color), but if someone provides a reason to make this factor more plausible (e.g., the way your teenage son’s driving habits changed after he painted his car red), you are more likely to say that this now-plausible factor should be investigated. 16 One’s judgment about the plausibility of a factor being important is based on one’s knowledge of the domain.

Other data indicate that familiarity with the domain makes it easier to juggle different factors simultaneously, which in turn allows you to construct experiments that simultaneously control for more factors. For example, in one experiment, 17 eighth-graders completed two tasks. In one, they were to manipulate conditions in a computer simulation to keep imaginary creatures alive. In the other, they were told that they had been hired by a swimming pool company to evaluate how the surface area of swimming pools was related to the cooling rate of its water. Students were more adept at designing experiments for the first task than the second, which the researchers interpreted as being due to students’ familiarity with the relevant variables. Students are used to thinking about factors that might influence creatures’ health (e.g., food, predators), but have less experience working with factors that might influence water temperature (e.g., volume, surface area). Hence, it is not the case that “controlling variables in an experiment” is a pure process that is not affected by subjects’ knowledge of those variables.

Prior knowledge and beliefs not only influence which hypotheses one chooses to test, they influence how one interprets data from an experiment. In one experiment, 18 undergraduates were evaluated for their knowledge of electrical circuits. Then they participated in three weekly, 1.5-hour sessions during which they designed and conducted experiments using a computer simulation of circuitry, with the goal of learning how circuitry works. The results showed a strong relationship between subjects’ initial knowledge and how much subjects learned in future sessions, in part due to how the subjects interpreted the data from the experiments they had conducted. Subjects who started with more and better integrated knowledge planned more informative experiments and made better use of experimental outcomes.

Other studies have found similar results, and have found that anomalous, or unexpected, outcomes may be particularly important in creating new knowledge, and particularly dependent upon prior knowledge. 19 Data that seem odd because they don’t fit one’s mental model of the phenomenon under investigation are highly informative. They tell you that your understanding is incomplete, and they guide the development of new hypotheses. But you could only recognize the outcome of an experiment as anomalous if you had some expectation of how it would turn out. And that expectation would be based on domain knowledge, as would your ability to create a new hypothesis that takes the anomalous outcome into account.

The idea that scientific thinking must be taught hand in hand with scientific content is further supported by research on scientific problem solving; that is, when students calculate an answer to a textbook-like problem, rather than design their own experiment. A meta-analysis 20 of 40 experiments investigating methods for teaching scientific problem solving showed that effective approaches were those that focused on building complex, integrated knowledge bases as part of problem solving, for example by including exercises like concept mapping. Ineffective approaches focused exclusively on the strategies to be used in problem solving while ignoring the knowledge necessary for the solution.

What do all these studies boil down to? First, critical thinking (as well as scientific thinking and other domain-based thinking) is not a skill. There is not a set of critical thinking skills that can be acquired and deployed regardless of context. Second, there are metacognitive strategies that, once learned, make critical thinking more likely. Third, the ability to think critically (to actually do what the metacognitive strategies call for) depends on domain knowledge and practice. For teachers, the situation is not hopeless, but no one should underestimate the difficulty of teaching students to think critically.




Getting Students Comfortable with Critical Thinking


Three Strategies to Support Critical Thinking





  • Critical thinking is subject-specific. Art critics and football analysts are, arguably, both critical thinkers in their own right. Yet we wouldn't want an art critic to offer color commentary for the Super Bowl or a sportscaster to serve as a docent at the Museum of Modern Art. The same principle applies to academics. As cognitive scientist Daniel Willingham (2007) notes, being able to solve a complex mathematics problem does not transfer to being able to analyze a historical text or construct a well-reasoned essay. We must help students develop distinct critical thinking skills in all subject areas.
  • We must teach critical thinking skills directly. As I've noted previously in this column (Goodwin, 2017), critical thinking skills do not develop via osmosis or incidental exposure to critical thought. A study that compared the effects of providing one group of college students with direct instruction in critical thinking and another with texts that reflected analytical and evaluative thinking found the first group to be far more capable of demonstrating critical thinking afterward (Marin & Halpern, 2011).
  • Mental models are key to critical thinking. In many pursuits, critical thought depends on our ability to use and reflect on mental models, or schema. Research has shown that one of the biggest differences between experts and novices is that experts use well-developed mental models to solve problems by categorizing them, creating a mental representation of them, retrieving strategies for solving them, and reflecting afterward on the validity of their answers (Nokes, Schunn, & Chi, 2010). So, to help students develop critical thinking skills, we must help them develop mental schema for solving problems.

1. Structured problem solving

2. Cognitive writing

3. Guided investigations


Collins, J. L., Lee, J., Fox, J. D., & Madigan, T. P. (2017). Bringing together reading and writing: An experimental study of writing intensive reading comprehension in low-performing urban elementary schools. Reading Research Quarterly, 52(3), 311–332.

Goodwin, B. (2017). Critical thinking won't develop through osmosis. Educational Leadership, 74(5), 80–81.

Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus & Giroux.

Lynch, S., Taymans, J., Watson, W. A., Ochsendorf, R. J., Pyke, C., & Szesze, M. J. (2007). Effectiveness of a highly rated science curriculum unit for students with disabilities in general education classrooms. Exceptional Children, 73(2), 202–223.

Marin, L. M., & Halpern, D. F. (2011). Pedagogy for developing critical thinking in adolescents: Explicit instruction produces greatest gains. Thinking Skills and Creativity, 6, 1–13.

Nokes, T. J., Schunn, C. D., & Chi, M. T. (2010). Problem solving and human expertise. International Encyclopedia of Education, vol. 5, pp. 265–272.

Olson, C. B., Matuchniak, T., Chung, H. Q., Stumpf, R., & Farkas, G. (2017). Reducing achievement gaps in academic writing for Latinos and English Learners in grades 7–12. Journal of Educational Psychology, 109(1), 1–21.

Willingham, D. T. (2007). Critical thinking: Why is it so hard to teach? American Educator, 31, 8–19.


Bryan Goodwin is the president and CEO of McREL International, a Denver-based nonprofit education research and development organization. Goodwin, a former teacher and journalist, has been at McREL for more than 20 years, serving previously as chief operating officer and director of communications and marketing. Goodwin writes a monthly research column for Educational Leadership and presents research findings and insights to audiences across the United States and in Canada, the Middle East, and Australia.



What influences students’ abilities to critically evaluate scientific investigations?

Ashley B. Heim, David Esparza, Michelle K. Smith, and N. G. Holmes

1 Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America

2 Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America

Associated data

All raw data files are available from the Cornell Institute for Social and Economic Research (CISER) data and reproduction archive ( https://archive.ciser.cornell.edu/studies/2881 ).

Abstract

Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of whether the individual evaluation questions are included.
This research offers new insight on the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.

Introduction

Critical thinking and its importance

Critical thinking, defined here as “the ways in which one uses data and evidence to make decisions about what to trust and what to do” [ 1 ], is a foundational learning goal for almost any undergraduate course and can be integrated in many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible [ 2 , 3 ]. Furthermore, critical thinking is consistently ranked as one of the most necessary outcomes of post-secondary education for career advancement by employers [ 4 ]. In the workplace, those with critical thinking skills are more competitive because employers assume they can make evidence-based decisions based on multiple perspectives, keep an open mind, and acknowledge personal limitations [ 5 , 6 ]. Despite the importance of critical thinking skills, there are mixed recommendations on how to elicit and assess critical thinking during and as a result of instruction. In response, here we evaluate the degree to which different critical thinking questions elicit students’ critical thinking skills.

Assessing critical thinking in STEM

Across STEM (i.e., science, technology, engineering, and mathematics) disciplines, several standardized assessments probe critical thinking skills. These assessments focus on aspects of critical thinking and ask students to evaluate experimental methods [ 7 – 11 ], form hypotheses and make predictions [ 12 , 13 ], evaluate data [ 2 , 12 – 14 ], or draw conclusions based on a scenario or figure [ 2 , 12 – 14 ]. Many of these assessments are open-response, so they can be difficult to score, and several are not freely available.

In addition, there is an ongoing debate regarding whether critical thinking is a domain-general or context-specific skill. That is, can someone transfer their critical thinking skills from one domain or context to another (domain-general) or do their critical thinking skills only apply in their domain or context of expertise (context-specific)? Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to a lack of consensus definition of and assessment tools for critical thinking [ 15 , 16 ]. Some argue that critical thinking is domain-general—or what Ennis refers to as the “general approach”—because it is an overlapping skill that people use in various aspects of their lives [ 17 ]. In contrast, others argue that critical thinking must be elicited in a context-specific domain, as prior knowledge is needed to make informed decisions in one’s discipline [ 18 , 19 ]. Current assessments include domain-general components [ 2 , 7 , 8 , 14 , 20 , 21 ], asking students to evaluate, for instance, experiments on the effectiveness of dietary supplements in athletes [ 20 ] and context-specific components, such as to measure students’ abilities to think critically in domains such as neuroscience [ 9 ] and biology [ 10 ].

Others maintain the view that critical thinking is a context-specific skill for the purpose of undergraduate education, but argue that it should be content accessible [ 22 – 24 ], as “thought processes are intertwined with what is being thought about” [ 23 ]. From this viewpoint, the context of the assessment would need to be embedded in a relatively accessible context to assess critical thinking independent of students’ content knowledge. Thus, to effectively elicit critical thinking among students, instructors should use assessments that present students with accessible domain-specific information needed to think deeply about the questions being asked [ 24 , 25 ].

Within the context of STEM, current critical thinking assessments primarily ask students to evaluate a single experimental scenario (e.g., [ 10 , 20 ]), though compare-and-contrast questions about more than one scenario can be a powerful way to elicit critical thinking [ 26 , 27 ]. Generally included in the “Analysis” level of Bloom’s taxonomy [ 28 – 30 ], compare-and-contrast questions encourage students to recognize, distinguish between, and relate features between scenarios and discern relevant patterns or trends, rather than compile lists of important features [ 26 ]. For example, a compare-and-contrast assessment may ask students to compare the hypotheses and research methods used in two different experimental scenarios, instead of having them evaluate the research methods of a single experiment. Alternatively, students may inherently recall and use experimental scenarios based on their prior experiences and knowledge as they evaluate an individual scenario. In addition, evaluating a single experimental scenario individually may act as metacognitive scaffolding [ 31 , 32 ]—a process which “guides students by asking questions about the task or suggesting relevant domain-independent strategies” [ 32 ]—to support students in their compare-and-contrast thinking.

Purpose and research questions

Our primary objective of this study was to better understand what features of assessment questions elicit student critical thinking using two existing instruments in STEM: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). We focused on biology and physics since critical thinking assessments were already available for these disciplines. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time or comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Providing undergraduates with ample opportunities to practice critical thinking skills in the classroom is necessary for evidence-based critical thinking in their future careers and everyday life. While most critical thinking instruments in biology and physics contexts have undergone some form of validation to ensure they are accurately measuring the intended construct, to our knowledge none have explored how different question types influence students’ critical thinking. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to measure cognitive student outcomes and incorporate more effective critical thinking opportunities in the classroom.

Ethics statement

The procedures for this study were approved by the Institutional Review Board of Cornell University (Eco-BLIC: #1904008779; PLIC: #1608006532). Informed consent was obtained by all participating students via online consent forms at the beginning of the study, and students did not receive compensation for participating in this study unless their instructor offered credit for completing the assessment.

Participants and assessment distribution

We administered the Eco-BLIC to undergraduate students across 26 courses at 11 institutions (six doctoral-granting, three Master’s-granting, and two Baccalaureate-granting) in Fall 2020 and Spring 2021 and received 1612 usable responses. Additionally, we administered the PLIC to undergraduate students across 21 courses at 11 institutions (six doctoral-granting, one Master’s-granting, three four-year colleges, and one 2-year college) in Fall 2020 and Spring 2021 and received 1839 usable responses. We recruited participants via convenience sampling by emailing instructors of primarily introductory ecology-focused courses or introductory physics courses who expressed potential interest in implementing our instrument in their course(s). Both instruments were administered online via Qualtrics and students were allowed to complete the assessments outside of class. The demographic distribution of the response data is presented in Table 1 , all of which were self-reported by students. The values presented in this table represent all responses we received.

Instrument description

Question types

Though the content and concepts featured in the Eco-BLIC and PLIC are distinct, both instruments share a similar structure and set of question types. The Eco-BLIC—which was developed using a structure similar to that of the PLIC [ 1 ]—includes two predator-prey scenarios based on relationships between (a) smallmouth bass and mayflies and (b) great-horned owls and house mice. Within each scenario, students are presented with a field-based study and a laboratory-based study focused on a common research question about feeding behaviors of smallmouth bass or house mice, respectively. The prompts for these two Eco-BLIC scenarios are available in S1 and S2 Appendices. The PLIC focuses on two research groups conducting different experiments to test the relationship between oscillation periods of masses hanging on springs [ 1 ]; the prompts for this scenario can be found in S3 Appendix . The descriptive prompts in both the Eco-BLIC and PLIC also include a figure presenting data collected by each research group, from which students are expected to draw conclusions. The research scenarios (e.g., field-based group and lab-based group on the Eco-BLIC) are written so that each group has both strengths and weaknesses in their experimental designs.

After reading the prompt for the first experimental group (Group 1) in each instrument, students are asked to identify possible claims from Group 1’s data (data evaluation questions). Students next evaluate the strengths and weaknesses of various study features for Group 1 (individual evaluation questions). Examples of these individual evaluation questions are in Table 2 . They then suggest next steps the group should pursue (next steps items). Students are then asked to read about the prompt describing the second experimental group’s study (Group 2) and again answer questions about the possible claims, strengths and weaknesses, and next steps of Group 2’s study (data evaluation questions, individual evaluation questions, and next steps items). Once students have independently evaluated Groups 1 and 2, they answer a series of questions to compare the study approaches of Group 1 versus Group 2 (group comparison items). In this study, we focus our analysis on the individual evaluation questions and group comparison items.

The Eco-BLIC examples are derived from the owl/mouse scenario.

Instrument versions

To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.
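The randomized branching and median-time summary described here might be sketched as follows; the version labels and completion times are invented placeholders, not the study's data.

```python
import random
import statistics

# A sketch of the randomized version assignment and median completion-time
# summary described above. Version labels and timings are invented
# placeholders, not the study's data.
def assign_version(rng):
    """Randomly branch a student into one of the two instrument versions."""
    return rng.choice(["individual_plus_comparison", "comparison_only"])

rng = random.Random(0)  # seeded so the sketch is reproducible
assignments = [assign_version(rng) for _ in range(8)]

# Invented completion times (minutes) per version.
times = {
    "individual_plus_comparison": [22.5, 30.1, 27.4],
    "comparison_only": [15.2, 18.8, 16.0],
}
medians = {version: statistics.median(t) for version, t in times.items()}
```

The median is a sensible summary here because online completion times are typically right-skewed by students who leave the assessment open.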

Think-aloud interviews

We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [ 1 ], though these data are not discussed here.

Data analyses

Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1 . For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1 ). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1 ).
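The expected alignment can be expressed as a small mapping from a student's two individual evaluations to their predicted group comparison answer; the response labels below paraphrase the figure's categories rather than quoting the instrument's exact wording.

```python
# A sketch of the idealized alignment described above: if the two question
# types elicit the same judgment, a student's pair of individual evaluations
# (strength vs. weakness for each group) predicts their group comparison
# answer. Response labels paraphrase the figure's categories.
def expected_comparison(group1_strength, group2_strength):
    """Map two individual strength/weakness judgments to a comparison answer."""
    if group1_strength and group2_strength:
        return "both groups highly effective"
    if group1_strength:
        return "Group 1 more effective"
    if group2_strength:
        return "Group 2 more effective"
    return "neither group effective"
```

Deviations from this mapping in the observed data would indicate that the two question types elicit different judgments, which is exactly what the analysis checks.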

[Fig 1: pone.0273337.g001.jpg] The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.

We ran descriptive statistics to summarize student responses and to examine the distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between the two versions of each instrument on the relevant questions. In all of these tests, we used a Bonferroni correction to reduce the chance of false positives arising from multiple comparisons. We generated figures (primarily multi-pie-chart graphs and heat maps) to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All analyses were conducted, and all figures generated, in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.
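As a rough illustration of this testing procedure: the published analyses were run in R, but the same Bonferroni-corrected goodness-of-fit comparison can be sketched in Python with SciPy. The response counts and the number of tests below are invented for the example, not taken from the study.

```python
from scipy.stats import chisquare

# Hypothetical counts of responses to one group-comparison item, in the order:
# (Group 1 more effective, Group 2 more effective, both effective, neither)
version_a = [40, 25, 20, 15]  # version with individual evaluation questions
version_b = [38, 27, 22, 13]  # version with group comparisons only

# Goodness-of-fit: do version B's counts differ from the distribution
# expected under version A's response proportions?
total_b = sum(version_b)
expected = [total_b * count / sum(version_a) for count in version_a]
stat, p = chisquare(f_obs=version_b, f_exp=expected)

# Bonferroni correction: divide the alpha level by the number of tests run.
n_tests = 8  # hypothetical; the paper reports adjusted p-values such as 0.006
alpha_adjusted = 0.05 / n_tests
print(p < alpha_adjusted)  # → False (no significant difference here)
```

With many items tested per scenario, the correction keeps the family-wise false-positive rate near the nominal 0.05.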

Results

We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC in two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2) and subsequently answered a series of questions comparing and contrasting the study approaches of both research groups side-by-side (group comparison items, Table 2). Analyzing the individual evaluation questions, we found that students generally ranked the experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths (Fig 2), evidenced by mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).

Fig 2. Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.

Individual evaluation versus compare-and-contrast evaluation

Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2 ), shown by the large circles in the upper right quadrant across the three experimental scenarios ( Fig 3 ). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features ( Fig 3 ). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.

Fig 3. The x- and y-axes represent students' rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students' responses who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”

Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.

Individual evaluation questions to support compare and contrast thinking

Given that students were more discerning when directly comparing two groups for both the biology and physics experimental scenarios, we next sought to determine whether the individual evaluation questions for Group 1 or Group 2 were necessary to elicit, or helpful in supporting, student critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version only saw the group comparison items. We found that students assigned to both versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions in Fig 4 as heat maps.

Fig 4. The x-axis denotes students' responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups as minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

We ran chi-square goodness-of-fit tests comparing student responses across the two instrument versions and found no significant differences on the Eco-BLIC bass-mayfly scenario (Fig 4A; based on an adjusted p-value of 0.006) or the owl-mouse questions (Fig 4B; based on an adjusted p-value of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on the two versions of the PLIC (Fig 4C; based on an adjusted p-value of 0.0005). The items that students responded to differently (p < 0.0005) across versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). To understand the magnitude of these significant results, we calculated Cramer's C (Vc; [33]), an effect-size measure commonly applied to chi-square goodness-of-fit models. The effect sizes for these three items were small (Vc = 0.11, 0.10, and 0.06, respectively).
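An effect size of this kind can be illustrated with the standard goodness-of-fit form of Cramer's statistic, V = sqrt(chi2 / (n(k - 1))). The paper's exact formulation of Cramer's C follows its cited reference [33], so treat this sketch (with invented numbers) as an approximation of the idea rather than the authors' computation.

```python
import math

def cramers_v_gof(chi2: float, n: int, k: int) -> float:
    """Effect size for a chi-square goodness-of-fit test with k response
    categories and n observations: sqrt(chi2 / (n * (k - 1)))."""
    return math.sqrt(chi2 / (n * (k - 1)))

# Hypothetical values: even a statistically significant chi-square can
# correspond to a small effect at large n.
print(round(cramers_v_gof(chi2=10.9, n=300, k=4), 2))  # → 0.11
```

This is why the three significant PLIC items are still interpreted as practically minor differences.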

The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.

We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.

Discussion

To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when individually evaluating one study at a time versus comparing and contrasting two, and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Students are more discerning when making comparisons

We found that students were more discerning when comparing the two groups on the Eco-BLIC and PLIC than when evaluating each group individually. While students tended to evaluate the study features of each group as strengths when considering the groups independently (Fig 2), there was greater variation in their responses about which group was more effective when they directly compared the two (Fig 3). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting between two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [34–37]. When presented with a single example, students may deem certain study features unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [38]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side rather than in isolation [39, 40]. This result is somewhat surprising, however, as students could have used their existing knowledge of experimental designs as a contrasting case when evaluating each group. Future work, therefore, should evaluate whether experts use their vast knowledge base of experimental studies as discerning contrasts when evaluating each group individually. Such work would help determine whether our results indicate that students lack a sufficient experiment-base to use as contrasts, or whether they simply do not draw on that experiment-base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than just evaluate individual cases.

Individual evaluation questions do not influence answers to compare and contrast questions

We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would then prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [ 38 ], further supporting our findings about the effectiveness of the group comparison questions.

Practical implications

Individual evaluation questions were not effective at engaging students in critical thinking, nor at preparing them for subsequent questions that elicit critical thinking. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by emphasizing comparisons between cases. Additionally, this study raises the question of whether instructors should incorporate more experimental case studies throughout their courses and assessments so that students have a richer experiment-base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors consider providing them with multiple experimental studies (i.e., cases) and asking them to compare and contrast these studies.

Future directions and limitations

When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students' critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [38]. For example, though sample size differs between the experimental scenarios in our instruments, it is a significant feature with implications for other aspects of the research, such as statistical analyses and the behaviors of the animals. Separately, one limitation of our study is that we focused exclusively on experimental-method evaluation questions (i.e., what to trust), and we are unsure whether the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.

As our question schema in the Eco-BLIC and PLIC was designed for introductory biology and physics content, it is unknown how effective this question schema would be for upper-division biology and physics undergraduates, whom we would expect to have more content knowledge and prior experience for making comparisons in their respective disciplines [18, 41]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses in the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to inherently use as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to better align. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors offer students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?

While a single consensus definition of critical thinking does not currently exist [ 15 ], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [ 22 , 23 ]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) through designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge in STEM, would we observe similar patterns in critical thinking across additional STEM disciplines?

Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.

Conclusions

Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking from introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently when individual evaluation questions are provided or removed. These novel findings hold true across both introductory biology and physics, based on student responses to the Eco-BLIC and PLIC, respectively, though there is much more to explore regarding the critical thinking processes of students in other STEM disciplines and at more advanced stages of their education. Undergraduate students in STEM need to think critically for career advancement, and the Eco-BLIC and PLIC are two means of measuring students' critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight into the types of questions that elicit critical thinking, which educators and researchers across disciplines can apply to teach and measure cognitive student outcomes. Specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates' critical thinking.

Supporting information

S1 Appendix. S2 Appendix. S3 Appendix.

Acknowledgments

We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.

Funding Statement

This work was supported by the National Science Foundation under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). NSF: nsf.gov The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability

APA Division 15

Think Critically Before Thinking Critically

The source of information is often as important as the information itself.

Posted February 11, 2020 | Reviewed by Daniel Lyons M.A.

This post is by Jeffrey A. Greene and Brian M. Cartiff of the University of North Carolina at Chapel Hill.

The Internet’s superabundance of information (Lankshear et al., 2000) has led to a “data smog” (Shenk, 1998) of mis- and dis-information (Wardle, 2019). This vast proliferation of dangerous information is particularly concerning given that more and more people primarily get their news online (Fedeli & Matsa, 2018).

To help people cut through the smog, policy-makers, educators, and parents have called for a greater focus on teaching critical thinking in schools. But what is critical thinking, and can we really expect people to engage in such thinking consistently and successfully across the many topics they encounter every day? In short, the answer is no.

Concerns about people’s critical thinking, or lack thereof, extend back to the time of Plato and his stories of Socrates as the gadfly of the Athenian state and marketplace, stinging and questioning people to make them aware of their lazy and complacent thought processes. In the early 20th century, the pragmatic philosopher John Dewey pointed out that American schools were not helping students learn how to think deeply and reflectively about ideas; instead, he argued they overemphasized specific content knowledge.

Dewey claimed that the major aim of schools should be to teach critical thinking, which he defined as the “active, persistent, and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey, 1933, p. 9). Modern scholars including Paul (1992), Facione (1990), and Ennis (1991, 1996) have argued that critical thinking involves dispositions such as being open-minded and intellectually flexible (i.e., being willing to look at ideas from multiple perspectives) and skills such as being able to reflect on ideas and one’s own biases.

Other definitions of critical thinking focus on the “ability to engage in purposeful, self-regulatory judgment” necessary for problem-solving, reasoning, and conceptual understanding (Abrami et al., 2008, p. 1102). Regardless of the definition used, researchers have shown that people struggle to learn how to think critically, particularly when they are taught those skills outside of an academic discipline or setting, such as in “general critical thinking” courses (Abrami et al., 2015; Willingham, 2007).

Why is critical thinking so difficult for people to do well? Perhaps it is because critically evaluating information requires a tremendous amount of prior knowledge and a disposition for questioning the data and oneself, neither of which is easy to acquire. Even relatively knowledgeable people can struggle to think critically. Medical students are prone to start diagnosing themselves with the illnesses they are learning about (i.e., medical student disease; Hunter et al., 1964; Woods et al., 1966). With extensive training and experience, medical students gain knowledge to appropriately contextualize and interpret symptoms and other related health information; that is, they learn to think critically about the evidence to make appropriate diagnoses.

Similarly, the proliferation of medical information sites like WebMD has led people to diagnose themselves in ways similar to medical students (Starcevic & Berle, 2013). However, research has shown that online symptom checkers are accurate only about one-third of the time (Semigran et al., 2015), leading doctors and scholars to recommend that most people avoid using the Internet for researching illness-related information altogether (Doherty-Torstrick et al., 2016). Thus, expecting people to think critically about medical or technical, scientific information is unrealistic because most people are not medical experts; they do not have the appropriate training nor the necessary vast amounts of specific, medical knowledge.

The modern world requires critical thinking about a large variety of topics, ranging from biology (e.g., vaccines) to political science (e.g., constitutional procedures) to psychology (e.g., confirmation bias). Yet research has shown that it is difficult to become an expert in even one area, let alone many (Collins, 2014; Ericsson et al., 2018). So how can we help people successfully deal with all the information they encounter, and often seek out, online and elsewhere?

The answer lies in redefining critical thinking. Good critical thinkers know when they have the disciplinary knowledge necessary to directly evaluate reasoning and evidence (i.e., first-order reasoning; Chinn & Duncan, 2018). Likewise, good critical thinkers have the self-knowledge and metacognitive skills to know when they do not possess the necessary knowledge, skills, or training to directly evaluate the evidence, and instead should shift to determining which experts or sources to believe about the topic (i.e., second-order reasoning; Chinn & Duncan, 2018).


Thus, good critical thinking sometimes requires only first-order reasoning, but more often it requires both the metacognitive skills to determine when second-order reasoning is needed instead (Barzilai & Chinn, 2018) and the skills to identify reliable sources (Brante & Strømsø, 2018; Greene, 2016). Second-order reasoning skills can be taught and learned. As but one example, the Stanford History Education Group has developed a Civic Online Reasoning website with tools and curricula.

In sum, many modern scholars, employers, policymakers, and educators (e.g., Tsui, 2002) agree with Dewey that critical thinking should be a “fundamental aim and an overriding ideal of education” (Bailin & Siegel, 2003, p. 188). However, the “data smog” created by the vast amounts of often contradictory information found on the Internet calls for new views of what critical thinking involves. If people happen to have the disciplinary expertise, background knowledge, and skills to competently evaluate information and evidence about a particular topic, then they can engage first-order reasoning, which includes enacting the dispositions and cognitive skills that many critical thinking scholars have discussed in the past.

At the same time, when people do not possess such knowledge and skills, which describes most of us much of the time, apt critical thinking would involve realizing the need to switch to second-order reasoning: comparing and evaluating the sources of the information using these same dispositions and skills (Barzilai & Chinn, 2018; Wineburg & McGrew, 2017). Thus, people should think critically about thinking critically, and in many cases, evaluate the sources of information rather than the information itself.

Abrami, P. C., Bernard, R. M., Borokhovski, E., Waddington, D. I., Wade, C. A., & Persson, T. (2015). Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research, 85 (2), 275-314. https://doi.org/10.3102/0034654314551063

Bailin, S., & Siegel, H. (2003). Critical thinking. In N. Blake, P. Smeyers, R. Smith, & P. Standish (Eds.), The Blackwell guide to the philosophy of education (pp. 181–193). Oxford, UK: Blackwell.

Barzilai, S., & Chinn, C. A. (2018). On the goals of epistemic education: Promoting apt epistemic performance. Journal of the Learning Sciences, 27 (3), 353–389. https://doi.org/10.1080/10508406.2017.1392968

Brante, E. W., & Strømsø, H. I. (2018). Sourcing in text comprehension: A review of interventions targeting sourcing skills. Educational Psychology Review, 30 (3), 773-799.

Chinn, C. A., & Duncan, R. G. (2018). What is the value of general knowledge of scientific reasoning? In K. Engelmann, F. Fischer, J. Osborne, & C. A. Chinn (Eds.), Scientific reasoning and argumentation: The role of domain-specific and domain-general knowledge (pp. 460-478). New York, NY: Routledge.

Collins, H. (2014). Are we all scientific experts now? Cambridge, UK: Polity.

Dewey, J. (1933). How we think: A restatement of the relation of reflective thinking to the educative process. Boston, MA: D.C. Heath and company.

Doherty-Torstrick, E. R., Walton, K. E., & Fallon, B. A. (2016). Cyberchondria: Parsing health anxiety from online behavior. Psychosomatics, 57 (4), 390–400. https://doi.org/10/ggcm5z

Ennis, R. H. (1991). Critical thinking: A streamlined conception. Teaching Philosophy, 14 (1), 5-24. https://doi.org/10.5840/teachphil19911412

Ennis, R. H. (1996). Critical thinking dispositions: Their nature and assessability. Informal Logic, 18 (2-3), 165-182. https://doi.org/10.22329/il.v18i2.2378

Ericsson, K. A., Hoffman, R. R., Kozbelt, A., & Williams, A. M. (Eds.). (2018). The Cambridge handbook of expertise and expert performance. Cambridge, UK: Cambridge University Press.

Facione, P. A. (1990). The Delphi report: Committee on pre-college philosophy. Millbrae, CA: California Academic Press.

Fedeli, S., & Matsa, K. E. (2018, July 17). Use of mobile devices for news continues to grow, outpacing desktops and laptops. Retrieved from https://www.pewresearch.org/fact-tank/2018/07/17/use-of-mobile-devices-…

Greene, J. A. (2016). Interacting epistemic systems within and beyond the classroom. In J. A. Greene, W. A. Sandoval, & I. Bråten (Eds.). Handbook of epistemic cognition (pp. 265-278). New York: Routledge.

Hunter, R. C. A., Lohrenz, J. G., & Schwartzman, A. E. (1964). Nosophobia and hypochondriasis in medical students. The Journal of Nervous and Mental Disease, 139 (2), 147-152. https://doi.org/10.1097/00005053-196408000-00008

Lankshear, C., Peters, M., & Knobel, M. (2000). Information, knowledge and learning: Some issues facing epistemology and education in a digital age. Journal of the Philosophy of Education, 34 (1), 17–39. https://doi.org/10/bkn52d

Paul, R. (1992). Critical thinking: What every person needs to survive in a rapidly changing world (2nd edition). Rohnert Park, CA: Foundation for Critical Thinking.

Semigran, H. L., Linder, J. A., Gidengil, C., & Mehrotra, A. (2015). Evaluation of symptom checkers for self diagnosis and triage: Audit study. The BMJ , h3480. https://doi.org/10/gb3sw7

Shenk, D. (1997). Data smog: Surviving the information glut . San Francisco, CA: Harper Edge.

Starcevic, V., & Berle, D. (2013). Cyberchondria: Towards a better understanding of excessive health-related Internet use. Expert Review of Neurotherapeutics, 13 (2), 205–213. https://doi.org/10/f4pknn

Tsui, L. (2002). Fostering critical thinking through effective pedagogy: Evidence from four institutional case studies. Journal of Higher Education, 73 (6), 740–763. https://doi.org/10.1080/00221546.2002.11777179

Wardle, C. (2019, September). Misinformation has created a new world disorder. Scientific American , 88-93.

Willingham, D. T. (2007). Critical thinking: Why is it so hard to teach? American Educator , 8-19.

Wineburg, S., & McGrew, S. (2017). Lateral reading: Reading less and learning more when evaluating digital information. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3048994

Woods, S. M., Natterson, J., & Silverman, J. (1966). Medical students’ disease: Hypochondriasis in medical education. Journal of Medical Education, 41 (8), 785-790. https://doi.org/10.1097/00001888-196608000-00006

The American Psychological Association’s Division 15 is a global association of educational psychologists.


Teaching + Learning

Helpful links:

  • Critical Thinking: Why Is It So Hard To Teach?

© 2024 Teaching + Learning. © Humber College - Teaching + Learning


Richard Kingston

Faculty & program area:

Faculty of Business; Business Management, Financial Services Diploma and Financial Planning Postgraduate Programs

What is your background?

I have an MBA from the University of Western and a BA (Hons.) from Queen’s University. I spent 25 years in the financial services industry in administrative banking, consumer and commercial credit and investment brokerage. I first arrived at Humber in August 2003 as a partial-load Professor in the Business School and in August 2006 I became a full-time Professor and Program Coordinator for the Business Management – Financial Services Diploma program. Later, Peter Madott and I developed the Financial Planning Postgraduate program which was added to my portfolio.

How have you enriched the experiences of your learners?

Since COVID-19 impacted the way in which we teach and learn, I have been conducting my classes as close to the live, in-class model as possible. Some of the activities I have integrated into my classes have been:

  • Case study group work and role plays on Blackboard for BMFS 100 – Interpersonal Communications.
  • Mock interviews on Blackboard for WORK 5002 – Career Development.
  • Pre-screening interviews on Blackboard and MS Teams for my Placement courses between financial services industry recruiters and my graduating students in both the Financial Services Diploma and Financial Planning Graduate Certificate programs.

Before our circumstances changed due to COVID-19, I improved the BMFS (Business Management – Financial Services diploma) program by adding Professional Financial Planning courses to the program and having financial services industry recruiters come to the placement classes. These recruiters conduct pre-screening interviews with my students, giving them the valuable experience of going through this process and, in many cases, leading to permanent jobs for them upon graduation.

I also established the Financial Planning Graduate Certificate program; doing so made Humber an Authorized Education Institution of the licensing body FP Canada. This program also provides graduating students with pre-screening interviews by recruiters.

Why have you incorporated these experiential learning opportunities for your learners?

My primary purpose in doing the items above is to make it easier for my students to get meaningful careers in the financial services industry. The measures mentioned above benefit my students in the following ways:

  • Mock Interviews – give students more confidence to succeed in the real interviews in their final semester.
  • Role Plays – in which my students portray a financial adviser and a client – mimic real-life encounters with clients and how to handle various situations. They also reduce students’ nervousness once completed, giving them more confidence.
  • Case Study Groups – provide a format for students to exercise and develop their critical thinking skills in submitting their analysis and recommendations for the case. They also develop students’ ability to work effectively in groups (a key skill required by the financial services industry).
  • Pre-screening Interviews – allow students to demonstrate for real recruiters the knowledge, skills, and personalities that will determine their success in obtaining meaningful employment upon graduation.

Amanda Baskwill

Katie Billard

Patricia Belli

Exploring the impact of integrative learning.

In the summer of 2015, the MT faculty team discussed concerns with students’ demonstration of complex competencies involving critical thinking, especially in the clinical environment.  In particular, there was a concern that students were unable to use previously acquired knowledge and skills in new and complicated environments.  This desire to improve critical thinking, and more broadly capability, in students and graduates stimulated a change in the approach to learning within the MT program.

It was decided that an authentic and integrated approach to evaluation was needed to reinforce curriculum change.  In response to this decision, integrative learning activities (ILAs) were created.  In the Fall 2017 semester, there were two ILAs, one to introduce students to the evaluation and a second to reinforce the importance of the foundational courses and of the integration of concepts to enhance learning and future patient care.

This project attempted to answer the following:

  • What is the impact of the Integrative Learning Activities (ILAs) in Semester 1 of the Massage Therapy Program on student engagement, perception of and attitudes toward the ILAs, capability, and integration of concepts?

Sandra Secord

Margot Rykhoff

Nursing students’ perceptions of simulation gaming and high-fidelity patient simulation.

High fidelity patient simulation (HFPS) is widely used in nursing curricula as an alternative to clinical placement and/or a targeted learning experience. Efficacy of HFPS-based learning is measured by students’ levels of confidence and competence (Ashley, 2014). Although HFPS is recognized as an effective learning environment, student formative feedback often reports negative feelings such as being judged, anxiety, and fear. Student perceptions of psychological safety and self-confidence have a direct impact on learning (Kang & Min, 2019; Turner & Harder, 2018). The use of simulation gaming is a novel pedagogical approach in nursing education. The limited literature on simulation gaming suggests that the team approach to learning, inherent in gaming, supports student perceptions of self-confidence and improved learning outcomes.

Nicola Winstanley

Nicola Winstanley

Anne Zbitnew

Anne Zbitnew

I did it; I just didn’t hand it in.

Currently there is little literature on homework completion in the post-secondary environment, and what there is does not distinguish clearly between homework as non-graded study and homework as completion of assignments for assessment. Our research looks to fill this gap. According to the Higher Education Quality Council of Ontario’s website, retention rates for college students in Ontario hovered around 65% for the ten years between 2004 and 2014.

In the Media Foundation program, the investigators noticed that some students attend class regularly but do not hand in assignments worth a portion of their grade, or hand assignments in so late that they receive a significant grade penalty (sometimes zero). Anecdotally, students report that they do not really know why they have not completed assignments, or, if they have completed them, why they did not hand them in. Some students suggest they are too busy, did not understand the assignment and did not think to seek help, or did not feel that their work was good enough to submit. This study investigated the root cause, or causes, of this behavior, which in many cases leads to course (and therefore program) failure.

Susan Bessonette

David Chandross

Gamification of a simulated care path for massage therapy students.

Evolving patient simulations are those in which a student manages a virtual patient over a period of time to develop the capacity to identify and respond to emerging clinical problems after an injury. Meta-analyses of student cognition, affect, and learning outcomes have shown that skill acquisition gains are higher in gamified learning than in conventional instruction (Lamb et al., 2018).

The rationale for pursuing simulation gamification is based on the increasing need for alignment between pedagogical practice and long-term, job-skill-related outcomes. The document “A New Vision for Higher Education” (http://www.collegesontario.org/policy-positions/position-papers/new-vision-for-higher-education.pdf) cites in section 3 that “even with strong immigration levels the Conference Board of Canada estimates Ontario will face a shortage of more than 360,000 skilled employees by 2025 and a shortage of more than 560,000 employees by 2030” (p. 12 of the source document). This report explicitly states that the college system must re-align to meet the new economy. Shortages of apprenticeships are of great cost to the economy. The pedagogical model we are seeking to develop here will provide proof of concept for the integration of serious games and massage therapy science content. This method of serious game integration can then be applied across many subject areas, both within the RMT program and across Humber.

Simulation gamification is part of the solution to this problem, allowing advances in the use of simulations for training in place of traditional apprenticeships and face-to-face job experience for entry into the professions. This study created a simulation training game that can be adapted across wider user groups, both internally and externally, and used alongside immersive technology to enhance teaching and learning. It aligns well with the virtual reality work occurring at Humber, where the role of immersive simulation is being explored. The focus of this work is on integrative, cross-course learning, where clinical and basic scientific information needs to be aligned.

Vidya Rampersad

Using technology to build relationships, share ideas, & collaborate.

One of the goals of our Early Childhood Education Program is preparing exiting students for success in various career pathways. Embedded within our courses are team-building skills, communication skills, and building professional relationships.

Collaboration is a fundamental part of program and course goals and generic skills, and is integral to preparing students for success in the field of Early Childhood Education. Experiences embedded within assignments offer opportunities for students to develop and refine collaboration skills while networking and building stronger relationships. Collaboration skills, although essential, can pose many challenges, from course content requirements to faculty expectations of what students can do (Tucker, 2012).

  • How do students perceive their skills in navigating online platforms, specifically Google Docs and Mindmeister, for collaboration, engagement, and learning?
  • Have students’ perceptions and attitudes about using online platforms to complete group assignments changed after they were introduced to Google Docs and Mindmeister?
  • What types of strategies do students use to complete group assignments?
  • How do individual group members function or demonstrate collaboration skills, engagement, and learning of assignment content?

Tammy Cameron

Nazlin Hirji

Adriana Salvia

Exam proctor training.

Since 2013, the investigators have administered two different invigilation-training methods.  The initial training model invited newly hired exam invigilators to an in-person discussion of the Test and Exam Policy; this was followed by a second training model, consisting of an in-person training session, and drawing on both case-based and discussion approaches to instruction. Each invigilator was supplied with a mind map: a one page, at-a-glance document that reminds invigilators of the most common circumstances they are likely to encounter during the examination process (e.g. how to handle students’ electronics, personal items, or breaches of Academic Integrity).

While resources have been built to assist invigilators in identifying academic dishonesty, invigilators have stated in feedback discussion that they are uncertain when dealing with situations that are not specifically outlined in them.

This research study aimed to identify an effective training method for developing stringent and active invigilation skills, and to address the processes needed to instill both knowledge of, and confidence in complying with, the academic integrity policies outlined in the School of Health Sciences’ Test and Exam Policy.

Invigilators went through a training process whose effectiveness was assessed using a combination of objective, subjective, and self-evaluation measures (as outlined in the measures section).

Sylwia Wojtalik

Janet Jeffery

Nursing students, cultural humility & sickle cell disease.

Healthcare professionals are at the forefront of care delivery, yet ignorance of sickle cell disease (SCD) among them has translated into horrendous experiences, characterized by stigmatization, for patients in healthcare settings (Haywood et al., 2009). Patients with SCD face the greatest risk in healthcare settings due to the lack of appropriate and timely care. They often state that they know more than the health professionals who are providing their care (Hill, 1994; Anionwu & Atkin, 2002). Minimal knowledge about SCD among healthcare professionals has led to fatal consequences.

For the past five years, practical nursing students and other students in the School of Health Sciences have attended a Sickle Cell Symposium. Here, expert healthcare providers, patients living with Sickle Cell Disease, and members of the sickle cell community share raw experiences and invaluable knowledge with the students.

This project evaluated whether students’ knowledge, attitudes, and perceptions of patients with Sickle Cell Disease (SCD) changed after attending this symposium.

Shawn Richards

Measuring post-secondary students’ engagement and performance with learning technologies.

Many students at the collegiate level are not engaged in instructor-facilitated courses. Some recent studies suggest that large numbers of college students appear to be either academically or socially disengaged, or both (Kuh, 2002). It is the opinion of the investigator that there are too many distractions, from social media to the proximity of friends or colleagues, among other extraneous factors. Instructors try to solve this issue by using a plethora of tools and aids to garner students’ attention. However, while these technologies can be harnessed for positive educational outcomes, they can also distract and impair performance when students use them for purposes unrelated to the lesson (Darling-Hammond, 2014). Pairing internet-connected classroom tools on computers and mobile devices with classroom teachers who provide real-time support and encouragement boosts engagement and produces significant gains in student achievement. In this highly distracted era, it is therefore imperative to have a tool that incorporates, manages, and leads lessons, enhancing engagement by broadcasting content and interactive learning elements to students’ devices in real time. This project utilized NearPod (https://nearpod.com/) as such a tool.

  • Does having integrated learning tools along with instructor-led interactive technologies increase the levels of engagement and improve performance in college-based courses?

Rossie Kadiyska

Vladimira Steffek

Writing Humber fashion-specific cases.

To date, most educational institutions focus on delivering the classroom experience through traditional models such as lectures, debates, and exercises, without a focus on innovative thinking. However, for students to be competitive in their future endeavors, not only their analytical skills but also their creative and innovative thinking must be honed. The current evidence suggests that “the creative competence (…) is one way to help prepare students for an uncertain future” (Beghetto, 2010, p. 447). The study explored the question of how to better prepare students for the practical world while increasing their innovative thinking, creativity, engagement, and satisfaction with the learning experience.



Blackboard Content in Mass Media Classes

Humber, like other post-secondary institutions, provides an online learning management system for students and requires professors to populate the sites with course-related content. Humber has embraced concepts such as Learner-centered Learning, Design for Differentiated Learning, and the Flipped Classroom. The institution itself advises faculty that students are technically savvy, use a variety of technologies including smartphones, and learn through a variety of media and techniques.

Students who are instructed by a professor that preparatory materials will be posted in advance of a class, and that using those materials is necessary to be ready for classroom discussion of course content, are largely coming to class without having used them. This lack of preparation leaves students unable to engage with the related content, delays the start of class, disrupts the flow of the planned session, and likely makes the professor seem unprepared, because the professor expects to have students pursue the concepts introduced in the advance materials, not to introduce those materials.

This research attempted to answer the following:

  • Given an expectation that they are required to do some preparatory work before a class, how do students prepare?
  • When offered a selected piece or set of preparation materials, what types of media are students most likely to engage with as a means of preparation for class?
  • Do students who engage with preparation materials before class have more successful outcomes in that course than students who do not?

Nadine Ijaz

Expanding research capacity in TCM practitioner education using a pedagogical focus on model validity.

Expansion of research capacity among practitioners of traditional and complementary medicine (T&CM) — such as Chinese medicine and acupuncture — has been identified as a key priority by researchers, clinicians, and educators in the field. Existing research in this area has focused on promoting an “evidence-based approach to [T&CM] medical education … including use of the evidence pyramid”. However, scholars have pointed to the limitations of evidence-based medicine (EBM)’s scientific conceptualizations with reference to T&CM. For example, whereas the classical randomized controlled trial (RCT) was designed to study the therapeutic impacts of standard, single-constituent pharmaceuticals on specific biological markers, T&CM therapies are typically complex (multi-modal), individualized, and evaluated in relation to patient-reported outcomes. Previous studies of T&CM professional trainees enrolled in research literacy courses have pointed to this methodological incompatibility as a barrier to practitioner research engagement.

Over the last two decades, an international group of T&CM researchers has undertaken to develop modifications to conventional clinical research methods that represent a better ‘fit’ with the interventions under study. These methods, termed ‘whole systems research,’ emphasize model validity: the principle of compatibility between intervention and research design. The present author’s scoping review of these research methods (currently in press) represents a first comprehensive collation of these approaches, analyzing several dozen clinical whole systems research exemplars. It is hypothesized that an emphasis on whole systems research methods within a postsecondary course for T&CM professional trainees may enhance research engagement beyond what occurs with a pedagogy that takes a less critical approach to EBM.

How does a model-validity-focused post-secondary course aimed at increasing research literacy among traditional Chinese medicine professional trainees impact student research engagement?

Aiden Dearing

VR assignment collaboration.

Numerous studies have looked at issues regarding assessment and effectiveness of collaborative group work (King 2005), and there have been studies of students collaborating over distance (Smallwood 2017). We have found studies about collaboration between journalism students and computer science students (Pearson 2017) but this study would look at collaboration between journalism and video game students.

As journalism continues to adopt new technologies for the production and distribution of content, journalism education can fail to exploit the potential of new technology (Angus, 2015). When teaching Humber students about the potential of virtual reality, instructors don’t have the time or resources to teach the technical coding skills required to build immersive experiences. At the same time, our Game Programming students need exposure to non-traditional ways of using game technology and design skills. This study will look at a collaboration between these two programs at Humber, in which journalism and video game design students collaborate on building a VR tour experience.

Matthew Harris

Reducing student distraction in class.

Personal electronic devices are ubiquitous in today’s classrooms. However, with the added usefulness of these devices come concerns that they are being used in ways that inhibit learning. Some instructors have gone so far as to ban all personal electronic devices in their classrooms to prevent these negative outcomes (Green, 2016). Various studies support these concerns. When tested directly, Wood et al. (2012) showed that students who used Facebook during a lecture received lower marks on a related exam than their undistracted peers. Mueller and Oppenheimer’s (2014) experiment found better performance among students who hand-wrote their notes than among those who typed them on laptops. One study (Sana, Weston & Cepeda, 2013) showed that students who could merely see an off-task laptop had their learning negatively impacted. But while banning personal electronic devices can seem like the obvious choice, technology is such an integrated part of many students’ lives that a ban can be a challenge. Tindell and Bohlander (2012) found that nearly nine in ten college students had texted in at least one of their classes, and that nearly two thirds felt they should be allowed to do so. Most college and university students own laptops (Dahlstrom, Walker & Morgan, 2013), and they self-report spending nearly half of their waking life with a laptop in their lap (Kay & Lauricella, 2014). For these students, laptops can seem like an indispensable tool, or a part of their identity. Some studies also show that laptops can improve attention and engagement for some students when used in a positive way (Samson, 2010). What would be most helpful is for students to develop habits around personal electronic devices that reinforce learning in class.

Some researchers have suggested that teaching students metacognitive practices might help with this issue (Aagaard, 2015), or that de-normalizing negative behaviours among a peer group (Taneja et al., 2014) might change students’ habits.

This project aimed to examine students’ current working habits with personal electronic devices in a series of focus groups.  A future second phase of the project will use the insights gathered to develop test interventions for the English classroom.

Maria Racanelli

Group micro venture project.

Entrepreneurship, BUS 3500, examines current theories and practices of entrepreneurship. Students are introduced to the concepts of new venture creation and small business management. It focuses on the recognition and appreciation of entrepreneurial skills, resource and environmental analysis, sources of venture capital, business planning, the e-environment, strategic planning and franchising. This course helps students appreciate the challenges involved in deciding to create a new venture and the steps involved in starting a new firm.

Existing research tests the integrated nature of experiential learning in entrepreneurship education (Liang, et al., 2016). It contributes to theoretical and practical development of entrepreneurship education. Liang, et al. (2016) designed and implemented an effective assessment instrument as well as a strategic evaluation framework that could be broadly applied in other institutions. Their research explained how experiential learning influences each student’s learning expectations and learning outcomes, and it explored outcomes and effectiveness of experiential learning curriculum. They also linked students’ reflections of learning expectations and outcomes to reveal and stimulate new ideas and “new opportunities to adopt experiential learning in entrepreneurship education across disciplines” (p. 125).

The primary instrument of experiential learning in the course is the experiential group project, Group Micro Venture project. Students completed the project, and changes to entrepreneurship skills and attitudes towards entrepreneurship were measured.

Breadth v. Depth: Which is the More Meaningful Learning Experience?

In the recent literature, depth and breadth are associated with acquiring a broad general education, writing clearly and effectively, and contributing to the welfare of communities (Coker, et al., 2017).

Depth is associated with higher order thinking (synthesis and application), as well as overall educational experience. Breadth (number of different types of learning experiences) was associated with a broad general education. Overall, in a five-year study of 2,000 American graduating college students, the authors found that both learning depth (amount of time commitment) and breadth (number of different types of experiences) are valuable and lead to additional learning gains in a range of areas. Both depth and breadth were positively associated with acquiring a broad general education, writing clearly and effectively (Coker, et al., 2017).

The Strategic Management course, MGMT 4502, is characterized by both its breadth and its depth. More specifically, the breadth consists of 15 chapters of high-level content. The depth is characterized by three major assignments: two afford students the opportunity to research an organization using the case teaching methodology, while the third is a research project.

This research attempted to navigate the tension between the breadth of course content and the depth of key strategy practices, and explore the constructs to measure learning experience.

Attendance at Orientation and Academic Success

This project is concerned with increasing students’ levels of engagement and diploma program completion (student retention).

According to the recent Humber College Business Administration review, we are challenged by limited information regarding student retention in the programs. In addition, according to many studies, student engagement can be improved at earlier stages of the academic journey, beginning with Orientation.

The focus of this study is to deepen our understanding of how attending the Orientation Session and “meet the faculty” sessions can help develop preventative methods for student engagement and retention. It is anticipated that such initiatives could improve engagement and retention, increase graduation rates, and improve student success.

Jennifer Winfield

Jessica Freitag

The impact of group-writing on quality.

In writing classrooms, especially second language (L2) settings, the use of pair work or small group work to complete tasks is a popular approach. This approach is rooted in Vygotsky’s social constructivism theory (1978), which posits that human development happens through social interaction. One common notion is that when a novice and an expert collaborate, the expert assists the novice as the novice gains independence at mastering a skill. The expert’s support is gradually lessened as the novice’s proficiency develops. In education, this notion is commonly referred to as scaffolding. Scaffolding naturally occurs when students work together in pairs or small groups.

This project aimed to add to this body of research by investigating the impact of group writing on the quality of the written product in our teaching context: ESOL 100: College Reading and Writing Skills: ESOL, a first-year composition course that second language learners complete as part of their diploma and certificate studies at Humber College. The course focuses on two types of written products: a summary and a critical analysis of a textual argument. This project used two ways to measure the impact of collaborative writing on the quality of these written products. First, students’ ability to meet the criteria for each written product was measured through the completion of a task rubric (task instructions and rubrics are included in Appendices A-D). Second, students’ grammatical accuracy was measured by calculating the ratio of error-free clauses relative to all clauses.
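The grammatical-accuracy measure described above is a simple proportion of error-free clauses to total clauses. A minimal sketch of that calculation (the clause counts below are illustrative, not data from the study):

```python
def error_free_ratio(error_free_clauses: int, total_clauses: int) -> float:
    """Return the ratio of error-free clauses to all clauses in a writing sample."""
    if total_clauses <= 0:
        raise ValueError("sample must contain at least one clause")
    return error_free_clauses / total_clauses

# Hypothetical sample: 18 of 24 clauses contain no grammatical errors.
ratio = error_free_ratio(18, 24)  # 0.75
```

A higher ratio indicates greater grammatical accuracy, which lets individually and collaboratively written texts be compared on the same scale.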

The project attempted to answer the following:

  • Does collaborative writing affect the quality of students’ writing in ESOL composition classes?
  • What are students’ attitudes towards collaborative writing?

Jennifer Ball

Bringing sociology theory to life.

In the Winter 2018 semester, Survey Monkey was used to allow students in the Introduction to Sociology (GSOC 110) course to complete a voluntary, anonymous survey. The survey asked students questions similar to those used in a variety of real and notable sociological studies. For example, students reported whether they had ever felt stereotype threat and explored whether it had affected their performance or their likelihood of acting in a deviant manner.

This project continued in the Fall 2018 semester. The intent was to present back to students how their group’s anonymous responses aligned with the responses of the groups previously surveyed by “real life sociologists.” This led to interesting and engaging conversations about why this could be, what influences beliefs and behaviours, how research methods differ, how the validity of findings can be challenged, and how beliefs and behaviours change over time and between cultures.

Jaclyn Strimas

In-class mindfulness training to mitigate stress and improve academic self-regulation in post-secondary students.

In the American College Health Association’s 2016 report on the health of Ontario post-secondary students, close to 43% of respondents identified stress as having a negative impact on their academic performance—in fact, stress was identified as the most impactful factor. At Humber College we have counselling freely available to students and a strategic plan that includes the pillar “A Healthy and Inclusive Community.” But Humber’s most recent Student Success Survey (2018) shows that only a little more than half of Humber students “know where to go for counselling or advising” if they need it. Anecdotally, too, we perceive a gap between students’ need for mental health support and its availability: in conversations, students report barriers to accessing care or are unclear about the resources available to them. In a 2013 study, Bergen-Cico et al. noted that there is a growing gap between students with self-reported anxiety and those seeking help for anxiety. They suggest that a brief, curriculum-based intervention may be a way to bridge that gap.

This project introduced first-year Media Foundation students — in the course Success Foundations — to basic mindfulness meditation and other cognitive behavioural skills. This intervention aligns with the learning outcomes “explain various learning strategies and psychological theories related to learning” and “use research-based learning strategies for the development of effective learning skills in the post-secondary context.” The intent of the project was to have students integrate these skills into their overall learning strategies and enhance their resilience and coping skills in the academic domain.

Dragos Paraschiv

Kerry Johnston

Rory McDowall

The impact of lighting quality on student learning.

Developing a sense of community is central for our students within the SSCS. However, there are challenges to building this sense of academic community among our incoming degree students (e.g., diverse backgrounds, little time spent on campus socializing, no undergrad bar, long commutes). A Canadian review study found that a lack of social support and of social networks with other students can contribute to academic failure and dropping out of post-secondary education (Silva & Ravindran, 2016). One practice that has been shown to foster a sense of community is sharing one’s history through family recipes (Hancock, 2001; Schermuly & Forbes-Mewett, 2016). Having anecdotally seen disconnected students struggle academically, and pairing this with the scholarly literature, we wanted to explore the impact that participating in a group cookbook project could have on first-year students’ feelings of connectedness and community at Humber.

This study hypothesized that first-year students taking part in the Ubuntu cookbook project would have higher levels of connectedness with their Humber peers (as measured by survey responses) than those not taking part in the project.

Doug Thomson

Alyssa Ferns

Marilyn Cresswell

Ubuntu: a cookbook to build civility and community.

Heather Snell

Brenda Webb

Katherine Sloss

Learning outcomes of CYC student work placements.

Field work, placement, practicum, and internship are some of the names used to describe a course in which a Child and Youth Care (CYC) student has the opportunity to develop and practice skills outside the classroom setting. The Government of Ontario has published program standards to achieve the following goals: bringing uniformity to college programming, ensuring that graduates have the skills and knowledge needed to practice in the field irrespective of the institution at which they complete the program, and providing public accountability for the quality and relevance of college programs (Ontario Ministry of Training, Colleges and Universities, 2014). Program standards contain three specific elements, all of which a student must consistently demonstrate in order to graduate from the program: 1) vocational standard, 2) essential employability skills, and 3) general education requirement (Ontario Ministry of Training, Colleges and Universities, 2014).

This project explored the adaptation of Vocational Learning Outcomes in field placement courses in Ontario CYC diploma programs. Vocational Learning Outcomes (VLOs) were investigated because they are learning outcomes specific to CYC, unlike the other two elements, which are applicable to any program of instruction. Given that this is a preliminary study offering only an overview of CYC field placement, further research is needed to develop a more comprehensive grasp of this key component of CYC programs.

Mindfulness for Stress Reduction and Clinical Readiness

Students in ADMH currently experience stress and concerns about competence as they enter a new field. Indeed, a recent study of 25,000 Canadian post-secondary students revealed that almost 90 percent said they had felt overwhelmed by all they had to do in the past year. Humber has identified “Healthy and Inclusive Community” as one of its three strategic pillars, and this research can support the journey toward that goal. Students entering the workforce from the ADMH program are trying out clinical skills for the first time and have anecdotally stated that the regular mindfulness practice within the ADMH 5009 course increases their feelings of clinical readiness as a result of feeling more grounded in the present. Therefore, it is important to explore whether regular practice of mindfulness through the Koru program lowers stress levels and increases feelings of clinical readiness.

  • Do students who participate in the Koru mindfulness curriculum experience lower stress levels and/or greater feelings of clinical readiness than students who participate in the regular ADMH 5009 mindfulness curriculum?

Darren Hupp

Impact of professor coaching on student team performance in simulations.

There is mixed opinion on the value of simulation games in academic learning. Research indicates that simulation games “are superior to other teaching methods for helping students develop skills such as complex problem-solving, strategic decision making and behavioral skills, including teamwork and organizing” (Pasin & Giroux, 2010; Salas, Wildman, & Piccolo, 2009; Tompson & Dass, 2000). There is evidence that traditional teaching methods can develop simple decision-making skills well, while simulations are better for developing complex decision-making skills (Pasin & Giroux, 2010).

The Business Management Program at Humber Institute of Technology & Advanced Learning has successfully integrated a simulation game into its Introduction to Business course for first-semester students. Implementation of the weekly simulation decisions varies among classes based on professor choice: some classes make weekly simulation decisions in class, with or without coaching from professors, while others make simulation decisions outside of class (without professor coaching).

Given the increased focus on experiential learning at Humber, determining the best approach to integrating simulation games for first-year students can support this strategic focus. This project utilized the business simulation software Biz Café (https://www.interpretive.com/business-simulations/bizcafe/).

Do first-year, first-semester students obtain enhanced learning from making simulation game decisions in class with professor coaching and guidance (Pando-Garcia, Periañez-Cañadillas, & Charterina, 2015)? Are there any unintended impacts on student attitudes and success in the course from this approach?

Daniel Bear

Julie Muravsky

Charlotte Serpa

Assessing educator’s knowledge of harm reduction.

The paramedics program at Humber College trains students to be effective in the field. Students in their first year are trained to deal with Multiple Casualty Incidents (MCIs), which are limited-scale emergency situations requiring a coordinated response from one or more ambulance services. Students go through three hours of classroom lecturing and are given a paper-based scenario with ten patients to triage. Students identify and prioritize multiple patients based on their symptoms and follow steps of assessment, which include assessing breathing, circulation, and level of consciousness.

Virtual reality is an emerging technology in education, used to create simulation-based training that mimics real-life scenarios. In partnership with Humber’s paramedics program, the CTL’s VR & Simulation team developed an MCI simulation. In this scenario, a bomb is detonated at a subway station and there are multiple casualties. Students were expected to properly assess virtual patients within a given amount of time. Students used an HTC Vive headset to view the simulation and game controllers to move themselves and interact with the environment. In addition, physiological measures (i.e., heart rate, body temperature, and galvanic skin response [GSR]) were recorded using the Empatica E4 wristband.

Craig MacCalman

Jennifer Naken

The impact of virtual reality training on paramedic student readiness for practice.

Audrey Wubbenhorst

PR business cases part 2.

This is a continuation of the PR Business Cases project by Audrey Wubbenhorst.

Currently, the Public Relations textbooks on the market are American. There is a lack of PR case studies that look at current business problems in the Canadian context. In the summer of 2017, the following cases were completed:

  • A case study of Canadian Tire and how they introduced Facebook @work as an internal communications tool
  • A case study of Cisco Canada’s corporate social responsibility work with indigenous communities in Northern Ontario

This project looked at several new cases:

  • A case study of TD Bank’s use of digital content as a recruitment strategy
  • A case study on the launch of EQ Bank in Canada
  • A case study on a technology startup

More current, relevant cases based on Canadian companies will augment the learning outcomes of the course, in particular applying creative approaches to complex communications and problem solving.

Brenda Ridley

Simulations as Training Tools for Nursing Students

The American Heart Association (AHA) Adult Chain of Survival underwent a major change in 2010 for Basic Life Support (BLS) and Advanced Cardiac Life Support (ACLS). The AHA recommends immediate recognition of cardiac arrest with activation of the Emergency Response Team, early CPR, and rapid defibrillation, followed by effective ACLS (Hazinski et al., 2010). The AHA goal for early defibrillation in hospitals and ambulatory clinics is for the shock to be delivered within three minutes of the victim’s collapse.

In the Humber Post graduate specialty certificate programs, nurses from both community and academic hospital practice settings can develop added competencies in a safe, simulated environment before they enter their new practice setting.

To develop participants’ ability to effectively manage deteriorating patients requiring defibrillation and advanced airway management, the project used a high-fidelity simulator in a clinical lab setting. Eligible participants were recruited from our specialty programs, allowing them to incorporate the theoretical knowledge gained in each program. The three specialty programs are designed to assist participants in developing the knowledge and skills requisite for the registered nurse to deliver safe, quality patient care in high-acuity, critical care, and emergency department settings.

Bernadette Summers

Usability of course webpages.

The School of Health Sciences Continuing Education department utilizes the learning management system (LMS) Blackboard to support all online and face-to-face course sections offered each term. Over many years, these courses have undergone curriculum upgrades. As a result, there are several different course designs in Continuing Education, based on when the curriculum was originally written or last overhauled.

Through the Usability Lab at Humber College, the researchers would like to determine which course design enhances student learning. According to Miller-Cochran and Rodrigo (2006), “a key reason for conducting usability testing is to revise the course to better facilitate learning” (p. 96). Similar to Unal & Unal (2011), the Humber College researchers are striving to build effective and efficient course designs that address the needs of the student learner.

Which course design results in the most positive student experience?

The researchers hope to learn two important lessons in quality assurance. The first, through employment of the think-aloud technique in the Usability Lab, is determining which course design will result in the fastest navigation and least amount of frustration for participants. The second lesson for researchers is assessing which course design participants prefer and find the most visually compelling.

Austin Tian

Whiteboard animation software in the classroom.

For engineering students, some course modules, especially those requiring mathematical skills, are often found to be quite difficult. As a result, students can gradually lose their interest in studying. According to research conducted by Awang et al. (2013), an attractive and enjoyable study environment increases student engagement and levels of success. This research studies the effects of whiteboard animation videos on explaining complicated engineering concepts. Compared to traditional PowerPoint presentations, whiteboard animation software can integrate course content (images, text, narration, and animation) into a “storytelling” style video presentation, which is believed to increase engagement in the classroom.

PR Business Cases

Currently, the Public Relations textbooks on the market are American. There is a lack of PR case studies that look at current business problems in the Canadian context. Further, the cases will be based on first-hand research with business professionals in the field. Pending availability of participants, sample topics include:

This project will create business cases based on insights gathered from interviewing individuals from three Canadian companies.

Alexander Gurevich

Teaching with numerical card exercises.

Many students express and demonstrate difficulty with understanding and practicing mathematics. It is normal to initially have difficulty with a subject, although it is expected that, over the course of studying the subject, students develop comfort and improve their ability to work with the material. This expectation rests on another: that students are engaged in class, complete assigned work and exercises, and seek help when they are struggling.

An ideal solution would achieve the desired results of the lesson and of a work period or practicum session simultaneously, presenting students with fundamental concepts about the subject matter while engaging them in an interesting activity that allows them to practice the concepts discussed. Gamification presents such a solution.

A numerical card game called quant was developed by the author to improve mathematical fluency. The game uses a set of numerical cards, with mechanics based on how cards are drawn, what drawn cards mean, and which calculations must be performed in which situations.

  • Can exercises with cards from the game quant improve students’ learning, engagement, and enjoyment in introductory statistics courses?

Alena Papayanis

The impact of process-based assignments.

Feedback is the “most powerful single influence” on student achievement (Hattie in Gibbs, 2005, p. 9) and is central to a student’s learning process (Hattie & Timperley, 2007; Hounsell, 2003, in Carless et al., 2010), yet evidence shows that current standard feedback practices do not support this goal (Carless et al., 2010). Furthermore, students’ perceptions of the demands of the assessment system tend to govern “all aspects of their study,” even more than teaching itself (Gibbs & Simpson, p. 4). Thus, feedback and assessment are key factors in student learning.

This project aimed to understand whether a process-based assessment structure better supports student learning. This structure consists of small assignments throughout the semester that build towards the creation of a final product. It also involves feedback that is frequent, timely, and actionable, conditions that Gibbs et al. (2005) found to be central to assessments that best supported student learning. It was expected that the breakdown of a larger assignment into sub-tasks, in addition to the interdependence of these smaller assignments, would reduce student procrastination. Ultimately, this project examined whether a process-based assessment structure allowed students to learn how to incorporate feedback and develop metacognition.

Tonia Richard

Jaspreet Bal

Christine McKenzie

Drivers of persistence in a face-to-face caregiver training workshop.

To date, most educational interventions created for family caregivers have focused on patients with a single diagnosis (Hughes, 2008). In contrast, the research acknowledges that family caregivers of chronically ill patients take on greater responsibilities in terms of care provided and ultimately walk an “unpredictable path” (Mardanian Dehkordi, Babashahi, & Irajpour, 2016). The current evidence suggests that informal/unpaid caregivers managing care for chronically ill patients on a long-term basis endure ongoing stress that prompts them to institutionalize patients before it is actually needed (Hughes, 2008). An overview of the literature indicates that family caregivers can more effectively manage their burden and build task mastery through multicomponent educational interventions offered by an interprofessional health care team (Ostwald, Hepburn, Caron, Burns, & Mantell, 1999; Reinhard, Given, Huhtala Petlick, & Bemis, 2009). Suggested interventional topics include teaching skills related to symptom management, pain management, medication management (and handling related side effects), general problem solving, safely promoting patient independence, and achieving illness acceptance, all within a supportive group structure (Ostwald et al., 1999). Considered crucial to this structure is the offer of patient respite and/or concurrent training for patients, as the literature indicates this further reduces caregiver burden (Ostwald et al., 1999).

  • What is the effect of a multicomponent curriculum (with the offer of parallel respite/training for patients) on informal/unpaid caregivers of chronically ill patients?
  • Will this educational design reduce caregiver burden or improve wellbeing, compared to caregivers on a waiting list to participate in the training workshop?

Christine Zupo

Impact of positive interventions in nature.

Current literature explores the health benefits of implementing positive interventions and increasing the experience of positive emotions. Studies have found that happy people report frequent positive emotions and infrequent negative emotions, and enjoy higher levels of success, health, and social connection (Nelson & Lyubomirsky, 2012; Nickerson, Diener, & Schwarz, 2010; Lyubomirsky, Sheldon, & Schkade, 2005). Current literature likewise explores the health benefits of nature. However, it does not explore the effects on subjective happiness when positive interventions are implemented in nature.

The intent of this study was to explore a possible increased sense of wellbeing when positive interventions are implemented in nature. The study focused on experimental interventions in which participants were prompted to engage in positive activities that have been reliably shown to increase positive emotions (Lyubomirsky & Lepper, 1999). Participants were assigned to two groups, an experiential group and a comparison group. Each group completed the Subjective Happiness Scale as a pre- and post-test, along with weekly positive interventions that have been reliably shown to increase positive emotions and subjective happiness (Lyubomirsky & Lepper, 1999). The comparison group implemented positive interventions indoors, and the experiential group implemented positive interventions in nature.

Arun Dhanota

Documentary on Mental Illness

Within the last decade, various service industries have addressed the stigma surrounding mental illness in order to support individuals on their road to recovery. Leading mental health organizations, such as the Centre for Addiction and Mental Health and the Canadian Mental Health Association, have published a variety of resources to educate the community about mental illness. Some notable documentaries have also captured the experiences of individuals with mental illness and their families. Voices (2014) looks at three participants and their experience with schizophrenia. Children of Darkness (1983) brings to light the abuses youth with mental illness experience in institutions. Being Black, Going Crazy (2016) showcases Black men in the UK experiencing mental illness. Changing Minds, The Inside Story (2014) is a three-part documentary following mentally ill patients in two Australian hospitals. In addition, one can find online resources (such as YouTube videos) to learn more about mental illness.

What is still missing, however, are educational videos geared towards Canadian criminal justice organizations that provide guidelines from survivors and their caregivers about their experiences and, accordingly, their suggestions for interaction and support.

The research questions driving this project are:

  • What are the lived experiences of survivors and caregivers?
  • What suggestions/feedback do they have for the criminal justice/community service industry?

Anthony VanHoy

Effects of online demonstrations in a mathematics course.

The interactive demonstrations were created by various individuals (including the Principal Investigator) using the programming language Mathematica. Some demonstrations can be found at http://anthonyvanhoy.com/demos.html. A free player application is needed to run the demonstrations and can be found at http://www.wolfram.com/cdf-player/. Demonstrations are interactive in that they allow individuals (using keyboard, mouse, or touchscreen) to effect change on various computer-modeled phenomena by clicking buttons, scrolling, zooming, entering specific values, and so on. The ability to customize demonstrations is limited only by the author’s skill with Mathematica. The ability to interact directly with mathematical phenomena through these demonstrations should have a positive impact on learning the concepts each demonstration illustrates.

Demonstrations may also create an interest in mathematics (and related fields) where some find confusion and boredom.

Demonstrations will be incorporated into the course in two ways. First, the instructor will use demonstrations in class as lecture material. Second, and most importantly, students will be able to go online at any time to interact directly with demonstrations via a link on Blackboard.

  • Do students perceive demonstrations to be generally useful and helpful?
  • Is student learning significantly affected by demonstrations?

Hillary Rexe

Accessible and inclusive video captioning.

With The Accessibility for Ontarians with Disabilities Act, 2005 (AODA), the provincial government has mandated universal accessibility standards in a wide range of fields including education. Within the Media Foundation Program here at Humber, we are deeply committed to learning how we can make our course content fully inclusive and accessible for all of our students. We firmly believe that Humber has an opportunity to lead the way in universal design, and that looking to the study of media is a great first step.

In our Media Foundation courses, we often look at events and trends as they happen, and we often do so using video content from all over the web. Since this content is new, and tends to be on platforms with unreliable or non-existent captioning (such as YouTube and Vimeo), we are currently unable to make it accessible for our students who have a hearing impairment or who are Deaf. What’s more, we do not yet know how to make this content accessible for those who are blind or have other accessibility requirements.

  • How does learning how to caption video affect student perception of the importance of universal design in video content?

Sarah Feldbloom

Leanne Milech

Teaching critical thinking, reading, and writing using multimodal texts and personalized approaches.

The multimodal approach has already been demonstrated to deepen the learning process and to respond to authentic communication needs in students’ home, academic, and work lives. In “The Multimodal Writing Process: Changing Practices in Contemporary Classrooms,” Christine Joy Edwards-Groves (2010) proposes that, due to the rapidly changing nature of technology and our increasingly globalized world, students must “design, produce, and present multimodal texts as representations of learning.”

Despite this now widely accepted knowledge about the relevance of teaching multimodal literacy, we have noticed that College Reading and Writing Skills (WRIT 100), Humber’s foundational reading, writing, and critical thinking course for students across multiple programs, does not formally address the importance of multimodal literacy in its course outline. Aware of the current academic discourse on the importance of multimodal literacy (as evidenced above), we have developed an alternative method for teaching WRIT 100 that uses multimodal texts and strategies.

This research focused on the impact of helping Humber College students develop multimodal literacy; the hypothesis was that using multimodal texts in the college composition classroom and asking students to show literacy development by producing multimodal texts promotes engagement and increases student success at demonstrating learning outcomes in Humber’s WRIT 100 classroom.

Lara McInnis

Sean Gilpin

John Stilla

Reformulation and noticing.

This project investigates the effectiveness of a popular English as a Second Language (L2) corrective written feedback approach – reformulation – with a different learner group: native English-speaking (L1) college students in a remedial reading-writing course. We have found that some remedial L1 writing students are unable to notice lexical, grammatical, and discourse errors in their own writing, often because they have difficulty distancing themselves from their writing and observing it systematically. This is supported within rhetoric and composition research, especially within the scope of basic writing error analysis for remedial writers (see Shaughnessy, 1976, p. 236; Otte & Williams Mlynarczyk, 2010, p. 123). We are, in essence, trying to determine effective pedagogical strategies that will enhance remedial writers’ ability to notice and correct certain errors in their writing.

  • What are the effects of reformulation and concurrent verbal protocol on the noticing of errors of remedial L1 writing students?

Soheila Pashang

Pedagogy and practice: teaching diversity in classrooms.

Canada is among the most diverse nations, where individuals, regardless of their race, gender, class, age, sexual orientation, ability, country of origin, nationality, or other socially constructed identities, are to be protected under law, policy, and practice. This commitment to diversity is considered one of the foundations of our societal strength and of socio-economic and political inclusion. As a result, many academic institutions, including Humber College, currently offer diversity courses within their programs in order to contribute to socio-equitable practices and to eliminate discrimination and barriers that might hinder marginalized individuals and communities from reaching their potential.

Despite this effort, recent reports (Gaudet, 2018; Reasons et al., 2016; Hankivsky, 2014; Saddichha et al., 2014; Langille, 2014; Anthony, 2013; Owusu-Bempah, 2011) suggest alarming results and an increase in incidents of discrimination against various individuals and communities within the criminal justice system. Other studies (Hayle, Wortley, & Tanner, 2016; Scott, 2012; Henry, Reece, & Tator, 2010) highlight discrimination within the justice system from the point of entry to exit. These findings reveal the need to re-examine the diversity course offered in the Justice Services Program in the School of Social and Community Services (SSCS) at Humber College.

This project interviewed faculty who have taught diversity related subjects in SSCS in order to identify gaps and strengths of the existing pedagogical approaches used in teaching diversity courses within the Justice Services Program.

Priscilla Bengo

Using a highly rated teaching strategy by students to improve their achievement.

Little has been written about the best learning experiences of students in a community college (Hatch et al., 2018), and the existing research focuses on courses other than research methods courses. Most of it concerns university students taking online courses (e.g., Holweiss et al., 2014; Zhan & Mei, 2013). This study addresses that gap. The current research on best learning experiences does identify some components of great learning experiences at the post-secondary level, and it was used to design this study.

The research methods course RSMT 1501: Quantitative Methods-Interpretation can be challenging for students. The purpose of the study was to help students understand course concepts. Students described their best learning experiences in the research methods course and rated how they felt about the experiences described in the focus group. The insights gathered can be used to improve RSMT 1501 for future cohorts.

Peter Wheeler

Peter Wheeler

The impact of lightboard videos on student learning.

Educators are integrating increasingly innovative technology into their teaching practices, including in their online educational content.  One such innovation — the Lightboard — is an illuminated clear glass board used for recording video lectures.  It allows the educator to face the audience while writing out and diagramming concepts, which creates a more personable and interactive learning experience.

This project implemented six Lightboard videos in a 2nd-semester Electronic Devices course at Humber College. Engineering lectures usually involve drawings and equations worked out on chalkboards or whiteboards. In capturing lectures on video, however, these traditional props become liabilities: the presenter must turn away from the audience to write or draw on the board, and the presenter’s body often obscures the material (Birdwell & Peshkin, 2015). Birdwell and Peshkin (2015) developed the lightboard to “create visually compelling videotape lectures, avoid the liabilities of chalkboards, and furthermore to be able to produce upload-ready video segments with no post-production.” These are the same challenges that led to the use of the Lightboard in this course. The project measured direct performance (GPA) along with student self-reports on engagement with the videos.

Naghmeh Saffarian

Naghmeh Saffarian-Pour

Aaron Landry

Aaron Landry

Using ESL techniques with HESL students.

There is global interest in content-based instruction (Stoller, 2004). Many instructors are intrigued by the idea of integrating language skills and content, since each can inform the other. Nevertheless, this approach poses more challenges than, for example, a language-only course (Stoller, 2014; Baecher, 2014). There is also the worry that language objectives will suffer when taught alongside content (Bigelow, 2006). This research shifts the focus to the content: while most research evaluates content according to whether or not it helps language learning, this project used language learning in the service of content.

The rationale for this inversion is threefold. First, it builds on the limited use of these techniques in our current pedagogy; the hope was that formalizing these experiences into a research project would help clarify and confirm whether or not anecdotal experience maps onto reality. Second, and more concretely, the triadic model (Bigelow, 2006) used here is believed to have multiple access points, which means that one can leverage language structures in order to build content comprehension. The final reason for using language structures in the service of content is the learning objectives of the course. Currently at Humber, HESL 024 is a compulsory content-based course, which means that students are required to take it. This reality, on its own, restricts the scope of the research project.

This project attempted to answer the following:

  • Does the use of the Triadic Model ESL technique, in the context of a content-based Science and Humanities course (HESL 024-Introduction to Arts and Science), positively impact student comprehension and engagement?

Matt Ramer

Matthew Ramer

Efficacy of the one2one assessment in improving knowledge acquisition and retention.

Content-heavy, foundational Health Science courses are often a challenge for students. There is evidence that students perform better on oral examinations than on traditional written examinations. Two types of oral exams are often used in professional schools, including medical education: traditional oral exams (TOEs) and structured oral exams (SOEs). In a TOE, students do not know the questions they will be asked before the exam, and the examiner has wide latitude in which questions to ask. In an SOE, by contrast, the questions are standardized for all students, who may or may not know them ahead of time.

Students are also reported to have a positive attitude toward oral assessments, though a drawback of many of these studies is the anecdotal nature of much of this student feedback. Given the resource-intensive nature of oral assessments, it is important to know objectively whether oral assessments facilitate student learning and knowledge retention; this will allow for informed decisions regarding resource allocation for this type of assessment. This project evaluates the efficacy of a one-on-one oral assessment (called the One2One) in student learning of new material and the retention of this new knowledge in both the medium- and long-term. How students experience their learning is also a valuable metric for measuring the efficacy of an assessment method and so the perceived professional value and usefulness of the One2One was evaluated through the solicitation of student perspectives.

Mark Whale

Nathan Radke

The effectiveness of podcasts in student learning.

In the last 10 years, podcasting (pre-recorded audio files that are downloaded by individuals and listened to largely on portable devices) has become a widespread phenomenon in fields as disparate as entertainment and neuroscience. However, because podcasts are still a fairly recent medium, little is known so far about their effectiveness in communicating information in the diploma-level college environment. Research has shown there is great potential in this new area (Robinson & Ritzko, 2009; McKinney et al., 2009), but questions remain about what format is most useful in this setting (Middleton, 2016).

As the application of podcasting technology to college instruction is still fairly new, this project aimed to answer the following:

  • How can podcasts best be designed, performed and distributed to students in order to maximize the academic utility generated?
  • In particular, how can podcasts be created that increase engagement with and grasp of the material of introductory liberal arts and science courses?

Louise Zimanyi

Louise Zimanyi

Parent perspectives on children’s engagement with nature.

This study explores parent perspectives on how time spent in nature and natural settings, including their own experiences with the natural world, might influence their child’s play, learning, holistic development, and connections in and to the natural world. Through a survey and focus group, parents identified benefits of participation in a forest nature program, including increased time outdoors, play confidence, risk-taking opportunities, improved health and wellness, and the developing seeds of environmental stewardship and reciprocity.

Karen White

Karen White

Interdisciplinary design experience in course alumni.

Interdisciplinary practice and collaboration in design is recognized by the North American Interior Design discipline’s governing accreditation body, the Council for Interior Design Accreditation, as an important aspect of interior design education (CIDA, 2016). As such, there is evidence in the literature of how interior design education programs design and deliver courses in interdisciplinary design practice (Greenberg, 2015; Karakaya & Şenyapili, 2008; Russ & Dickinson, 1999). Successful course design has been shown to be project-based, innovation-focused, and forward-thinking, with a procedural emphasis on collaboration and teamwork. Indeed, at Humber College ITAL, the Bachelor of Interior Design program has since 2008 required that students complete a course titled Interdisciplinary Practices in the winter semester of the junior (third) year.

Unlike studies focusing on other aspects of interior design (Templeton, 2011), or studies of interdisciplinary collaboration that do not focus on interior design (Cobb, Agogino, & Beckman, 2007; Kalyanaraman, Fernandez-Solis, Rybkowski, & Neeraj, 2016), the literature on interdisciplinary practice in interior design shows no evidence of longitudinal studies investigating the impact of this curriculum on the professional practice of interior designers after graduation.

Seeking to remedy this gap, this research project proposes an exploratory study to identify long-term learning lessons from the Interdisciplinary Practices course in the Bachelor of Interior Design degree at Humber ITAL. The objective is to help inform improvements to the course for future offerings.

Janice Fung

Janice Fung

Nasby

Sarah Nasby

Kinesthetic and tactile typography.

Printing is an integral part of design and advertising: it is both technical and creative, and it can be the difference between the success or failure of a project. Knowledge of different printing technologies helps students push their creativity by working with different mediums. Mixing old and new printing technologies, both tactile and digital media, opens up the potential for new ideas. Students often underestimate the importance of printing, the depth of knowledge it requires, and its creative potential. It is difficult to create engagement around this topic in a digital classroom, as the inherently tactile quality of printing doesn’t lend itself well to explanation alone. We believe this topic would connect better with students and create improved understanding when taught in a more hands-on manner, through the direct use of printing technology that students can operate themselves. By furthering the tactile experience of using print technology in the second-year BoCA course Digital Production, we hope that students will be more engaged and better understand how printing works.

Ionela Bacain

Ionela Bacain


Using mini-cases in teaching managerial accounting.

This research project explored students’ learning experience in the Introduction to Managerial Accounting course, in particular their application of newly introduced concepts. The student population enrolled in the course comes from different degree programs offered by the Faculty of Business, so the level of understanding of accounting concepts varies significantly among students. Some of our students are non-accounting majors and struggle with managerial accounting. Given the complexity of the subject matter, non-accounting students often fail to appreciate the relevance of many aspects of the course content. As a result, we see students struggling to apply concepts to real case scenarios after they are introduced and discussed in class using traditional classroom techniques.

The case study method has been widely used as a teaching tool in various business disciplines (Ngo-Ye & Choi, 2014). The case study approach enables students to apply concepts learned to practical business scenarios.

Our proposed study was designed to determine a) if students perceive mini-cases improve their learning experience; b) if students perceive that mini-cases motivate them; c) if students perceive they achieved higher outcomes with the mini-case approach.

Intentional activities and student well-being.

This research proposal is concerned with increasing students’ levels of subjective well-being and happiness. According to Lyubomirsky and Della Porta (in press), well-being is defined as high life satisfaction, frequent positive affect, and infrequent negative affect. The World Health Organization predicted that by 2020 depression would be the most common health problem in the world (Brundland, 2001). The stigma attached to mental health sometimes holds students back from seeking support; even when support is sought, the demand outweighs the services available (Burns, Lee & Brown, 2011).

The focus of this study is to deepen an understanding of how post-secondary institutions can build in preventative measures for student wellness. Ultimately, such initiatives could improve retention, increase graduation rates, and advance student success. Each week, for four weeks, students engaged in three self-selected intentional activities. The experimental group chose from a list of activities that have been reliably shown to increase positive emotions, and the comparison group chose from a list of activities that have not. The intent of this project was to discover whether regular engagement in positive activities increases the subjective sense of happiness among students.

Ian McIsaac

Ian McIsaac

Integrating interactive software in the classroom for bmgt152 principles of management.

In their article, Abodor and Daneshfar (2006) state that most leading business schools, such as Harvard Business School, have introduced simulation workshops or incorporated simulations into the content of different courses offered to students. Abodor and Daneshfar (2006) cite Jennings (2002), Thomson et al. (1997), and Lane (1995), who found that the benefits of simulations include enabling users to make strategic decisions by applying principles they learn in theory, teaching general management skills by requiring users to make a series of strategic decisions, and allowing users to make errors without any loss of investment.

The Principles of Management course (BMGT152) at Humber College in Toronto examines the roles of the manager and the skills needed to plan, organize, lead and control resources to achieve organizational objectives.

The business simulation software Praxar Golf – Introduction to Management, introduced in this study, offers students an opportunity to make decisions affecting an entire business comprising a golf course, pro shop, and restaurant. Students are then able to observe the results of their decisions on such measures as EPS, sales, and asset values. Ethical decisions are included as an integral part of decision-making.

Christine McCaw

Christine McCaw

Student engagement and satisfaction levels.

Student engagement levels are linked to satisfaction with learning events, and there is a large body of research on student engagement and how the learning environment can improve learner engagement and participation. This project hypothesizes that increased student participation and engagement (as measured by student contributions to class discussions, questions asked in class, and participation in group activities) is positively correlated with academic success. The classroom environment can also impact student engagement (e.g., having more access to whiteboards for students to write on and respond to in-class activities, and access to and use of technology such as personal laptops, tablets, or cell phones).

This project compares traditional classrooms to redesigned classrooms (HIVES rooms at Humber) to answer the following:

  • Does classroom design impact student satisfaction, engagement, participation, and the quality of interactions between students and the instructor?
  • Specifically, does the classroom design and access to technology improve the learner experience and improve overall academic results?

Cheryl Mitchell

Cheryl Mitchell

Enhancing entrepreneurial learning.

The two most common ways post-secondary education promotes entrepreneurial activity are the creation of incubators and entrepreneurship-specific programming (Politis, Winborg, & Dahlstrand, 2010). However, research findings are inconclusive on whether this close proximity to the school, as well as entrepreneurial education, has produced individuals better equipped to deal with the day-to-day activities of an entrepreneur (Politis, Winborg, & Dahlstrand, 2010).

In order to paint a portrait of current entrepreneur teaching strategies at Ontario Colleges, this project will focus on entrepreneurship courses and specifically the entrepreneurial education taught in the Entrepreneurial Enterprise Post Graduate Program and will:

  • Provide a landscape of entrepreneurship including definitions of entrepreneurship and entrepreneurship education and common characteristics of entrepreneurs;
  • Identify common ways post-secondary institutions, including Humber, teach entrepreneurship in the classroom (at Humber, preliminary research indicates that it is mostly taught using business plan development and pitch presentations);
  • Present a profile of effective education in developing entrepreneurial skills;
  • Provide recommendations for changes to the educational curriculum to better support potential entrepreneurs.

Blake Lambert

Blake Lambert

Does Twitter increase student engagement?

Students at Humber College, in both degree and diploma programs, are required to take a prescribed number of electives in order to graduate. A frequent student bias is that they don’t see elective courses as part of their program, and this belief can manifest itself as disengagement. But if Twitter, a popular social media platform, is incorporated into course work, can it help students respond to and invest themselves in the learning materials of their courses?

  • Does Twitter, when used in course work, increase student engagement in the learning materials in Liberal Studies electives?

Conflict scenarios in criminal justice careers.

The BAA-CJ program currently does not use the conflict simulator as a teaching tool within its curriculum. This is partially because many of the scenarios in the library are not conducive to the skills learned in the program and/or the types of conflicts happening within a range of CJ professions. This project aimed to get a better sense of the prevalent conflict scenarios happening in current CJ professions and from there develop ecologically valid scenarios to be incorporated into CJ courses.

The current project developed useful conflict scenarios for the simulator library based on student and CJ professional experience. A follow-up study will incorporate these scenarios into our conflict management course to: 1) determine whether this technology enhances student engagement in course material; and 2) determine whether it can be used as an evaluative tool to capture student learning of specific conflict skills.

Alfred Seaman

Alfred Seaman

Professional identity in accounting students.

Professional identity is an important construct, as it has previously been associated with retention, attrition, persistence, and knowledge construction. This study hypothesized that levels of professional identity increase across subsequent years in the program. The overarching goal is to identify if and when substantial changes to identity occur, that is, the point when students stop viewing themselves as students and start viewing themselves as professionals. In particular, the study investigated when professional identity emerges within an accounting program, and whether this differs between the diploma and degree programs at Humber.

Adam Thomas

Adam Thomas


George Paravantes

Learning through play.

There is an active discussion in education regarding teaching programming fundamentals to students at all levels, including middle school, secondary school, and postsecondary school.

Students enrolled in their first coding course often find grasping the fundamentals difficult and may not appreciate the importance of a solid foundational understanding. As a result, they often “hack” their way through introductory courses and then struggle in later courses.

This project tries to address this by introducing an “element of play” to an introductory coding course. The first half of the course continued teaching Java but used it to program Lego Mindstorm robotics (mindstorms.lego.com).

Does adding a physical element of play to a programming fundamentals course help increase student engagement and result in better learning?

What do you do if you see that NEW Blackboard Ally red “oil gauge” next to your document in your course site? In this session we demonstrate how to fix accessibility issues identified by Blackboard Ally. This session will focus specifically on fixing issues with missing ALT text and color contrast in digital course content.

When designing and building your course online, consider the perspectives of all of your learners. In this session, we will look at online courses from the perspective of students with disabilities. We will also share five strategies to make your online course accessible to all students, such as incorporating Universal Design for Learning principles, adapting documents, and modifying assessments.

Creating and providing accessible documents is about inclusion, equity, and social responsibility. This session reviews the WCAG accessibility guidelines and AODA requirements applicable to creating digital content and documents and the fundamentals of creating accessible materials. As well, you will walk away with strategies and a toolbox full of apps and resources to help with making your content accessible.

Microsoft PowerPoint has many tools that help to ensure that the documents you share with your students are fully accessible. In this session, we modify inaccessible content to comply with accessibility standards. As well, we look at the following in PowerPoint with respect to accessibility: Accessibility Checker, Slide Layouts and Master Slides, and Reading Order. You will walk away with strategies to help you make your PowerPoint documents accessible.

Microsoft Word has many tools that help to ensure that the documents you share with your students are fully accessible. In this session, we modify inaccessible content to comply with accessibility standards. As well, we look at the following in Word with respect to accessibility: Accessibility Checker, Document Structure, and Tables. You will walk away with strategies to help you make your Word documents accessible.

What do you do if you see that NEW Blackboard Ally red “oil gauge” next to your document in your course site? In this session we demonstrate how to fix accessibility issues identified by Blackboard Ally. This session will focus specifically on fixing heading structures in digital course content.

Blackboard Ally is a new feature that was installed to your Blackboard site on June 1, 2020. It is an accessibility application that helps identify whether your digital course content is accessible and provides tips for remediation. Ally can convert your LMS content into alternative formats, including PDFs, electronic braille and audio files. In this session, we tour this new tool and its features and you will learn ways to maximize it in your course.

Are you interested in exploring Infographics as an assignment for your students? Are you already exploring infographics but are unsure where to send your students for tools, resources, and support?

The Idea Lab offers resources including online tutorials, instructional videos, and Blackboard packages for Infographic assignments. In addition, the Idea Lab team are here to support your students with workshops, online support, and one-on-one guidance and troubleshooting with digital assignments.

This session will help you explore if there is an appropriate infographic assignment opportunity in your course.

Are you interested in exploring website creation as an assignment for your students? Are you already exploring websites but are unsure where to send your students for tools, resources, and support?

The Idea Lab offers resources including online tutorials, instructional videos, and Blackboard packages for website assignments. In addition, the Idea Lab team are here to support your students with workshops, online support, and one-on-one guidance and troubleshooting with digital assignments.

Ideas for a website assignment include an online portfolio, a blog, a field placement report, and many more. This session will help you explore if there is an appropriate website assignment opportunity in your course.

Are you interested in exploring video creation as an assignment for your students? Are you already exploring videos but are unsure where to send your students for tools, resources, and support?

The Idea Lab offers resources including online tutorials, instructional videos, and Blackboard packages for various types of video assignments. In addition, the Idea Lab team are here to support your students with workshops, online support, and one-on-one guidance and troubleshooting with digital assignments.

Ideas for a video assignment include a mock interview, a screencast of a software demonstration, a digital story, a sales pitch, a demonstration of newly acquired skill, and many more. This session will help you explore if there is an appropriate video assignment opportunity in your course.

How can we choose and use educational technology tools thoughtfully? What tools align with our learning outcomes and assessments? In this session, we explore the Edtech bank – a group of tools, organized by function, that we recommend. We briefly talk about each tool to provide you with direction for choosing the right ones for you and your course.

There are many innovative digital tools that you can use in your online class to increase student participation and engagement, and to solicit feedback from your class. These tools can add polling, engagement with videos, mindmaps and more to your lessons. Explore these sessions to learn about Flipgrid, EdPuzzle, Mentimeter, Padlet, H5P, Kahoot and Poll Everywhere, and to get ideas on how to incorporate them in meaningful ways.

Inclusive design can help to remove barriers for learners and foster an online learning environment that is engaging for all. In this session, we provide practical strategies based on universal design for learning (UDL) to support you as you build inclusive courses online.

Rubrics are multidimensional scoring guides that faculty can use to provide consistency and transparency when evaluating students’ work. In this session, we review the components of a rubric, become familiar with different kinds of rubrics, and begin creating a rubric. You will also learn how to use your rubric in Blackboard to increase marking ease, efficiency and consistency.

Universal Design for Learning is a framework that recognizes the ways in which humans learn and is designed to improve teaching and learning. In this session, participants will learn how universal design for learning (UDL) can help remove barriers for students and foster an online learning environment that is engaging and accessible for all. Practical strategies are shared to help you improve your own course.

As teachers, we know that the lively sense of community that forms in a face-to-face college classroom can be richly rewarding. In this session we explore how we can support, motivate and engage our students in an online environment. You will discover ways to forge meaningful connections between students and the material, students and the instructor, and students with their peers.

You want your live lectures on Blackboard Collaborate to be engaging and effective for your students, but how can you structure them to deliver maximum impact? This session delves into some strategies that, once incorporated, will keep students engaged during online lectures.

Encouraging collaboration can be a powerful way of improving learner engagement in an online setting. In this session we present and review how to facilitate collaborative learning including group work in an online class.

Topics will include: synchronous and asynchronous collaborative teaching and learning, approaches used to implement effective group work, and examples of group work activities.

Adobe Spark is an online design application that allows users to effortlessly create graphics, short videos and web pages. In this session, we look at creating learning modules using Adobe Spark. Using the examples provided in the video below, we explore Adobe Spark’s functionality and provide tips on how you can use it in your own context.

Blackboard offers many tools that you can use to engage with your learners. In this session we demonstrate how to create Discussion Boards in Blackboard, and discuss how to use Discussion Boards to assess learning, build community, engage learners, and deepen learning.

In many cases, assessments need to be rethought and redesigned to fit in an online setting. This session explores ideas for alternative forms of assessments that will engage students in authentic opportunities to demonstrate their learning. You will also learn about educational technology tools that can support the delivery and completion of these assessments.

In this session, we will discuss ways to design an effective online course that meets the needs of today’s learners. We will overview the backwards design approach and will walk through course design tools and sample modules developed by Humber’s Teaching + Learning team. Questions will be welcome in the chat and at the end of the presentation.

Online course design requires a different approach than course design in a physical classroom. In this session, we discuss ways to design an effective online course that meets the needs of today’s learners. We provide an overview of the backwards design approach and walk through course design tools and sample modules developed by Humber’s Teaching + Learning team.

To respond to the shift to online, the Teaching + Learning team created a 10 step process that guides the development of effective online courses. This session goes through each of these steps to help you bring your course to the online world and ensure it is accessible and engaging for your students.

Teaching in the online environment is different than teaching in a face-to-face classroom in many ways. This session explores some key differences, highlighting issues regarding synchronous and asynchronous methods, engaging students, building trust and community, and managing workload in the online world.

Panopto allows you to easily manage, record and integrate video into your digital classroom. In this session, we cover the steps involved in creating videos using Panopto, and provide you with strategies to produce effective and impactful content for your learners.

Panopto allows you to easily manage, record and integrate video into your digital classroom. In this session, we cover the steps involved in sharing a video with your learners using Panopto.

Do you want to make your recorded Blackboard Collaborate sessions available to your students? Taking your recorded sessions from Blackboard Collaborate and bringing them into Panopto is easy. This short session will guide you through the process as well as review provisioning Panopto for Blackboard.

Captioning is essential to make sure your videos are accessible to all learners. This session provides a step-by-step guide on how to add captions to your webinars, recorded classes or course videos. It shows you how to leverage Panopto’s power to generate automated captions for your videos in seconds and edit the captions effortlessly in minutes.

Panopto is more than just a platform to share your lectures and other videos. In it, you can create quizzes and use it as a platform on which students can create and share video assignments! This session explores the functionality of Panopto and provides you with the steps you need to maximize its functionality in your online class.

Panopto allows you to easily manage, record and integrate video into your digital classroom. This session provides you with an overview of the Panopto platform and how it can support you in remote-based delivery. Learn how Panopto can help you build and share videos with your students, and how it can be used to facilitate video assignments from your students. Panopto supports in-video quizzing, and has an easy-to-use Closed Captioning tool. Learn about discussions, and Panopto’s recording tools.

So you’ve finished your course trailer? This session provides guidance on how to finalize and submit your course trailer to be viewed by your learners. We’ll guide you through requesting a review in Lumen5, applying captions to your video within Panopto, and completing the webform to submit your finished video!

Creating a course trailer is a great way to get students interested in your course. Lumen5 is a video creation platform that allows anyone – no matter your skill level – to create engaging video content quickly and easily. This session offers a step-by-step guide on how to build your course trailer. You will learn about the writing process and an introduction to using the tool.

Creating a course trailer is a great way to get students interested in your course. Lumen5 is a video creation platform that allows anyone – no matter your skill level – to create engaging video content quickly and easily. This session is a step-by-step guide to Lumen5 that will help you create your perfect course trailer!

Blackboard Collaborate Ultra is a live video tool that allows you to add files, share applications and use a virtual whiteboard to interact with your learners.  This session covers the ins and outs of Blackboard Collaborate Ultra, including the management of roles, the management of audio and video, how to record sessions, how to interact with participants, sharing content and creating breakout groups.

Are you worried that teaching online will make it harder for you to notice students who feel invisible and disengaged? In this session we demonstrate how to use Blackboard’s Retention Centre tool to “see” learner and instructor engagement. You will learn how to identify students who may be facing challenges and reach out to them to set them up for success.

In this session we take a deep dive into the Blackboard Rubrics tool. We look at adding different types of rubrics to Blackboard, attaching rubrics to assignments, using the rubric to grade student submissions and provide feedback, and viewing the rubric as the student sees it. This workshop focuses on the tool and not the theory of rubrics. To get the most out of this workshop, you may wish to review this past recording of the “Creating and Using Rubrics” webinar, which focuses on designing effective rubrics.

Blackboard offers many tools that you can use to engage with your learners. In this session we review how to facilitate collaborative learning including group work using the Blackboard’s Discussion Board tool.

Blackboard has many tools dedicated to improving communication among learners and instructors. In this session, we take a closer look at how to set up and use three of these communication tools.

Discussion Boards  are an interactive space where learners can post and reply to messages (examples include class debates, weekly reflections, team discussions, role plays, etc.).

Journals  are a private and reflective space for learners to write (examples include reflecting on personal growth, recording lab results, documenting clinical experiences, communicating “muddiest points” that are private).

Wikis are a collaborative space where students can view, contribute to and edit content (examples include grant writing, creative writing, group research projects, student-created study guides and course glossaries).

Blackboard has many tools dedicated to creating well-organized and engaging learning modules. In this session, we will take a deeper dive into creating learning modules in Blackboard. We review the Content Folder Method and the Learning Module Method for building module material. Using the examples provided in the video below, we explore Blackboard’s functionality and provide tips on how you can maximize it in your own context.

Blackboard Groups is a tool on Blackboard that can greatly increase collaboration in your class. This session focuses on the Groups feature in Blackboard. You will learn about the various ways to create groups, creating group sign-up sheets, group tools available to students, creating group assignments, and grading group assignments.

Blackboard has many useful built-in tools that can be used to create effective tests and assessments. In this session, you will learn how to create tests in Blackboard. Topics will include creating test questions, reusing questions from past tests, creating randomized blocks, and using pools.

Looking for an easier way to load exams and quizzes into your Blackboard course site? Do you have a Word document with test questions that you want to use? Respondus is a powerful third-party tool for preparing, managing, and uploading exams. This session shows you how to download and install Respondus, format test questions and answers in Word, import Word documents into the software, and publish a test from Respondus to Blackboard.

In Blackboard, instructors can create pools of questions or import questions from previously created tests. This session provides the steps required in order to create an assessment from a pool of questions, question set or previous test.

Blackboard has a test function that provides powerful customizability for your assessments. In this session, we answer the following questions: Does it have to be a test? Where do I create a test? How do I add questions? How do I deploy the test to learners? How do I make exceptions and accommodations for learners?

Blackboard’s Grade Center can be an extremely useful tool for tracking student success and providing critical information to students. This 90-minute session is for instructors who want to dig a little deeper into the functionality of the Grade Center. You will learn about setting up and organizing Grade Center, and the difference between scores and weights.

Blackboard offers many tools that you can use to build your course online. This session focuses on the basics of Blackboard, including how to navigate the platform, add learning content, use basic communication tools, create assignments and apply the Humber Template to your course site. The Blackboard Basics Quick Reference Card can help you transfer what you have learned in this session to your course.

After participating in Creating Accessible Documents 1 (CAD1) and 2 (CAD2), you feel confident that you can recognize accessibility issues in digital documents and are able to apply the AODA and WCAG guidelines and requirements to make them accessible. Now that you have gone back to your own materials and documents and given accessibility a try, you may have some further questions.

Creating Accessible Documents 3 (CAD3) is a 2-hour drop-in working session where you can get those questions answered. Come to this session with your documents that you would like to make accessible and that you may have questions about. We will brainstorm, analyze, and find solutions to your accessibility issues!

Learning Outcomes

  • Modify documents to comply with accessibility standards.
  • Use accessibility features in PowerPoint, Word, and Acrobat Pro.
  • Identify accessibility issues in documents.
  • Create accessible documents.
  • Earn the Digital Document Accessibility Training Certificate of Achievement (DDAT) upon completion of CAD1, CAD2, and CAD3.

Prerequisites: CAD1 and CAD2

Please bring at least one document (PowerPoint, Word, or PDF) that you would like to make accessible and have questions about.

Find out how to create videos in Panopto, Humber’s new video streaming platform. Topics include: Using the Mac and Windows applications to capture your content, using the Panopto video editor, and creating/uploading content using your mobile phone! Before engaging in this session, we highly recommend the “Intro to Panopto” module.

Find out about Panopto, Humber’s new streaming platform. Topics include: uploading a video, enabling closed captioning, navigating the Humber folder structure, and leveraging Panopto’s built-in “Smart Search” to maximize your content discovery, amongst others.

Students studying various disciplines will, upon graduation, be required to work alongside a range of others trained in different disciplines. Research has shown that interprofessional teaching can have significant benefits in helping students appreciate the potential contributions of those from other professions and learn the essential communication and collaborative skills to effectively enhance the client or patient experience. In this session, participants will explore opportunities to create memorable and rich interprofessional learning opportunities for their students.

Differentiated Assessment is a best practice in universal design for courses and allows students with different learning styles to engage and produce meaningful work for assessment. Often instructors wish to consider assessment options but time constraints negatively impact the ability to come up with innovative ideas. In this session, participants will learn about the underpinnings to this approach and receive hands on help in expanding their assessment options for students.

Qualitative data collection methods can generate an overwhelming amount of data. The process of organizing and analyzing this data can be daunting. Attendees of this interactive session will walk through the process of organizing, and analyzing qualitative data. Specifically, attendees will be guided to develop code or theme lists that reflect the key ideas and patterns found in their data that answer their research questions. Participants are encouraged to bring a set of qualitative data that may be explored during the session.

Thinking about conducting an interview or focus group as a part of a research project, but not sure where to begin? When done effectively, these two research methods can add richness and depth to your study. In this session, we will discuss best practices for effective and ethical interview and focus group facilitation. We can also point you in the right direction for what comes after, with an overview of thematic analysis. Participants are encouraged to bring any projects that may be further developed during this session.

Various disciplines, in academic and business sectors, use qualitative research to observe people in their natural setting or to better understand people’s lived experiences as described by them. Qualitative Research Methods can be used in research studies that require a response to the question ‘why’ and to look at the topic through the eyes of participants. The most often used methods in qualitative research are interviews and observation. In this session, participants will learn what qualitative research is and when it is most appropriate to use it in a study. Anyone with an idea for a research study will benefit from thinking about what methodological approach will be best suited to meet the research objectives.

The iPad is a great tool for presenting content in an engaging and interactive way, especially if your subject matter is analytical or creative. We’ll work with your existing course content and discuss how apps such as GoodNotes and Keynote can enhance communication and interaction during classroom presentations.

Microsoft OneDrive gives you the ability to store and sync up to 1TB of files online. Find out how you can best manage, sync and share files. We can also discuss strategies for backups and organization.

Creating Accessible Documents 2 (CAD2) consists of three parts:

  • Creating Accessible PowerPoint Documents
  • Creating Accessible Word Documents
  • Creating Accessible PDFs

In this 3-hour hands-on session, Creating Accessible Documents 2 (CAD2), we will apply the accessibility guidelines and requirements for digital content learned in Creating Accessible Documents 1 (CAD1) to various documents. You will review the fundamentals of creating accessible materials and modify inaccessible content to comply with accessibility standards. We will dive into PowerPoint, Word and PDF documents and practice making them accessible. You will be introduced to the use of accessibility checkers in PowerPoint, Word and Acrobat Pro and the different features in these software applications to make documents accessible.

  • Apply accessibility guidelines to digital content and documents.
  • Prerequisite: CAD1

With Ontario’s commitment to be a fully accessible province by 2025 through the implementation of the AODA, knowing how to create accessible learning materials is crucial! Humber College has committed to Accessible Education as one of its Strategic Pillars. Creating and providing accessible documents is about inclusion, equity, and social responsibility.

Creating Accessible Documents 1 (CAD1) will review WCAG accessibility guidelines and AODA requirements applicable to creating digital content and documents. In this 2-hour hands-on session, you will learn the fundamentals of creating accessible materials and modify inaccessible content to comply with accessibility standards. You will walk away with strategies and a toolbox full of apps and resources to help with making your content accessible.

  • Identify different types of disabilities.
  • Recognize the different legislation and guidelines related to accessibility.
  • Identify accessibility requirements and issues for digital content.

Creating Accessible Documents 1 (CAD1) and 2 (CAD2) are also part of the Inclusive Curriculum Design Certificate (ICDC). This training, equivalent to one module in the ICDC, can be used to PLAR into the 6-module ICDC certificate.

In this 1-hour lecture-style session, WCAG What? Understanding AODA Compliance for Documents, we will review WCAG accessibility guidelines and AODA requirements applicable to creating digital content and documents. You will walk away with a toolbox full of apps and resources to help with making your content accessible.

In this 1-hour hands-on session, we will apply the accessibility guidelines and requirements for digital content and instructional documents to PowerPoint documents. You will be modifying inaccessible content to comply with accessibility standards.

As well, we will look at PowerPoint’s built-in tools for accessibility: Accessibility Checker, slide layouts and Master Slides, and Reading Order.

You will walk away with strategies to help you make your PowerPoint documents accessible.

  • Apply accessibility guidelines to digital content and PowerPoint documents.
  • Modify PowerPoint documents to comply with accessibility standards.
  • Use accessibility features in PowerPoint.

Prerequisite: WCAG What? Understanding AODA Compliance for Documents or Creating Accessible Documents 1 (CAD1)

In this 1-hour lecture-style session, we will apply the accessibility guidelines and requirements for digital content and instructional documents, learned previously, to PDF documents in Acrobat DC (Pro).

We will look at the following: Acrobat DC’s Accessibility Checker (Full Check), Reading Order, Reflow, and the Tags Pane.

You will walk away with strategies to help with making your PDF documents accessible.

  • Identify accessibility issues in PDF documents using Acrobat DC (Pro).
  • Use accessibility features in Acrobat DC (Pro).

Prerequisite: WCAG What? Understanding AODA Compliance for Documents or Creating Accessible Documents 1 (CAD1) and Creating Accessible Word Documents

In this 1-hour hands-on session, we will apply the accessibility guidelines and requirements for digital content and documents to Word documents. You will be modifying inaccessible content to comply with accessibility standards. As well, we will look at the following in Word with respect to accessibility: Accessibility Checker, Document Structure, and Tables.

You will walk away with strategies to help you make your Word documents accessible.

  • Apply accessibility guidelines to digital content and Word documents.
  • Modify Word documents to comply with accessibility standards.
  • Use accessibility features in Word.

Word clouds are a great way to distil and summarize information. Come and find out various ways you can use word clouds in teaching and learning. Discover several free and easy to use tech tools, including ones built right into PowerPoint and Word!

Engage your students and improve your lessons when using the Sharp classroom touchscreen situated in the HIVES classrooms and other Humber learning spaces. In this hands-on session, you’ll learn evidence-informed practices to plan, prepare, and share your materials using the classroom touchscreen. Participants are welcome to bring their own ideas and materials to the session.

The skills you learn here are transferable to any touchscreen you’ll encounter in and out of the classroom.

Lucidchart is a web app that allows users to collaborate and work together in real time to create flowcharts, organizational charts, website wireframes, mind maps, software prototypes, and many other diagram types. In this hands-on session, you’ll learn the basic functionality of Lucidchart that will empower you to create your very own designs.

Learn how to use Padlet, an online virtual “bulletin board” where students and teachers collaborate, reflect, and share links and files of any type. Padlet sits somewhere between a document and a full-fledged website builder, empowering everyone to make the content they want. To check out the tool, go to padlet.com.

Mind mapping is a visual technique of representing and structuring thoughts and ideas. In this session, we will investigate evidence-informed practices used in the creation and real-world application of mind maps by using state-of-the-art mind mapping software. To get the most out of the session, participants are strongly advised to bring their own laptop, tablet, or phone.

Use Quizlet to create multimedia study sets and share them with your students. They can then use them to learn and review in seven different study modes: learn, flashcards, write, spell, test, match, and the gravity space game. Students can also create their own study sets that can be useful for reviewing or assessing their knowledge. As a premium feature, this tool uses spaced repetition principles to help students study more efficiently and retain information in the long term. For teachers, Quizlet Live is an engaging in-class game mode where students work in teams to correctly match terms and definitions. Learning is fun with Quizlet!

Learn how the Google suite of free apps facilitates sharing and collaborating learning practices. In this hands-on session, we will explore the apps that are useful in an educational setting, inside and outside of the classroom. By the end of the workshop, you will have a working knowledge of Calendar, Docs, Sheets, Slides, Drive, and Keep. Both web and mobile apps will be explored.

Enhance your lessons in the classroom with the latest electronic board technology. The SMART board brings the traditional whiteboard to the next level by adding extended capabilities for sharing and collaborating, endless whiteboards, session live sharing with local and remote users, saving and resuming sessions, and web browsing. The SMART board can be moved to where you need it the most on campus. This is a hands-on session where you’ll use and get comfortable with this technology.

Mobile devices are incredibly powerful tools that can enhance teaching and learning in the classroom and on the go. In this session, you will discover and use some educational apps that can be part of your teaching toolbox and leverage your mobile devices’ capabilities. You will learn how to download and use these essential apps on your smartphone or tablet. An Apple App Store or Google Play Store account is required in order to download the apps on your personal device. A limited number of devices will be available to borrow during the session.

Mentimeter can help you make your classes innovative and memorable. In this hands-on session, you’ll learn how to plan, design, and deliver your interactive lesson. Add polls, quizzes, scales, open ended questions and other interactive tools that can help you engage and interact with your learners. Participants are welcome to bring their ideas and any materials they’d like to work with to the session.

Google Forms is a simple and free way to collect, save and analyze student data. Use it to create class surveys, short quizzes, student inventories and much more. Use the information you collected to help you gauge your students’ understanding, class engagement and anything else you like. In this hands-on session, you’ll learn how to create Google Forms, how to make forms that are easily understandable, how to ask good questions and offer good answer choices. We’ll complete our overview of the tool by learning how to distribute surveys and collect answers.

Find out how Panopto, Humber’s new video streaming platform, integrates with Blackboard. Topics include: sharing videos with your students, leveraging “Assignment Folders” to collect your students’ videos, and integrating quizzes into your creations (It’s SCORM Compliant, too!). Before engaging in this session, we highly recommend the “Intro to Panopto” module.

Rubrics can do much more than communicate assignment requirements. By transferring your rubrics to Blackboard you can:

  • Increase marking ease and efficiency
  • Increase marking consistency and impartiality
  • Quickly provide personalized feedback and inline comments
  • Reduce downloading and printing of assignments

Prerequisites: Current knowledge of rubrics.

The purpose of this session is to provide you with an opportunity to explore Camtasia alongside a Studio Support Specialist, who will guide you through a detailed tour of the software’s interface and essential tools, and provide you with an in-depth look at its many video creation possibilities. Central to this session is our appreciation of the fact that your goals and experiences, as they relate to creating educational content, are unique. With this in mind, we are happy to develop a plan that’s specifically tailored for you so that you can begin to create engaging video content for your students!

Learn how to make simple, animated videos on your computer using free browser-based tools. In this hands-on session, you will create a short animated video with a voiceover that you can use in your teaching.

Want to add a short welcome or course overview video in Blackboard but you don’t want to appear on-screen? Learn how to make simple animated videos on your PC using free tools. In this hands-on session you will experiment with two simple tools that will allow you to create short animated videos. By the end of this session, you’ll have the tools you need to create your own animated video.

Want to add a short welcome or course overview video in Blackboard but you don’t want to appear on-screen? Learn how to make simple animated videos on your Mac using free tools. In this hands-on session you will experiment with two simple tools that will allow you to create short animated videos. By the end of this session, you’ll have the tools you need to create your own animated video.

In this session, we will explore Blackboard tools to help you assess student learning. You will learn:

  • How to create various question types (essay, short answer, multiple choice, true/false)
  • How to randomize questions for tests and surveys
  • How to build tests from pools
  • How to edit test options (number of attempts, timer, forced completion, display dates, due dates, display results)

In this session, you will learn the basics of Blackboard 9.1 and find out how this product can enhance your classroom. Participants will discover how to navigate, add and remove content, use basic communication tools, and upload the Humber Template. Please ensure you have access to Blackboard before the session begins; if you do not have access to a Blackboard site to work in, you will not be able to participate actively in the workshop.

Is your Blackboard Grade Center ready for grades?

In this session, we will focus on common challenges like understanding the difference between scores and weights, creating a weighted grade column, organizing Grade Center, and controlling grade visibility for students.

Have you ever wanted to invite a guest speaker to your classroom virtually?  Or hold virtual office hours?  Or engage with your students in a virtual classroom?  With Blackboard Collaborate Ultra you can do all of the above.  Host a virtual session anytime within or outside of Blackboard and connect with your students.

Learn how to provide accommodation to individuals and groups for assignments and tests in Blackboard.

In this session we will address the following questions:

  • How do I grant an assignment extension?
  • How do I give someone more time to take a test in Blackboard?
  • How do I allow someone to retake a test?
  • How do I allow a student or a group to resubmit an assignment?

Prerequisites: Participants should know how to create assignments and tests in Blackboard.

Learning outcomes are a critical component of the teaching and learning process. They act as a guide for learners and educators and indicate what the learners will know and do by the end of a specified course or program. In this session, participants will learn how to write observable and measurable student learning outcomes that inform course content and assessments.

Compared to children and teenagers, adults have unique needs and requirements as learners.  Understanding the unique characteristics of the adult learner ensures more effective classroom instruction.   During this interactive session, faculty will: 1. identify the unique characteristics of the adult and mature learner; and 2. incorporate andragogic strategies to engage adult and mature learners.

Are you struggling to motivate and engage your students?  Are you finding it increasingly difficult to come up with new ideas on how to bring FUN into the classroom?  This session is designed to provide you with a “go-to basket” of FUN and ENGAGING activities that can be used in any discipline.  Not only will you participate in the “actual activities,” but you will walk away with a package of resources that can immediately be used in your teaching practice.

Personalized student feedback questionnaires (SFQ) can provide faculty with important information about their course. Join with colleagues to generate ideas for developing meaningful, personalized questions. By the end of this session, you will have created your own questions for your SFQ.

Lesson planning helps ensure that curriculum is engaging, consistent, and effectively assessed. A well-designed lesson is rewarding for educators and motivating for students. In this session, participants will explore the elements of an effective lesson plan and receive a template to guide the design of lessons that maximize learning opportunities for all students.

A ‘hot moment’ in the classroom is an emotion-laden moment of conflict or tension that threatens to derail teaching and learning.  ‘Hot moments’ are usually triggered by a comment on a sensitive issue or as a result of classroom dynamics.  Most faculty are uncertain how to effectively respond to a ‘hot moment’.  During this session faculty will:

  • discuss their role in developing and maintaining a positive classroom climate; and
  • apply strategies to cool down a tension filled classroom community.

Submitting a paper for publication in a scholarly journal can be an exciting – and intimidating – experience. In this session, we will walk you through the process from initial submission to final publication. Topics will include how to choose a target journal, how to set up your manuscript for success, how to deal with and respond to peer reviews (the good, the bad, and the ugly), and how to stay patient and keep your confidence intact throughout!

Planning to share your research at an academic conference? In this session, we will explore how to be a great storyteller with your research, so that your presentation (either oral or poster form) will stand out – in a good way!

This session will provide an introductory overview on how to represent textual data in a visual context to help others better understand its significance or value. Participants will learn tips and tricks on data visualization using Microsoft Excel. Topics to be discussed include arranging data for optimal chart creation, when to use each chart type, examples of good and bad data visualizations, and an introduction to intermediate and advanced chart templates (e.g., Histogram, Stacked Bar, Combo, and Sunburst).

Active learning is a pedagogical approach to instruction that involves actively engaging students with the course material.  This student-centered approach places a greater degree of responsibility on the learner compared to teaching practices such as lectures, but instructor guidance is still crucial in the active learning classroom.  This session will: 1) familiarize participants with the active learning approach; and 2) describe various active learning techniques that can be seamlessly implemented in small, large and online class environments.

Planning or interested in developing a survey for a research project? This session is for you! In this module, we will guide you through the best practices of constructing survey questions, building your survey on Survey Monkey or Microsoft Forms, and analyzing and visualizing your data using Microsoft Excel. Participants are encouraged to bring survey-based projects that may be further developed during the session.

Attendees of this interactive session will walk through the process of creating, organizing, and applying codes for a qualitative data set, in an objective and comprehensive way to help answer their research question(s). Participants are encouraged to bring a set of qualitative data that may be explored during the session.


  10. PDF Introduction: Reasoning, Argumentation, and Critical Thinking Instruction

    Willingham D (2007) Critical thinking: why is it so hard to teach? Am Educ 31(2):8-19. (reprinted as: Willingham, D.T. (2008). Critical thinking: Why is it so hard to teach? Arts Education Policy Review, 109 (4): 21-32.) 92 F. Zenker 123. Title: Introduction: Reasoning, Argumentation, and Critical Thinking Instruction

  11. PDF Can Critical Thinking Be Taught?

    Since critical thinking relies so heavily on domain knowledge, educators may wonder if thinking critically in a particular domain is easier to learn. The quick answer is yes, it's a littleeasier. To understand why, let's focus on one domain, science, and examine the development of scien- tific thinking.

  12. ‪Daniel Willingham‬

    Critical thinking: Why is it so hard to teach? DT Willingham. Arts Education Policy Review 109 (4), 21-32, 2008. 728: 2008: The myth of learning styles. C Riener, D Willingham. Change: The magazine of higher learning 42 (5), 32-35, 2010. 663: 2010: The scientific status of learning styles theories.

  13. Critical thinking

    Critical thinking is not a set of skills that can be deployed at any time, in any context. It is a type of thought that even 3-year-olds can engage in - and even trained scientists can fail in. Reference: Willingham, D. T. (2008). Critical thinking: Why is it so hard to teach?

  14. PDF Critical Thinking: Why Is It So Hard to Teach?

    Vol. 109, No. 4, March/April 2008 23 are especially relevant for educators: familiarity with a problem's deep struc-ture and the knowledge that one should

  15. CRITICAL THINKING: WHY IS IT HARD TO TEACH?

    John L. Irwin. Education. Journal of Advanced Research in Social Sciences…. 2019. Critical Thinking (CT) is terminology used in many higher learning institutions as a stated purpose for their existence. It is spoken of as an objective for the instructor and as an outcome for the…. Expand. 1. Highly Influenced.

  16. A Crash Course in Critical Thinking

    Here is a series of questions you can ask yourself to try to ensure that you are thinking critically. Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion ...

  17. 'Why is this hard, to have critical thinking?' Exploring the factors

    Arguably, critical thinking does not come naturally to anyone, regardless of background. As van Gelder (2005) points out, 'critical thinking is hard . . . and most people are just not very good at it' (p. 42). Becoming 'good at it' is a life-long journey which starts early'.

  18. Critical Thinking: Why Is It So Hard to Teach?

    Element D: Teachers establish and communicate high expectations and use processes to support the development of critical-thinking and problem-solving skills. This article takes a deep dive into what it really means for students to think critically. Professor of Cognitive Psychology, Daniel Willingham, details the latest research in an effort ...

  19. Critical Thinking: Why is it so hard to teach?

    Willingham notes that background knowledge plays a role in critical thinking and understanding the surface structure of problems. He provides evidence-based strategies, such as promoting thinking within a particular domain to bring everyday thinking into the classroom. Curriculum Connection

  20. Critical Thinking: Why Is It So Hard to Teach?

    In this article, I will describe the nature of critical thinking, explain why it is so hard to do and to teach, and explore how students acquire a specific type of critical thinking: thinking scientifically. Along the way, we'll see that critical thinking is not a set of skills that can be deployed at any time, in any context.

  21. Getting Students Comfortable with Critical Thinking

    Critical thinking: Why is it so hard to teach? American Educator , 31 , 8-19. Bryan Goodwin is the president and CEO of McREL International, a Denver-based nonprofit education research and development organization.

  22. What influences students' abilities to critically evaluate scientific

    Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible ... Critical thinking: Why is it so hard to teach?. Arts Educ Policy Rev. 2008. Mar 1; 109 (4):21-32. doi: 10.3200/AEPR.109.4.21-32 [Google Scholar]

  23. Think Critically Before Thinking Critically

    Critical thinking: Why is it so hard to teach? American Educator, 8-19. Wineburg, S., & McGrew, S. (2017). Lateral reading: Reading less and learning more when evaluating digital information.

  24. Critical Thinking: Why Is It So Hard To Teach?

    Case Study Groups - provides a format for students to exercise and develop their critical thinking skills in the submission of their analysis and recommendations for the case. It also provides them with the ability to work effectively in groups (a key component of the skills required by the financial services industry).