Rubric for Evaluating Student Presentations

  • Kellie Hayden
  • Categories : Student assessment tools & principles
  • Tags : Teaching methods, tools & strategies

Make Assessing Easier with a Rubric

The rubric you use to assess student presentations needs to be clear and easy for your students to read. A well-thought-out rubric will also make speeches easier to grade.

Before directing students to create a presentation, show them the rubric so they know how they will be evaluated. Every rubric lists certain criteria, or specific areas to be assessed. For the rubric download included here, the criteria are: content, eye contact, volume and clarity, flow, confidence and attitude, visual aids, and time.

Student Speech Presentation Rubric Download

Assessment Tool Explained in Detail

Use a Rubric to Assess Presentations

Content: The information in the speech should be organized. It should have an engaging introduction that grabs the audience’s attention. The body of the speech should include details, facts, and statistics to support the main idea. The conclusion should wrap up the speech and leave the audience with something to remember.

In addition, the speech should be accurate. Teachers should decide how students should cite their sources if they are used. These should be turned in at the time of the speech. Good speakers will mention their sources during the speech.

Last, the content should be clear. The information should be understandable for the audience and not confusing or ambiguous.

Eye Contact

Students' eyes should not be riveted to the paper or note cards they prepare for the presentation. It is best if students write talking points on their note cards: the main points they want to discuss. If students write the whole speech on the note cards, they are more likely to read it word-for-word, which is boring and usually monotone.

Students should not stare at one person or at the floor. It is best if they can make eye contact with everyone in the room at least once during the presentation. Staring at a spot on the wall is not great, but is better than staring at their shoes or their papers.

Volume and Clarity

Students should be loud enough so that people sitting in the back of the room can hear and understand them. They should not scream or yell. They need to practice using their diaphragm to project their voice.

Clarity means not talking too fast, mumbling, slurring or stuttering. When students are nervous, this tends to happen. Practice will help with this problem.

When speaking, the speaker should not have distracting pauses during the speech. Sometimes a speaker may pause for effect; this is to tell the audience that what he or she is going to say next is important. However, when students pause because they become confused or forget the speech, this is distracting.

Another problem is verbal fillers. Students may say “um,” “er,” or “uh” while thinking or between ideas. Some people do it unintentionally when they are nervous.

If students chronically say “um” or use any other verbal filler, they first need to be made aware of the problem while practicing. A trusted friend can point out each filler as it happens during practice, which helps students notice when they are using them.

Confidence and Attitude

When students speak, they should stand tall and exude confidence to show that what they are about to say is important. Even if they are nervous or unsure about their speech, they should not slouch. They need to deliver the speech with enthusiasm and poise. If it appears that the student does not care about his or her topic, why should the audience? Confidence can often make a boring speech topic memorable.

Visual Aids

The visual that a student uses should aid the speech. This aid should explain a fact or an important point in more detail with graphics, diagrams, pictures, or graphs.

These can be presented as projected diagrams, large photos, posters, electronic slide presentations, short clips of videos, 3-D models, etc. It is important that all visual aids be neat, creative and colorful. A poorly executed visual aid can take away from a strong speech.

One of the biggest mistakes that students make is that they do not mention the visual aid in the speech. Students need to plan when the visual aid will be used in the speech and what they will say about it.

Another problem with slide presentations is that students read word-for-word what is on each slide. The audience can read. Students need to talk about the slide and/or offer additional information that is not on the slide.

Time

The teacher needs to set the time limit. Some teachers like to give a range: for example, short speeches might be 1-2 minutes or 2-5 minutes, while longer ones could be 10-15 minutes. Many students will not speak long enough, while others will ramble on well beyond the limit. The best way for students to stay within the time limit is to practice.

The key to a good speech is for students to write out an outline, make note cards and practice. The speech presentation rubric allows your students to understand your expectations.


15 Helpful Scoring Rubric Examples for All Grades and Subjects

In the end, they actually make grading easier.

Collage of scoring rubric examples including written response rubric and interactive notebook rubric

When it comes to student assessment and evaluation, there are a lot of methods to consider. In some cases, testing is the best way to assess a student’s knowledge, and the answers are either right or wrong. But often, assessing a student’s performance is much less clear-cut. In these situations, a scoring rubric is often the way to go, especially if you’re using standards-based grading . Here’s what you need to know about this useful tool, along with lots of rubric examples to get you started.

What is a scoring rubric?

In the United States, a rubric is a guide that lays out the performance expectations for an assignment. It helps students understand what’s required of them, and guides teachers through the evaluation process. (Note that in other countries, the term “rubric” may instead refer to the set of instructions at the beginning of an exam. To avoid confusion, some people use the term “scoring rubric” instead.)

A rubric generally has three parts:

  • Performance criteria: These are the various aspects on which the assignment will be evaluated. They should align with the desired learning outcomes for the assignment.
  • Rating scale: This could be a number system (often 1 to 4) or words like “exceeds expectations, meets expectations, below expectations,” etc.
  • Indicators: These describe the qualities needed to earn a specific rating for each of the performance criteria. The level of detail may vary depending on the assignment and the purpose of the rubric itself.

Rubrics take more time to develop up front, but they help ensure more consistent assessment, especially when the skills being assessed are more subjective. A well-developed rubric can actually save teachers a lot of time when it comes to grading. What’s more, sharing your scoring rubric with students in advance often helps improve performance . This way, students have a clear picture of what’s expected of them and what they need to do to achieve a specific grade or performance rating.

Learn more about why and how to use a rubric here.

Types of Rubric

There are three basic rubric categories, each with its own purpose.

Holistic Rubric

A holistic scoring rubric laying out the criteria for a rating of 1 to 4 when creating an infographic

Source: Cambrian College

This type of rubric combines all the scoring criteria in a single scale. They’re quick to create and use, but they have drawbacks. If a student’s work spans different levels, it can be difficult to decide which score to assign. They also make it harder to provide feedback on specific aspects.

Traditional letter grades are a type of holistic rubric. So are the popular “hamburger rubric” and “cupcake rubric” examples. Learn more about holistic rubrics here.

Analytic Rubric

Layout of an analytic scoring rubric, describing the different sections like criteria, rating, and indicators

Source: University of Nebraska

Analytic rubrics are much more complex and generally take a great deal more time up front to design. They include specific details of the expected learning outcomes, and descriptions of what criteria are required to meet various performance ratings in each. Each rating is assigned a point value, and the total number of points earned determines the overall grade for the assignment.
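As a concrete sketch of that scoring mechanism (the criterion names and point values below are illustrative, not taken from any specific rubric), an analytic rubric score is simply the sum of per-criterion ratings:

```python
# Illustrative analytic rubric: each criterion is rated 1-4 points,
# and the sum of the ratings determines the overall score.
MAX_POINTS = {
    "content": 4,
    "organization": 4,
    "delivery": 4,
    "visual aids": 4,
}

def score(ratings):
    """Return (points earned, points possible) for one student's ratings."""
    for criterion, rating in ratings.items():
        if not 1 <= rating <= MAX_POINTS[criterion]:
            raise ValueError(f"rating for {criterion!r} out of range")
    return sum(ratings.values()), sum(MAX_POINTS.values())

earned, possible = score(
    {"content": 4, "organization": 3, "delivery": 3, "visual aids": 2}
)
print(f"{earned}/{possible} points")  # 12/16 points
```

Weighting a criterion more heavily would just mean multiplying its rating by a weight before summing.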

Though they’re more time-intensive to create, analytic rubrics actually save time while grading. Teachers can simply circle or highlight any relevant phrases in each rating, and add a comment or two if needed. They also help ensure consistency in grading, and make it much easier for students to understand what’s expected of them.

Learn more about analytic rubrics here.

Developmental Rubric

A developmental rubric for kindergarten skills, with illustrations to describe the indicators of criteria

Source: Deb’s Data Digest

A developmental rubric is a type of analytic rubric, but it’s used to assess progress along the way rather than determining a final score on an assignment. The details in these rubrics help students understand their achievements, as well as highlight the specific skills they still need to improve.

Developmental rubrics are essentially a subset of analytic rubrics. They leave off the point values, though, and focus instead on giving feedback using the criteria and indicators of performance.

Learn how to use developmental rubrics here.

Ready to create your own rubrics? Find general tips on designing rubrics here. Then, check out these examples across all grades and subjects to inspire you.

Elementary School Rubric Examples

These elementary school rubric examples come from real teachers who use them with their students. Adapt them to fit your needs and grade level.

Reading Fluency Rubric

A developmental rubric example for reading fluency

You can use this one as an analytic rubric by counting up points to earn a final score, or just to provide developmental feedback. There’s a second rubric page available specifically to assess prosody (reading with expression).

Learn more: Teacher Thrive

Reading Comprehension Rubric

Reading comprehension rubric, with criteria and indicators for different comprehension skills

The nice thing about this rubric is that you can use it at any grade level, for any text. If you like this style, you can get a reading fluency rubric here too.

Learn more: Pawprints Resource Center

Written Response Rubric

Rubrics aren’t just for huge projects. They can also help kids work on very specific skills, like this one for improving written responses on assessments.

Learn more: Dianna Radcliffe: Teaching Upper Elementary and More

Interactive Notebook Rubric

Interactive Notebook rubric example, with criteria and indicators for assessment

If you use interactive notebooks as a learning tool , this rubric can help kids stay on track and meet your expectations.

Learn more: Classroom Nook

Project Rubric

Rubric that can be used for assessing any elementary school project

Use this simple rubric as it is, or tweak it to include more specific indicators for the project you have in mind.

Learn more: Tales of a Title One Teacher

Behavior Rubric

Rubric for assessing student behavior in school and classroom

Developmental rubrics are perfect for assessing behavior and helping students identify opportunities for improvement. Send these home regularly to keep parents in the loop.

Learn more: Teachers.net Gazette

Middle School Rubric Examples

In middle school, use rubrics to offer detailed feedback on projects, presentations, and more. Be sure to share them with students in advance, and encourage them to use them as they work so they’ll know if they’re meeting expectations.

Argumentative Writing Rubric

An argumentative rubric example to use with middle school students

Argumentative writing is a part of language arts, social studies, science, and more. That makes this rubric especially useful.

Learn more: Dr. Caitlyn Tucker

Role-Play Rubric

A rubric example for assessing student role play in the classroom

Role-plays can be really useful when teaching social and critical thinking skills, but it’s hard to assess them. Try a rubric like this one to evaluate and provide useful feedback.

Learn more: A Question of Influence

Art Project Rubric

A rubric used to grade middle school art projects

Art is one of those subjects where grading can feel very subjective. Bring some objectivity to the process with a rubric like this.

Source: Art Ed Guru

Diorama Project Rubric

A rubric for grading middle school diorama projects

You can use diorama projects in almost any subject, and they’re a great chance to encourage creativity. Simplify the grading process and help kids know how to make their projects shine with this scoring rubric.

Learn more: Historyourstory.com

Oral Presentation Rubric

Rubric example for grading oral presentations given by middle school students

Rubrics are terrific for grading presentations, since you can include a variety of skills and other criteria. Consider letting students use a rubric like this to offer peer feedback too.

Learn more: Bright Hub Education

High School Rubric Examples

In high school, it’s important to include your grading rubrics when you give assignments like presentations, research projects, or essays. Kids who go on to college will definitely encounter rubrics, so helping them become familiar with them now will help in the future.

Presentation Rubric

Example of a rubric used to grade a high school project presentation

Analyze a student’s presentation both for content and communication skills with a rubric like this one. If needed, create a separate one for content knowledge with even more criteria and indicators.

Learn more: Michael A. Pena Jr.

Debate Rubric

A rubric for assessing a student's performance in a high school debate

Debate is a valuable learning tool that encourages critical thinking and oral communication skills. This rubric can help you assess those skills objectively.

Learn more: Education World

Project-Based Learning Rubric

A rubric for assessing high school project based learning assignments

Implementing project-based learning can be time-intensive, but the payoffs are worth it. Try this rubric to make student expectations clear and end-of-project assessment easier.

Learn more: Free Technology for Teachers

100-Point Essay Rubric

Rubric for scoring an essay with a final score out of 100 points

Need an easy way to convert a scoring rubric to a letter grade? This example for essay writing earns students a final score out of 100 points.

Learn more: Learn for Your Life
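That kind of conversion is easy to make mechanical. A minimal sketch (the cutoffs below are a common 10-point scale, assumed here for illustration rather than taken from the linked rubric):

```python
def letter_grade(points):
    """Map a 0-100 rubric total to a letter grade using 10-point cutoffs."""
    if not 0 <= points <= 100:
        raise ValueError("points must be between 0 and 100")
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if points >= cutoff:
            return letter
    return "F"

print(letter_grade(87))  # prints B
```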

Drama Performance Rubric

A rubric teachers can use to evaluate a student's participation and performance in a theater production

If you’re unsure how to grade a student’s participation and performance in drama class, consider this example. It offers lots of objective criteria and indicators to evaluate.

Learn more: Chase March

How do you use rubrics in your classroom? Come share your thoughts and exchange ideas in the WeAreTeachers HELPLINE group on Facebook .

Plus, 25 of the best alternative assessment ideas.

Scoring rubrics help establish expectations and ensure assessment consistency. Use these rubric examples to help you design your own.


Rubric Best Practices, Examples, and Templates

A rubric is a scoring tool that identifies the different criteria relevant to an assignment, assessment, or learning outcome and states the possible levels of achievement in a specific, clear, and objective way. Use rubrics to assess project-based student work including essays, group projects, creative endeavors, and oral presentations.

Rubrics can help instructors communicate expectations to students and assess student work fairly, consistently and efficiently. Rubrics can provide students with informative feedback on their strengths and weaknesses so that they can reflect on their performance and work on areas that need improvement.

How to Get Started

Step 1: Analyze the assignment

The first step in the rubric creation process is to analyze the assignment or assessment for which you are creating a rubric. To do this, consider the following questions:

  • What is the purpose of the assignment and your feedback? What do you want students to demonstrate through the completion of this assignment (i.e. what are the learning objectives measured by it)? Is it a summative assessment, or will students use the feedback to create an improved product?
  • Does the assignment break down into different or smaller tasks? Are those tasks as important as the assignment as a whole?
  • What would an “excellent” assignment look like? An “acceptable” assignment? One that still needs major work?
  • How detailed do you want the feedback you give students to be? Do you want/need to give them a grade?

Step 2: Decide what kind of rubric you will use

Types of rubrics: holistic, analytic/descriptive, single-point

Holistic Rubric. A holistic rubric includes all the criteria (such as clarity, organization, mechanics, etc.) to be considered together and included in a single evaluation. With a holistic rubric, the rater or grader assigns a single score based on an overall judgment of the student’s work, using descriptions of each performance level to assign the score.

Advantages of holistic rubrics:

  • Can place an emphasis on what learners can demonstrate rather than what they cannot
  • Save grader time by minimizing the number of evaluations to be made for each student
  • Can be used consistently across raters, provided they have all been trained

Disadvantages of holistic rubrics:

  • Provide less specific feedback than analytic/descriptive rubrics
  • Can be difficult to choose a score when a student’s work is at varying levels across the criteria
  • Any weighting of criteria cannot be indicated in the rubric

Analytic/Descriptive Rubric . An analytic or descriptive rubric often takes the form of a table with the criteria listed in the left column and with levels of performance listed across the top row. Each cell contains a description of what the specified criterion looks like at a given level of performance. Each of the criteria is scored individually.

Advantages of analytic rubrics:

  • Provide detailed feedback on areas of strength or weakness
  • Each criterion can be weighted to reflect its relative importance

Disadvantages of analytic rubrics:

  • More time-consuming to create and use than a holistic rubric
  • May not be used consistently across raters unless the cells are well defined
  • May result in giving less personalized feedback

Single-Point Rubric. A single-point rubric breaks down the components of an assignment into different criteria, but instead of describing different levels of performance, only the “proficient” level is described. Feedback space is provided for instructors to give individualized comments to help students improve and/or show where they excelled beyond the proficiency descriptors.

Advantages of single-point rubrics:

  • Easier to create than an analytic/descriptive rubric
  • Perhaps more likely that students will read the descriptors
  • Areas of concern and excellence are open-ended
  • May remove a focus on the grade/points
  • May increase student creativity in project-based assignments

Disadvantage of single-point rubrics: Requires more work for instructors writing feedback

Step 3 (Optional): Look for templates and examples.

You might Google, “Rubric for persuasive essay at the college level” and see if there are any publicly available examples to start from. Ask your colleagues if they have used a rubric for a similar assignment. Some examples are also available at the end of this article. These rubrics can be a great starting point for you, but consider steps 4, 5, and 6 below to ensure that the rubric matches your assignment description, learning objectives, and expectations.

Step 4: Define the assignment criteria

Make a list of the knowledge and skills you are measuring with the assignment/assessment. Refer to your stated learning objectives, the assignment instructions, past examples of student work, etc. for help.

  Helpful strategies for defining grading criteria:

  • Collaborate with co-instructors, teaching assistants, and other colleagues
  • Brainstorm and discuss with students
  • Test each criterion: Can it be observed and measured? Is it important and essential? Is it distinct from the other criteria? Is it phrased in precise, unambiguous language?
  • Revise the criteria as needed
  • Consider whether some criteria are more important than others, and how you will weight them.

Step 5: Design the rating scale

Most ratings scales include between 3 and 5 levels. Consider the following questions when designing your rating scale:

  • Given what students are able to demonstrate in this assignment/assessment, what are the possible levels of achievement?
  • How many levels would you like to include? (More levels means more detailed descriptions.)
  • Will you use numbers and/or descriptive labels for each level of performance? (for example 5, 4, 3, 2, 1 and/or Exceeds expectations, Accomplished, Proficient, Developing, Beginning, etc.)
  • Don’t use too many columns, and recognize that some criteria can have more columns than others. The rubric needs to be comprehensible and organized. Pick the right number of columns so that the criteria flow logically and naturally across levels.

Step 6: Write descriptions for each level of the rating scale

Artificial intelligence tools like ChatGPT have proven useful for creating rubrics. You will want to engineer the prompt you provide the AI assistant to ensure you get what you want. For example, you might include the assignment description, the criteria you feel are important, and the number of performance levels you want in your prompt. Use the results as a starting point, and adjust the descriptions as needed.

Building a rubric from scratch

For a single-point rubric , describe what would be considered “proficient,” i.e. B-level work, and provide that description. You might also include suggestions for students outside of the actual rubric about how they might surpass proficient-level work.

For analytic and holistic rubrics, create statements of expected performance at each level of the rubric.

  • Consider what descriptor is appropriate for each criterion, e.g., presence vs. absence, complete vs. incomplete, many vs. none, major vs. minor, consistent vs. inconsistent, always vs. never. If you have an indicator described in one level, it will need to be described in each level.
  • You might start with the top/exemplary level. What does it look like when a student has achieved excellence for each/every criterion? Then, look at the “bottom” level. What does it look like when a student has not achieved the learning goals in any way? Then, complete the in-between levels.
  • For an analytic rubric , do this for each particular criterion of the rubric so that every cell in the table is filled. These descriptions help students understand your expectations and their performance in regard to those expectations.

Well-written descriptions:

  • Describe observable and measurable behavior
  • Use parallel language across the scale
  • Indicate the degree to which the standards are met

Step 7: Create your rubric

Create your rubric in a table or spreadsheet in Word, Google Docs, Sheets, etc., and then transfer it by typing it into Moodle. You can also use online tools to create the rubric, but you will still have to type the criteria, indicators, levels, etc., into Moodle. Rubric creators: Rubistar , iRubric

Step 8: Pilot-test your rubric

Prior to implementing your rubric on a live course, obtain feedback from:

  • Teaching assistants

Try out your new rubric on a sample of student work. After you pilot-test your rubric, analyze the results to consider its effectiveness and revise accordingly.

  • Limit the rubric to a single page for reading and grading ease
  • Use parallel language . Use similar language and syntax/wording from column to column. Make sure that the rubric can be easily read from left to right or vice versa.
  • Use student-friendly language . Make sure the language is learning-level appropriate. If you use academic language or concepts, you will need to teach those concepts.
  • Share and discuss the rubric with your students . Students should understand that the rubric is there to help them learn, reflect, and self-assess. If students use a rubric, they will understand the expectations and their relevance to learning.
  • Consider scalability and reusability of rubrics. Create rubric templates that you can alter as needed for multiple assignments.
  • Maximize the descriptiveness of your language. Avoid words like “good” and “excellent.” For example, instead of saying, “uses excellent sources,” you might describe what makes a resource excellent so that students will know. You might also consider reducing the reliance on quantity, such as a number of allowable misspelled words. Focus instead, for example, on how distracting any spelling errors are.

Example of an analytic rubric for a final paper

Example of a holistic rubric for a final paper

Single-point rubric

More examples:

  • Single Point Rubric Template ( variation )
  • Analytic Rubric Template (make a copy to edit)
  • A Rubric for Rubrics
  • Bank of Online Discussion Rubrics in different formats
  • Mathematical Presentations Descriptive Rubric
  • Math Proof Assessment Rubric
  • Kansas State Sample Rubrics
  • Design Single Point Rubric

Technology Tools: Rubrics in Moodle

  • Moodle Docs: Rubrics
  • Moodle Docs: Grading Guide (use for single-point rubrics)

Tools with rubrics (other than Moodle)

  • Google Assignments
  • Turnitin Assignments: Rubric or Grading Form

Other resources

  • DePaul University (n.d.). Rubrics .
  • Gonzalez, J. (2014). Know your terms: Holistic, Analytic, and Single-Point Rubrics . Cult of Pedagogy.
  • Goodrich, H. (1996). Understanding rubrics. Teaching for Authentic Student Performance, 54(4), 14-17.
  • Miller, A. (2012). Tame the beast: tips for designing and using rubrics.
  • Ragupathi, K., Lee, A. (2020). Beyond Fairness and Consistency in Grading: The Role of Rubrics in Higher Education. In: Sanger, C., Gleason, N. (eds) Diversity and Inclusion in Global Higher Education. Palgrave Macmillan, Singapore.

Creating an Oral Presentation Rubric

In-Class Activity

This activity helps students clarify the oral presentation genre; do it after distributing an assignment – in this case, a standard individual oral presentation near the end of the semester that allows students to practice public speaking while also workshopping their final paper argument. Together, the class will determine the criteria by which their presentations should – and should not – be assessed.

Guide to Oral/Signed Communication in Writing Classrooms

Purpose: To collaboratively determine the requirements for students’ oral presentations; to clarify the audience’s expectations of this genre

Keywords: rhetorical situation; genre; metacognition; oral communication; rubric; assessment; collaboration

  • Ask students to free-write and think about these questions: What makes a good oral presentation? Think of examples of oral presentations that you’ve seen, one “bad” and one “good.” They can be from any genre–for example, a course lecture, a museum talk, a presentation you have given, even a video. Jot down specific strengths and weaknesses.
  • Facilitate a full-class discussion to list the important characteristics of an oral presentation. Group things together. For example, students may say “speaking clearly” as a strength; elicit specifics (intonation, pace, etc.) and encourage them to elaborate.
  • Clarify to students that the more they add to the list, the more information they will have about expectations on the oral presentation rubric. If they do not add enough items, or specific enough items, they won’t know what to aim for or how they will be assessed.
  • Review the list on the board and ask students to decide what they think are the most important parts of their oral presentations, ranking their top three components.
  • Create a second list to the side of the board, called “Let it slide,” asking students what, as a class, they should “let slide” in the oral presentations. Guide and elaborate, choosing whether to reject, accept, or compromise on the students’ proposals.
  • Distribute the two lists to students as-is as a checklist-style rubric, or flesh the primary list out into a full analytic rubric.

Here’s an example of one possible rubric created from this activity; here’s another example of an oral presentation rubric that assesses only the delivery of the speech/presentation, and which can be used by classmates to evaluate each other.

How to (Effectively) Use a Presentation Grading Rubric

Almost all higher education courses these days require students to give a presentation, which can be a beast to grade. But there’s a simple tool to keep your evaluations on track. 

Enter: The presentation grading rubric.

With a presentation grading rubric, giving feedback is simple. Rubrics help instructors standardize criteria and provide consistent scoring and feedback for each presenter. 

How can presentation grading rubrics be used effectively? Here are 5 ways to make the most of your rubrics. 

1. Find a Good Customizable Rubric

There’s practically no limit to how rubrics are used, and there are oodles of presentation rubrics on Pinterest and Google Images. But not all rubrics are created equal. 

Professors need to be picky when choosing a presentation rubric for their courses. Rubrics should clearly define the target that students are aiming for and describe performance. 

2. Fine-Tune Your Rubric

Make sure your rubric accurately reflects the expectations you have for your students. It may be helpful to ask a colleague or peer to review your rubric before putting it to use. After using it for an assignment, you could take notes on the rubric’s efficiency as you grade. 

You may need to tweak your rubric to correct common misunderstandings or meet the criteria for a specific assignment. Make adjustments as needed and frequently review your rubric to maximize its effectiveness. 

3. Discuss the Rubric Beforehand

On her blog Write-Out-Loud, Susan Dugdale advises against keeping rubrics a secret. Rubrics should be openly discussed before a presentation is given, so make sure reviewing your rubric with students is listed on your lesson plan.

Set aside time to discuss the criteria with students ahead of presentation day so they know where to focus their efforts. To help students better understand the rubric, play a clip of a presentation and have students use the rubric to grade the video. Go over what grade students gave the presentation and why, based on the rubric’s standards. Then explain how you would grade the presentation as an instructor. This will help your students internalize the rubric as they prepare for their presentations.

4. Use the Rubric Consistently

Rubrics help maintain fairness in grading. When presentation time arrives, use a consistent set of grading criteria across all speakers to keep grading unbiased. 

One effective application of rubrics is to assign quantitative values to students across a cohort and over multiple presentations. These values show which students made the most progress and where they started out (relative to the rest of their class). Taken together, this data tells the story of how effective or ineffective the feedback has been.
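
Tracking those quantitative values amounts to simple bookkeeping. Here is a minimal sketch, assuming rubric totals are stored per student in presentation order; the data layout, student names, and helper name are invented for illustration, not taken from the article:

```python
# Sketch: tracking rubric scores across a cohort over multiple presentations.
# The data layout and names below are illustrative assumptions.

def progress_report(scores):
    """scores maps student -> list of rubric totals, in presentation order.
    Returns (student, first_score, growth) rows, sorted by growth, descending."""
    report = [
        (student, totals[0], totals[-1] - totals[0])
        for student, totals in scores.items()
        if len(totals) >= 2
    ]
    return sorted(report, key=lambda row: row[2], reverse=True)

cohort = {
    "Avery": [2.4, 2.9, 3.3],   # steady improvement
    "Blake": [3.1, 3.0, 3.2],   # roughly flat
    "Casey": [1.8, 2.6, 3.0],   # largest gain, lowest start
}

for student, start, growth in progress_report(cohort):
    print(f"{student}: started at {start}, changed by {growth:+.1f}")
```

Sorting by growth while keeping the starting score visible is what lets the report answer both questions at once: who progressed most, and where they started relative to the class.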

5. Share Your Feedback

If you’re using an electronic system, sharing feedback might be automatic. If you’re using paper, try to give copies to presenters as soon as possible. This will help them incorporate your feedback while everything is still fresh in their minds. 

If you’re looking to use rubrics electronically, check out GoReact, the #1 video platform for skill development. GoReact allows you to capture student presentations on video for feedback, grading, and critique. The software includes a rubric builder that you can apply to recordings of any kind of presentation.

Presenters can receive real-time feedback by live recording directly to GoReact with a webcam or smartphone. Instructors and peers submit feedback during the presentation. Students improve astronomically. 

A presentation grading rubric is a simple way to keep your evaluations on track. Remember to find a customizable rubric, fine-tune it as needed, discuss the criteria beforehand, apply a consistent set of grading criteria, and share your feedback quickly.

By following these five steps, both you and your students can reap the benefits that great rubrics have to offer.


Rubric formats for the formative assessment of oral presentation skills acquisition in secondary education

  • Development Article
  • Open access
  • Published: 20 July 2021
  • Volume 69 , pages 2663–2682, ( 2021 )


  • Rob J. Nadolski   ORCID: orcid.org/0000-0002-6585-0888 1 ,
  • Hans G. K. Hummel 1 ,
  • Ellen Rusman 1 &
  • Kevin Ackermans 1  


Acquiring complex oral presentation skills is cognitively demanding for students and requires intensive teacher guidance. The aim of this study was twofold: (a) to identify and apply design guidelines in developing an effective formative assessment method for oral presentation skills during classroom practice, and (b) to develop and compare two analytic rubric formats as part of that assessment method. Participants were first-year secondary school students in the Netherlands (n = 158) who acquired oral presentation skills with the support of either a formative assessment method with analytic rubrics offered through a dedicated online tool (experimental groups), or a method using more conventional (rating-scale) rubrics (control group). One experimental group was provided text-based rubrics and the other video-enhanced rubrics. No prior research is known about analytic video-enhanced rubrics, but, based on research on complex skill development and multimedia learning, we expected this format to best capture the (non-verbal aspects of) oral presentation performance. Significant positive differences in oral presentation performance were found between the experimental groups and the control group. However, no significant differences were found between the two experimental groups. This study shows that a well-designed formative assessment method, using analytic rubric formats, outperforms formative assessment using more conventional rubric formats. It also shows that the higher costs of developing video-enhanced analytic rubrics cannot be justified by significantly greater performance gains. Future studies should address the generalizability of such formative assessment methods to other contexts and to complex skills other than oral presentation, and should lead to a more profound understanding of video-enhanced rubrics.



Introduction

Both practitioners and scholars agree that students should be able to present orally (e.g., Morreale & Pearson, 2008; Smith & Sodano, 2011). Oral presentation involves the development and delivery of messages to the public with attention to vocal variety, articulation, and non-verbal signals, with the aim of informing, self-expressing, relating to, and persuading listeners (Baccarini & Bonfanti, 2015; De Grez et al., 2009a; Quianthy, 1990). The current study is restricted to informative presentations (as opposed to persuasive presentations), as these are most common in secondary education. Oral presentation skills are complex generic skills of increasing importance for both society and education (Voogt & Roblin, 2012). However, secondary education seems to lack instructional design guidelines for supporting oral presentation skills acquisition. Many secondary schools in the Netherlands struggle with how to teach and assess students’ oral presentation skills, lack clear performance criteria for oral presentations, and fall short in offering adequate formative assessment methods that support the effective acquisition of these skills (Sluijsmans et al., 2013).

Many researchers agree that the acquisition and assessment of presentation skills should depart from a socio-cognitive perspective (Bandura, 1986) with emphasis on observation, practice, and feedback. Students practice new presentation skills by observing other presentations as modeling examples, then practice their own presentation, and finally address the feedback they receive by adjusting their presentation towards the required level. Evidently, delivering effective oral presentations requires much preparation, rehearsal, and practice, interspersed with good feedback, preferably from oral presentation experts. However, large class sizes in Dutch secondary schools offer only limited opportunities for teacher-student interaction, and even fewer opportunities for practice. Based on research on complex skill development and multimedia learning, it can be expected that video-enhanced analytic rubric formats best capture and guide oral presentation performance, since much non-verbal behavior cannot be captured in text (Van Gog et al., 2014; Van Merriënboer & Kirschner, 2013).

Formative assessment of complex skills

To support complex skills acquisition under limited teacher guidance, we need more effective formative assessment methods (Boud & Molloy, 2013) based on proven instructional design guidelines. During skills acquisition, students perceive specific feedback as more adequate than non-specific feedback (Shute, 2008). Adequate feedback should inform students about (i) their task performance, (ii) their progress towards intended learning goals, and (iii) what they should do to progress further towards those goals (Hattie & Timperley, 2007; Narciss, 2008). Students receiving specific feedback on criteria and performance levels become equipped to improve their oral presentation skills (De Grez et al., 2009a; Ritchie, 2016). Analytic rubrics are therefore promising formats for providing specific feedback on oral presentations, because they can demonstrate the relations between subskills and explain the open-endedness of ideal presentations (through textual descriptions and their graphical design).

Ritchie ( 2016 ) showed that adding structure and self-assessment to peer- and teacher-assessments resulted in better oral presentation performance. Students were required to use analytic rubrics for self-assessment when following their (project-based) classroom education. In this way, they had ample opportunity for observing and reflecting on (good) oral presentations attributes, which was shown to foster acquisition of their oral presentation skills.

Analytic rubrics incorporate performance criteria that inform teachers and students when preparing oral presentations. Such rubrics support mental model formation, and enable adequate feedback provision by teachers, peers, and students themselves (Brookhart & Chen, 2015; Jonsson & Svingby, 2007; Panadero & Jonsson, 2013). Research is inconclusive about the most effective formats and delivery media, but most studies dealt with analytic text-based rubrics delivered on paper. However, digital video-enhanced analytic rubrics are expected to be more effective for acquiring oral presentation skills, since many behavioral aspects refer to non-verbal actions and processes that can only be captured on video (e.g., body posture or use of voice during a presentation).

This study is situated within the Viewbrics project, in which video-modelling examples are integrated with analytic text-based rubrics (Ackermans et al., 2019a). Video-modelling examples contain question prompts that illustrate behavior associated with (sub)skill performance levels in context, and are presented by young actors the target group can identify with. The question prompts require students to link behavior to performance levels and build a coherent picture of the (sub)skills and levels. To the best of the authors’ knowledge, no previous studies exist on such video-enhanced analytic rubrics. The Viewbrics tool has been incrementally developed and validated with teachers and students to structure the formative assessment method in classroom settings (Rusman et al., 2019).

The purpose of our study is twofold. On the one hand, it investigates whether the application of evidence-based design guidelines results in a more effective formative assessment method in classroom. On the other hand, it investigates (within that method) whether video-enhanced analytic rubrics are more effective than text-based analytic rubrics.

Research questions

The twofold purpose of this study is reflected in two research questions: (1) To what extent do analytic rubrics within formative assessment lead to better oral presentation performance? (the design-based part of this study); and (2) To what extent do video-enhanced analytic rubrics lead to better oral presentation performance (growth) than text-based analytic rubrics? (the experimental part of this study). We hypothesize that all students will improve their oral presentation performance over time, but that students in the experimental groups (receiving analytic rubrics designed according to proven design guidelines) will outperform the control group (receiving conventional rubrics) (Hypothesis 1). Furthermore, we expect the experimental group using video-enhanced rubrics to achieve more performance growth than the experimental group using text-based rubrics (Hypothesis 2).

After this introduction, the second section describes previous research on design guidelines that were applied to develop the analytic rubrics in the present study. The actual design, development and validation of these rubrics is described in “ Development of analytic rubrics tool ” section. “ Method ” section describes the experimental method of this study, whereas “ Results ” section reports its results. Finally, in the concluding “ Conclusions and discussion ” section, main findings and limitations of the study are discussed, and suggestions for future research are provided.

Previous research and design guidelines for formative assessment with analytic rubrics

Analytic rubrics are inextricably linked with assessment, either summative (for final grading of learning products) or formative (for scaffolding learning processes). They provide textual descriptions of skills’ mastery levels with performance indicators that describe concrete behavior for all constituent subskills at each mastery level (Allen & Tanner, 2006 ; Reddy, 2011 ; Sluijsmans et al., 2013 ) (see Figs.  1 and 2 in “ Development of analytic rubrics tool ” section for an example). Such performance indicators specify aspects of variation in the complexity of a (sub)skill (e.g., presenting for a small, homogeneous group as compared to presenting for a large heterogeneous group) and related mastery levels (Van Merriënboer & Kirschner, 2013 ). Analytic rubrics explicate criteria and expectations, can be used to check students’ progress, monitor learning, and diagnose learning problems, either by teachers, students themselves or by their peers (Rusman & Dirkx, 2017 ).

Fig. 1 Subskills for oral presentation assessment

Fig. 2 Specification of performance levels for criterion 4

Several motives for deploying analytic rubrics in education can be distinguished. A review study by Panadero and Jonsson (2013) identified the following motives: increasing transparency, reducing anxiety, aiding the feedback process, improving student self-efficacy, and supporting student self-regulation. Analytic rubrics also improve reliability among teachers when rating their students (Jonsson & Svingby, 2007). Evidence has shown that analytic rubrics can enhance student performance and learning when used for formative assessment purposes in combination with metacognitive activities, such as reflection and goal-setting, but research shows mixed results about their learning effectiveness (Panadero & Jonsson, 2013).

It remains unclear what is exactly needed to make their feedback effective (Reddy & Andrade, 2010 ; Reitmeier & Vrchota, 2009 ). Apparently, transparency of assessment criteria and learning goals (i.e., make expectations and criteria explicit) is not enough to establish effectiveness (Wöllenschläger et al., 2016 ). Several researchers stressed the importance of how and which feedback to provide with rubrics (Bower et al., 2011 ; De Grez et al., 2009b ; Kerby & Romine, 2009 ). We now continue this section by reviewing design guidelines for analytic rubrics we encountered in literature, and then specifically address what literature mentions about the added value of video-enhanced rubrics.

Design guidelines for analytic rubrics

Effective formative assessment methods for oral presentation and analytic rubrics should be based on proven instructional design guidelines (Van Ginkel et al., 2015 ). Table 1 presents an overview of (seventeen) guidelines on analytic rubrics we encountered in literature. Guidelines 1–4 inform us how to use rubrics for formative assessment; Guidelines 5–17 inform us how to use rubrics for instruction, with Guidelines 5–9 on a rather generic, meso level and Guidelines 10–17 on a more specific, micro level. We will now shortly describe them in relation to oral presentation skills.

Guideline 1: use analytic rubrics instead of rating scale rubrics if rubrics are meant for learning

Conventional rating-scale rubrics are easy to generate and use, as they contain a score for each performance criterion (e.g., on a 5-point Likert scale). However, since the performance levels are not clearly described or operationalized, rating can suffer from rater subjectivity, and rating scales do not provide students with unambiguous feedback (Suskie, 2009). Analytic rubrics can address those shortcomings, as they contain brief textual performance descriptions for all subskills, criteria, and performance levels of complex skills like presentation, but they are harder to develop and score (Bargainnier, 2004; Brookhart, 2004; Schreiber et al., 2012).
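
The structural difference the guideline describes can be made concrete as data: a rating-scale rubric stores only a number per criterion, while an analytic rubric attaches a behavioral descriptor to every criterion-level pair, so a score can be translated back into feedback. A minimal sketch (criterion names and descriptors are invented for illustration, not taken from the Viewbrics rubrics):

```python
# A rating-scale rubric: just a Likert score per criterion, no descriptions.
rating_scale_scores = {"eye contact": 4, "structure": 2}

# An analytic rubric: every (criterion, level) pair carries a concrete
# performance descriptor, so a numeric score maps to unambiguous feedback.
analytic_rubric = {
    "eye contact": {
        4: "Looks at the whole audience throughout; notes used only as prompts.",
        3: "Mostly faces the audience, occasionally reads from notes.",
        2: "Frequently reads from notes or slides.",
        1: "Reads the presentation word-for-word.",
    },
}

def feedback(rubric, scores):
    """Translate numeric scores into the descriptor of the awarded level."""
    return {c: rubric[c][level] for c, level in scores.items() if c in rubric}

print(feedback(analytic_rubric, {"eye contact": 3}))
```

With the rating-scale dictionary alone, a student only learns "eye contact: 2"; the analytic structure is what makes the feedback specific.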

Guideline 2: use self-assessment via rubrics for formative purposes

Analytic rubrics can encourage self-assessment and -reflection (Falchikov & Boud, 1989 ; Reitmeier & Vrchota, 2009 ), which appears essential when practicing presentations and reflecting on other presentations (Van Ginkel et al., 2017 ). The usefulness of self-assessment for oral presentation was demonstrated by Ritchie’s study ( 2016 ), but was absent in a study by De Grez et al. ( 2009b ) that used the same rubric.

Guideline 3: use peer-assessment via rubrics for formative purposes

Peer-feedback is more (readily) available than teacher-feedback, and can be beneficial for students’ confidence and learning (Cho & Cho, 2011 ; Murillo-Zamorano & Montanero, 2018 ), also for oral presentation (Topping, 2009 ). Students positively value peer-assessment if the circumstances guarantee serious feedback (De Grez et al., 2010 ; Lim et al., 2013 ). It can be assumed that using analytic rubrics positively influences the quality of peer-assessment.

Guideline 4: provide rubrics for usage by self, peers, and teachers as students appreciate rubrics

Students appreciate analytic rubrics because they support them in their learning, in their planning, in producing higher quality work, in focusing efforts, and in reducing anxiety about assignments (Reddy & Andrade, 2010 ), aspects of importance for oral presentation. While students positively perceive the use of peer-grading, the inclusion of teacher-grades is still needed (Mulder et al., 2014 ) and most valued by students (Ritchie, 2016 ).

Guidelines 5–9

Heitink et al. ( 2016 ) carried out a review study identifying five relevant prerequisites for effective classroom instruction on a meso-level when using analytic rubrics (for oral presentations): train teachers and students in using these rubrics, decide on a policy of their use in instruction, while taking school- and classroom contexts into account, and follow a constructivist learning approach. In the next section, it is described how these guidelines were applied to the design of this study’s classroom instruction.

Guidelines 10–17

The review study by Van Ginkel et al. (2015) presents a comprehensive overview of effective factors for oral presentation instruction in higher education on a micro-level. Although our research context is secondary education, the findings of that study seem highly applicable, as they are rooted in firmly researched and well-documented instructional design approaches. Their guidelines pertain to (a) instruction, (b) learning, and (c) assessment in the learning environment (Biggs, 2003). The next section describes how these guidelines were applied to the design of this study’s online Viewbrics tool.

Video-enhanced rubrics

Early analytic rubrics for oral presentations were all text-based descriptions. This study assumes that such analytic rubrics may fall short when used for learning to give oral presentations, since much of the required performance refers to motoric activities and time-consecutive operations and processes that can hardly be captured in text (e.g., body posture or use of voice during a presentation). Text-based rubrics also have limited capacity to convey contextualized and more ‘tacit’ behavioral aspects (O’Donovan et al., 2004), since ‘tacit knowledge’ (or ‘knowing how’) is interwoven with practical activities, operations, and behavior in the physical world (Westera, 2011). Finally, text leaves more space for personal interpretation (of performance indicators) than video, which negatively influences mental model formation and feedback consistency (Lew et al., 2010).

We can therefore expect video-enhanced rubrics to overcome such restrictions, as they integrate modelling examples with analytic text-based explanations. The video-modelling examples and their embedded question prompts can illustrate behavior associated with performance levels in context, and contain information in different modalities (moving images, sound). Video-enhanced rubrics foster learning from active observation of video-modelling examples (De Grez et al., 2014; Rohbanfard & Proteau, 2013), especially when combined with textual performance indicators. Looking at effects of video-modelling examples, Van Gog et al. (2014) found increased task performance when a video-modelling example of an expert was also shown. De Grez et al. (2014) found comparable results for learning to give oral presentations. Teachers in training who assessed their own performance with video-modelling examples overrated their performance less than those without examples (Baecher et al., 2013). Research on mastering complex skills indicates that both modelling examples (in a variety of application contexts) and frequent feedback positively influence the learning process and skills acquisition (Van Merriënboer & Kirschner, 2013). Video-modelling examples not only capture the ‘know-how’ (procedural knowledge), but also elicit the ‘know-why’ (strategic/decisive knowledge).

Development of analytic rubrics tool

This section describes how design guidelines from previous research were applied in the actual development of the rubrics in the Viewbrics tool for our study, and then presents the subskills and levels for oral presentation skills as were defined.

Application of design guidelines

The previous section already mentioned that analytic rubrics should be restricted to formative assessment (Guidelines 2 and 3), and that there are good reasons to assume that a combination of teacher-, peer-, and self-assessment will improve oral presentations (Guidelines 1 and 4). Teachers and students were trained in rubric usage (Guidelines 5 and 7), and students were motivated to use the rubrics (Guideline 7). As the participating schools were already using analytic rubrics, a positive initial attitude could be assumed. Although the policy towards using analytic rubrics might not have been generally known on the work floor, the participating teachers in our study were knowledgeable (Guideline 6). We carefully considered the school context, as (a representative set of) secondary schools in the Netherlands were part of the Viewbrics team (Guideline 8). The formative assessment method was embedded within project-based education (Guideline 9).

Within this study, and on the micro-level of design, the learning objectives for the first presentation were clearly specified by lower performance levels, whereas advice on students’ second presentation focused on improving specific subskills that had been performed with insufficient quality during the first presentation (Guideline 10). Students carried out two consecutive projects of increasing complexity (Project 1, Project 2) with authentic tasks, amongst which the oral presentations (Guideline 11).

Students were provided with opportunities to observe peer-models to increase their self-efficacy beliefs and oral presentation competence; in our study, only students who received video-enhanced rubrics could observe videos with peer-models before their first presentation (Guideline 12). Students were allowed enough opportunities to rehearse their oral presentations, to increase their presentation competence and decrease their communication apprehension. Within our study, feedback could be provided on only two oral presentations, but students could rehearse as often as they wanted outside the classroom (Guideline 13).

We ensured that feedback in the rubrics was of high quality, i.e., explicit, contextual, adequately timed, and of suitable intensity for improving students’ oral presentation competence. Both experimental groups used digital analytic rubrics within the Viewbrics tool (teacher-, peer-, and self-feedback). The control group received feedback via a more conventional (rating-scale) rubric, and could therefore not use the formative assessment and reflection functions (Guideline 14). The setup of the study implied that peers played a major role during formative assessment in both experimental groups, because they formatively assessed each oral presentation using the Viewbrics tool (Guideline 15). The control group received feedback from their teacher.

Both experimental groups used the Viewbrics tool to facilitate self-assessment (Guideline 16); the control group did not receive analytic progress data to inform their self-assessment. Specific goal-setting within self-assessment has been shown to positively stimulate oral presentation performance, improve self-efficacy, and reduce presentation anxiety (De Grez et al., 2009a; Luchetti et al., 2003), so the Viewbrics tool was developed to support both specific goal-setting and self-reflection (Guideline 17).

Subskills and levels for oral presentation

Reddy and Andrade ( 2010 ) stress that rubrics should be tailored to the specific learning objectives and target groups. Oral presentations in secondary education (our study context) involve generating and delivering informative messages with attention to vocal variety, articulation, and non-verbal signals. In this context, message composition and message delivery are considered important (Quianthy, 1990 ). Strong arguments (‘logos’) have to be presented in a credible (‘ethos’) and exciting (‘pathos’) way (Baccarini & Bonfanti, 2015 ). Public speaking experts agree that there is not one right way to do an oral presentation (Schneider et al., 2017 ). There is agreement that all presenters need much practice, commitment, and creativity. Effective presenters do not rigorously and obsessively apply communication rules and techniques, as their audience may then perceive the performance as too technical or artificial. But all presentations should demonstrate sufficient mastery of elementary (sub)skills in an integrated manner. Therefore, such skills should also be practiced as a whole (including knowledge and attitudes), making the attainment of a skill performance level more than the sum of its constituent (sub)skills (Van Merriënboer & Kirschner, 2013 ). A validated instrument for assessing oral presentation performance is needed to help teachers assess and support students while practicing.

When we started developing rubrics with the Viewbrics tool (late 2016), there were no studies or validated measuring instruments for oral presentation performance in secondary education, although several schools used locally developed, non-validated assessment forms (i.e., conventional rubrics). Schreiber et al. (2012), for instance, had developed an analytic rubric for public speaking skills assessment in higher education, aimed at faculty members and students across disciplines. They identified eleven (sub)skills of public speaking, which could be subsumed under three factors (‘topic adaptation’, ‘speech presentation’ and ‘nonverbal delivery’, similar to logos-ethos-pathos).

Such previous work holds much value, but still had to be adapted and elaborated in the context of the current study. This study elaborated and evaluated eleven subskills that can be identified within the natural flow of an oral presentation and its distinctive features (See Fig.  1 for an overview of subskills, and Fig.  2 for a specification of performance levels for a specific subskill).

Between brackets are names of subskills as they appear in the dashboard of the Viewbrics tool (Fig.  3 ).

Fig. 3 Visualization of oral presentation progress and feedback in the Viewbrics tool

The upper part of Fig. 2 shows the scoring levels for first-year secondary school students for criterion 4 of the oral presentation assessment (four values, from more expert (4 points) to more novice (1 point), from right to left), an example of the conventional rating-scale rubrics. The lower part shows the corresponding screenshot from the Viewbrics tool, representing a text-based analytic rubric example. A video-enhanced analytic rubric example for this subskill provides a peer modelling the required behavior at expert level, with question prompts on selecting reliable and interesting materials. Performance levels were inspired by previous research (Ritchie, 2016; Schneider et al., 2017; Schreiber et al., 2012), but were also based on current secondary school practices in the Netherlands, and were developed and tested with secondary school teachers and their students.

All eleven subskills are scored on similar four-point Likert scales, and have equal weights in determining the total average score. Two pilot studies tested the usability, validity, and reliability of the assessment tool (Rusman et al., 2019). Based on this input, the final rubrics were improved, embedded in a prototype of the online Viewbrics tool, and used for this study. The formative assessment method consisted of six steps: (1) study the rubric; (2) practice and conduct an oral presentation; (3) conduct a self-assessment; (4) consult feedback from teacher and peers; (5) reflect on feedback; and (6) select personal learning goal(s) for the next oral presentation.
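
The scoring rule described above (eleven subskills, each rated 1–4 with equal weight, averaged into one total) can be sketched in a few lines. The score values below are invented for illustration:

```python
# Sketch of the scoring rule: eleven subskills, each rated on a 1-4 Likert
# scale with equal weight, averaged into a single total score.

N_SUBSKILLS = 11

def total_score(subskill_scores):
    if len(subskill_scores) != N_SUBSKILLS:
        raise ValueError(f"expected {N_SUBSKILLS} subskill scores")
    if any(not 1 <= s <= 4 for s in subskill_scores):
        raise ValueError("each subskill is scored on a 1-4 scale")
    # Equal weights: the total is the plain arithmetic mean.
    return sum(subskill_scores) / N_SUBSKILLS

scores = [3, 4, 2, 3, 3, 4, 3, 2, 3, 3, 4]  # one rating per subskill
print(f"total average score: {total_score(scores):.2f}")
```

Because the weights are equal, attainment of the overall level is the mean of the subskill ratings; the validation guards make the 1–4 scale and the eleven-subskill structure explicit.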

After the second project (Project 2), which used the same setup and assessment method as the first, students in the experimental groups could also see their visualized progress in the ‘dashboard’ of the Viewbrics tool (see Fig. 3, with English translations provided between brackets) by comparing performance on their two project presentations during the second reflection assignment. The dashboard shows progress (inner circles), with green reflecting improvement on a subskill, blue indicating a constant subskill, and red indicating a declining subskill. Feedback is provided by emoticons with text. Students’ personal learning goals after reflection are shown under ‘Mijn leerdoelen’ [My learning goals].
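
The dashboard’s color coding amounts to a simple mapping from score change to status. A hypothetical sketch (the function name, subskill labels, and scores are assumptions for illustration, not the tool’s implementation):

```python
# Traffic-light logic of the progress dashboard described above:
# green = subskill improved, blue = unchanged, red = declined.

def progress_color(first_score, second_score):
    if second_score > first_score:
        return "green"   # improvement between the two presentations
    if second_score < first_score:
        return "red"     # decline
    return "blue"        # constant

deltas = {"structure": (2, 3), "use of voice": (3, 3), "interaction": (4, 2)}
for subskill, (first, second) in deltas.items():
    print(subskill, "->", progress_color(first, second))
```
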

The previous sections described how design guidelines for analytic rubrics from literature (“ Previous research and design guidelines for formative assessment with analytic rubrics ” section) were applied in a formative assessment method with analytic rubrics (“ Development of analytic rubrics tool ” section). “ Method ” section describes this study’s research design for comparing rubric formats.

Research design of the study

All classroom scenarios followed the same lesson plan and structure for project-based instruction, and consisted of two projects with specific rubric feedback provided in between. Both experimental groups used the same formative assessment method with validated analytic rubrics, but differed in analytic rubric format (text-based vs. video-enhanced). The students of the control group did not use such a formative assessment method, and only received teacher feedback on these presentations (via a conventional rating-scale rubric consisting of a standard form with attention points for presentations, without further instructions). All three scenarios required similar time investments from students. Six school classes were randomly assigned to the three conditions, so students from the same class were in the same condition. Figure 4 graphically depicts an overview of the research design of the study.
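
The class-level randomization (six classes over three conditions, with students nested in classes) can be sketched as follows; the class labels and the seeding are invented for illustration:

```python
# Sketch: randomly assigning six classes to three conditions, two classes
# per condition, so all students in a class share a condition.
import random

classes = ["1A", "1B", "1C", "1D", "1E", "1F"]   # hypothetical labels
conditions = ["video-enhanced", "text-based", "control"]

random.seed(42)  # fixed seed only to make this example reproducible
shuffled = classes[:]
random.shuffle(shuffled)
# Consecutive pairs of the shuffled list go to the same condition.
assignment = {cls: conditions[i // 2] for i, cls in enumerate(shuffled)}
print(assignment)
```

Randomizing at the class level (rather than per student) is what keeps classmates in the same condition, at the cost of fewer independent units.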

Figure 4: Research design overview

A repeated-measures mixed ANOVA on oral presentation performance (growth) was carried out to analyze the data, with rubric format (three conditions) as between-groups factor and repeated measures (two moments) as within-groups factor. All statistical analyses were conducted with SPSS version 24.

Participants

Participants were first-year secondary school students (all within the 12–13 years range) from two Dutch schools, with participants equally distributed over schools and conditions ( n  = 166, with 79 girls and 87 boys). Classes were randomly allocated to conditions. Most participants completed both oral presentations ( n  = 158, so an overall response rate of 95%). Data were collected (almost) equally from the video-enhanced rubrics condition ( n  = 51), text-based condition ( n  = 57), and conventional rubrics (control) condition ( n  = 50).

A related study within the same context and with the same participants (Ackermans et al., 2019b ) analyzed the concept maps elicited from participants and found that their mental models (indicating mastery levels) for oral presentation were similar across conditions. From this finding we conclude that students possessed similar mental models of presentation skills before starting the projects. Results from the online questionnaire (“ Anxiety, preparedness, and motivation ” section) show that students in the experimental groups did not differ in anxiety, preparedness or motivation before their first presentation. Together with the teachers’ assessments of class similarity, this allows us to assume that students were comparable across conditions at the start of the experiment.

Materials and procedure

Teachers from both schools worked closely together to guarantee similar instruction and difficulty levels for both projects (Project 1, Project 2). Schools agreed to follow a standardized lesson plan for both projects and their oral presentation tasks. Core team members then developed (condition-specific) materials for teacher and student workshops on how to use rubrics and provide instructions and feedback (Guidelines 5 and 7). This also ensured that similar measures were taken for potential problems with anxiety, preparedness and motivation. Teachers received information about the (condition-specific) versions of the Viewbrics tool (see “ Development of analytic rubrics tool ” section). The core team consisted of three researchers and three (project) teachers, with one teacher also supervising the others. The teacher workshops were given by the supervising teacher and two researchers before student recruitment started.

Teachers estimated the similarity of all six classes with respect to students’ prior presentation skills before starting the first project. All classes were informed by an introduction letter from the core team and their teachers. Participation in this study was voluntary. Students and their parents/caretakers were informed about 4 weeks before the start of the first project, and received information on research-specific activities, time investment and schedule. Parents/caretakers signed an informed consent form on behalf of their minor children before the study started. All were informed that data would be anonymized for scientific purposes, and that students could withdraw at any time without giving reasons.

School classes were randomly assigned to conditions. Students in the experimental groups were informed that the usability of the Viewbrics tool for oral presentation skills acquisition was being investigated, but were left unaware of the different rubric formats. Students in the control group were informed that their oral presentation skills acquisition was being investigated. From all students, concept maps about oral presentation were elicited (reflecting their mental model and mastery level). Students participated in workshops (specific to their condition and provided by their teacher) on how to use rubrics and provide peer feedback (all materials remained available throughout the study).

Before giving their presentations on Project 1, students filled in the online questionnaire via LimeSurvey. Peers and teachers in experimental groups provided immediate feedback on given presentations, and students immediately had to self-assess their own presentations (step 3 of the assessment method). Subsequently, students could view the feedback and ratings given by their teacher and peers through the tool (step 4), were asked to reflect on this feedback (step 5), and to choose specific goals for their second oral presentation (step 6). In the control group, students directly received teachers’ feedback (verbally) after completing their presentation, but did not receive any reflection assignment. Control group students used a standard textual form with attention points (conventional rating-scale rubrics). After giving their presentations on the second project, students in the experimental groups got access to the dashboard of the Viewbrics tool (see “ Development of analytic rubrics tool ” section) to see their progress on subskills. About a week after the classes had ended, some semi-structured interviews were carried out by one of the researchers. Finally, one of the researchers functioned as a hotline for teachers in case of urgent questions during the study, and randomly observed some of the lessons.

Measures and instruments

Oral performance scores on presentations were measured by both teachers and peers. A short online questionnaire (with 6 items) was administered to students just before their first oral presentation at the end of Project 1 (see Fig.  4 ). Interviews were conducted with both teachers and students at the end of the intervention to collect more qualitative data on subjective perceptions.

Oral presentation performance

Students’ oral presentation performance progress was measured by comparing the performance scores on both oral presentations (with three months in between). Both presentations were scored by teachers using the video-enhanced rubric in all groups (half of the score in the experimental groups, the full score in the control group). For participants in both experimental groups, oral presentation performance was also scored by peers and by the students themselves, using the condition-specific rubric version (either video-enhanced or text-based) (the other half of the score). For each of the eleven subskills, between 1 point (novice level) and 4 points (expert level) could be earned, with a maximum of 44 points for the total performance score. For participants in the control group, the same scale applied, but no scores were given by peers or by the students themselves. The inter-rater reliability of assessments between teachers and peers was acceptable (Cohen’s Kappa = 0.74).
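The inter-rater agreement statistic reported above can be recomputed from two raters' raw category assignments. Below is a minimal stdlib Python sketch of Cohen's Kappa; the teacher and peer scores are illustrative made-up data, not the study's actual ratings:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categories to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from each rater's marginals.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative 1-4 subskill scores from a teacher and a peer rater
teacher = [4, 3, 3, 2, 4, 3, 2, 1, 3, 4, 2]
peer    = [4, 3, 2, 2, 4, 3, 2, 1, 3, 3, 2]
print(round(cohens_kappa(teacher, peer), 2))  # → 0.74
```

With perfect agreement the function returns 1.0; values around 0.7 are conventionally read as substantial agreement.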

Anxiety, preparedness, and motivation

Just before presenting, students answered the short questionnaire with five-point Likert scores (from 0 = totally disagree to 4 = totally agree) as an additional control for potential differences in anxiety, preparedness and motivation, since especially these factors might influence oral presentation performance (Reddy & Andrade, 2010 ). Notwithstanding this, teachers were the major source of control for similarity of conditions with respect to presentation anxiety, preparedness and motivation. The two items for anxiety were: “I find it exciting to give a presentation” and “I find it difficult to give a presentation”, a subscale with satisfactory internal reliability (Cronbach’s Alpha = 0.90). The three items for preparedness were: “I am well prepared to give my presentation”, “I have often rehearsed my presentation”, and “I think I’ve rehearsed my presentation enough”, a subscale with a satisfactory Cronbach’s Alpha = 0.75. The single item for motivation was: “I am motivated to give my presentation”. Unfortunately, the online questionnaire was not administered within the control group, due to unforeseen circumstances.
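The internal-reliability figures cited above (Cronbach's Alpha = 0.90 and 0.75) come from the standard alpha formula, which compares summed item variances with the variance of the total scale score. A stdlib Python sketch on invented Likert answers (not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha. item_scores: one inner list per item,
    aligned across the same respondents."""
    k = len(item_scores)
    # Sum of the individual item variances.
    item_var = sum(pvariance(item) for item in item_scores)
    # Variance of each respondent's total score across all items.
    totals = [sum(vals) for vals in zip(*item_scores)]
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Illustrative 0-4 answers from seven respondents to the two anxiety items
anxiety_items = [
    [0, 1, 2, 3, 4, 1, 2],  # "I find it exciting to give a presentation"
    [0, 1, 3, 3, 4, 1, 2],  # "I find it difficult to give a presentation"
]
print(round(cronbach_alpha(anxiety_items), 2))  # → 0.98
```

Two perfectly correlated items yield alpha = 1.0; subscales above roughly 0.7 are conventionally treated as internally consistent, matching the paper's reading of 0.90 and 0.75 as satisfactory.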

Semi-structured interviews with teachers (six) and students (thirty) were meant to gather qualitative data on the practical usability and usefulness of the Viewbrics tool. Examples of questions are: “Have you encountered any difficulties in using the Viewbrics online tool? If any, could you please mention which one(s)” (both students of experimental groups and teachers); “Did the feedback help you to improve your presentation skills? If not, what feedback do you need to improve your presentation skills?” (just students); “How do you evaluate the usefulness of formative assessment?” (both students and teachers); “Would you like to organize things differently in applying formative assessment as during this study? If so, what would you like to organize different?” (just teachers); “How much time did you spend on providing feedback? Did you need more or less time than before?” (just teachers).

Interviews with teachers and students revealed that the reported rubrics approach was easy to use and useful within the formative assessment method. Project teachers could easily stick to the lesson plans as agreed upon in advance. However, project teachers regarded the classroom scenarios as relatively time-consuming, and expected that it might be challenging for some other schools to follow the Viewbrics approach. None of the project teachers had to consult the hotline during the study, and no deviations from the lesson plans were observed by the researchers.

Results

The most important results on the performance measures and the questionnaire are presented below and compared between conditions.

A mixed ANOVA, with oral presentation performance as within-subjects factor (two scores) and rubric format as between-subjects factor (three conditions), revealed an overall significant improvement of oral presentation performance over time, with F (1, 157) = 58.13, p  < 0.01, η p 2  = 0.27. Significant differences over time were also found between conditions, with F (2, 156) = 17.38, p  < 0.01, η p 2  = 0.18. Tests of between-subjects effects showed significant differences between conditions, with F (2, 156) = 118.97, p  < 0.01, η p 2  = 0.59, with both experimental groups outperforming the control group as expected (so we could accept H1). However, only control group students showed significant progress on performance scores over time (at the 0.01 level). At both measurements, no significant differences between the experimental groups were found, contrary to expectations (so we had to reject H2). For descriptives of group averages (over time) see Table 2 .
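The partial eta squared effect sizes reported with each F test follow directly from the F statistic and its degrees of freedom via η_p² = F·df1 / (F·df1 + df2). A quick stdlib check against the first two figures above:

```python
def partial_eta_squared(f, df1, df2):
    """Partial eta squared from an F statistic and its degrees of freedom."""
    return (f * df1) / (f * df1 + df2)

# F values and degrees of freedom as reported for the mixed ANOVA
print(round(partial_eta_squared(58.13, 1, 157), 2))  # within-subjects effect → 0.27
print(round(partial_eta_squared(17.38, 2, 156), 2))  # condition-by-time effect → 0.18
```

This back-of-the-envelope check is useful when reading any ANOVA table: the effect size is fully determined by F and the two df values, so a mismatch signals a reporting error.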

A post-hoc analysis, using multiple pairwise comparisons with Bonferroni correction, confirms that the experimental groups significantly ( p  < 0.01) outperform the control group at both moments in time, and that the two experimental groups do not differ significantly at either measurement. Regarding performance progress over time, only the control group shows significant growth (again with p < 0.01). The difference between experimental groups in favour of video-enhanced rubrics did ‘touch upon’ significance ( p  = 0.053), but formally H2 had to be rejected. This finding is, however, a promising trend to be explored further with larger numbers of participants.
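The Bonferroni correction used in the post-hoc analysis simply multiplies each raw p-value by the number of comparisons made, capping the result at 1. A stdlib sketch with invented p-values (not the study's):

```python
def bonferroni_adjust(p_values):
    """Bonferroni-adjusted p-values: multiply by the number of comparisons, cap at 1."""
    m = len(p_values)
    return [min(p * m, 1.0) for p in p_values]

# Illustrative raw p-values for three pairwise comparisons
raw = [0.004, 0.02, 0.7]
print([round(p, 3) for p in bonferroni_adjust(raw)])  # → [0.012, 0.06, 1.0]
```

The correction controls the family-wise error rate: a comparison counts as significant at α = 0.01 only if its adjusted p-value stays below 0.01, which is the criterion applied to the pairwise comparisons above.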

An independent t-test comparing participants in the two experimental groups before their first presentation showed no differences in anxiety, preparedness, or motivation, with t (98) = 1.32 and p  = 0.19 for anxiety, t (98) = − 0.14 and p  = 0.89 for preparedness, and t (98) = − 1.24 and p  = 0.22 for motivation (see Table 3 for group averages).
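The t statistic behind such a comparison can be computed with the pooled-variance formula in stdlib Python (the p-value would additionally require the t distribution's CDF, e.g. from SciPy, so only the statistic and its degrees of freedom are sketched here). The two samples below are invented Likert scores, not the study's data:

```python
from statistics import mean, variance

def independent_t(sample_a, sample_b):
    """Pooled-variance two-sample t statistic and its degrees of freedom."""
    na, nb = len(sample_a), len(sample_b)
    df = na + nb - 2
    # Pooled variance weights each group's sample variance by its df.
    pooled = ((na - 1) * variance(sample_a) + (nb - 1) * variance(sample_b)) / df
    t = (mean(sample_a) - mean(sample_b)) / (pooled * (1 / na + 1 / nb)) ** 0.5
    return t, df

# Illustrative 0-4 Likert scores from two small groups
t, df = independent_t([2, 3, 3, 4, 2], [1, 2, 3, 2, 2])
print(round(t, 2), df)  # → 1.63 8
```

Note the degrees of freedom: with roughly 100 questionnaire respondents across the two experimental groups, df = n1 + n2 − 2 = 98, matching the tests reported above.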

As mentioned in the previous section (interviews with teachers), teachers assessed that presentation anxiety, preparedness and motivation in the control group did not differ from the experimental groups. It can therefore be assumed that all groups were similar regarding presentation anxiety, preparedness and motivation before presenting, and that these factors did not confound the oral presentation results. Questionnaire data are missing from 58 respondents: one in the video-enhanced condition, seven in the text-based condition, and fifty in the control group (where the questionnaire was not administered).

Conclusions and discussion

The first purpose was to study whether applying evidence-informed design guidelines in the development of formative assessment with analytic rubrics supports the oral presentation performance of first-year secondary school students in the Netherlands. Students who used such validated rubrics indeed outperformed students using common rubrics (so H1 could be accepted). This study has demonstrated that the design guidelines can also be effectively applied in secondary education, which makes them more generic. The second purpose was to study whether video-enhanced rubrics would be more beneficial to oral presentation skills acquisition than text-based rubrics, but we did not find significant differences here (so H2 had to be rejected). However, post-hoc analysis shows that the growth in performance scores over time indeed seems higher when using video-enhanced rubrics, a promising difference that is only marginally significant. Preliminary qualitative findings from the interviews indicate that the Viewbrics tool can be easily integrated into classroom instruction and appears usable for the target audiences (both teachers and students), although teachers state that it is rather time-consuming to conform to all guidelines.

All students had prior experience with oral presentations (from primary school) and relatively high oral presentation scores at the start of the study, so there was limited room for improvement between their first and second oral presentation. Participants in the control group scored relatively low on their first presentation, and thus had more room for improvement during the study. In addition, the somewhat more difficult content of the second project (Guideline 11) might have slightly reduced the quality of the second oral presentation. Also, more intensive training, additional presentations and their assessments might have demonstrated more added value of the analytic rubrics. Learning might still have occurred, since adequate mental models of skills are not automatically applied during performance (Ackermans et al., 2019b ).

A first limitation (and strength at the same time) of this study was its contextualization within a specific subject domain and educational sector over a longer period of time, which implies that we cannot completely exclude the influence of some confounding factors. A second limitation is that the Viewbrics tool has been specifically designed for formative assessment, and is not meant for summative assessment purposes. Although our study revealed the inter-rater reliability of our rubrics to be satisfactory (see “ Measures and instruments ” section), it is likely to become lower and less suitable when compared to more traditional summative assessment methods (Jonsson & Svingby, 2007 ). Thirdly, a reliable rubric alone provides no evidence of content validity (representativeness, fidelity of the scoring structure to the construct domain) or generalizability to other domains and educational sectors (Jonsson & Svingby, 2007 ). Fourth, one might criticize the practice-based research design of our study, as it is less controlled than laboratory studies. We acknowledge that the application of more unobtrusive and objective measures to better understand the complex relationship between instructional characteristics, student characteristics, and cognitive learning processes and strategies would best be achieved in a combination of laboratory research and practice-based research. Notwithstanding some of these issues, we deliberately chose design-based research and evidence-informed findings from educational practice.

Future research could examine the Viewbrics approach to formative assessment for oral presentation skills in different contexts (other subject matters and educational sectors). The Viewbrics tool could be extended with functions for self-assessment (e.g., recording and replaying one's own presentations), for coping with speech anxiety (Leary & Kowalski, 1995 ), and for goal-setting (De Grez et al., 2009a ). As this is a first study on video-enhanced rubrics, more fine-grained and fundamental research into their beneficial effects on cognitive processes is needed, also to justify the additional development costs: video-enhanced rubrics are more costly to develop than text-based rubrics. Another line of research might be directed at developing multiple measures for objectively determining oral presentation competence, for example using sensor-based data gathering and algorithms for guidance and meaningful interpretation (Schneider et al., 2017 ), or direct measures of cortisol levels for speaking anxiety (Bartholomay & Houlihan, 2016 ; Merz & Wolf, 2015 ). Other instructional strategies might also be considered; for example, repeated practice of the same oral presentation might result in performance improvement, as suggested by Ritchie ( 2016 ). This would also make it possible to de-emphasize presentation content and put more focus on presentation delivery. Finding good instructional technologies to support complex oral presentation skills will remain important throughout the twenty-first century and beyond.

References

Ackermans, K., Rusman, E., Brand-Gruwel, S., & Specht, M. (2019a). Solving instructional design dilemmas to develop Video-Enhanced Rubrics with modeling examples to support mental model development of complex skills: The Viewbrics-project use case. Educational Technology Research & Development, 67 (4), 993–1002.


Ackermans, K., Rusman, E., Nadolski, R. J., Brand-Gruwel, S., & Specht, M. (2019b). Video-or text-based rubrics: What is most effective for mental model growth of complex skills within formative assessment in secondary schools? Computers in Human Behavior, 101 , 248–258.

Allen, D., & Tanner, K. (2006). Rubrics: Tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE Life Sciences Education, 5 (3), 197–203.

Baccarini, C., & Bonfanti, A. (2015). Effective public speaking: A conceptual framework in the corporate-communication field. Corporate Communications, 20 (3), 375–390.

Baecher, L., Kung, S. C., Jewkes, A. M., & Rosalia, C. (2013). The role of video for self-evaluation in early field experiences. Teaching and Teacher Education, 36 , 189–197.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory . Prentice-Hall.

Bargainnier, S. (2004). Fundamentals of rubrics. In D. Apple (Ed.), Faculty guidebook (pp. 75–78). Pacific Crest.

Bartholomay, E. M., & Houlihan, D. D. (2016). Public Speaking Anxiety Scale: Preliminary psychometric data and scale validation. Personality and Individual Differences, 94 , 211–215.

Biggs, J. (2003). Teaching for quality learning at University . Society for Research in Higher Education and Open University Press.

Boud, D., & Molloy, E. (2013). Rethinking models of feedback for learning: The challenge of design. Assessment & Evaluation in Higher Education, 38 (6), 698–712.

Bower, M., Cavanagh, M., Moloney, R., & Dao, M. (2011). Developing communication competence using an online Video Reflection system: Pre-service teachers’ experiences. Asia-Pacific Journal of Teacher Education, 39 (4), 311–326.

Brookhart, S. M. (2004). Assessment theory for college classrooms. New Directions for Teaching and Learning, 100 , 5–14.

Brookhart, S. M., & Chen, F. (2015). The quality and effectiveness of descriptive rubrics. Educational Review, 67 (3), 343–368.

Cho, Y. H., & Cho, K. (2011). Peer reviewers learn from giving comments. Instructional Science, 39 (5), 629–643.

De Grez, L., Valcke, M., & Berings, M. (2010). Peer assessment of oral presentation skills. Procedia Social and Behavioral Sciences, 2 (2), 1776–1780.

De Grez, L., Valcke, M., & Roozen, I. (2009a). The impact of goal orientation, self-reflection and personal characteristics on the acquisition of oral presentation skills. European Journal of Psychology of Education, 24 (3), 293–306.

De Grez, L., Valcke, M., & Roozen, I. (2009b). The impact of an innovative instructional intervention on the acquisition of oral presentation skills in higher education. Computers & Education, 53 (1), 112–120.

De Grez, L., Valcke, M., & Roozen, I. (2014). The differential impact of observational learning and practice-based learning on the development of oral presentation skills in higher education. Higher Education Research & Development, 33 (2), 256–271.

Falchikov, N., & Boud, D. (1989). Student self-assessment in higher education: A meta-analysis. Review of Educational Research, 59 (4), 395–430.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77 (1), 81–112.

Heitink, M. C., Van der Kleij, F. M., Veldkamp, B. P., & Schildkamp, K. (2016). A systematic review of prerequisites for implementing assessment for learning in classroom practice. Educational Research Review, 17 , 50–62.

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2 (2), 130–144.

Kerby, D., & Romine, J. (2009). Develop oral presentation skills through accounting curriculum design and course-embedded assessment. Journal of Education for Business, 85 (3), 172–179.

Leary, M. R., & Kowalski, R. M. (1995). Social anxiety . Guilford Press.

Lew, M. D. N., Alwis, W. A. M., & Schmidt, H. G. (2010). Accuracy of students’ self-assessment and their beliefs about its utility. Assessment and Evaluation in Higher Education, 35 (2), 135–156.

Lim, B. T., Moriarty, H., Huthwaite, M., Gray, L., Pullon, S., & Gallagher, P. (2013). How well do medical students rate and communicate clinical empathy? Medical Teacher, 35 , 946–951.

Luchetti, A. E., Phipps, G. L., & Behnke, R. R. (2003). Trait anticipatory public speaking anxiety as a function of self-efficacy expectations and self-handicapping strategies. Communication Research Reports, 20 (4), 348–356.

Merz, C. J., & Wolf, O. T. (2015). Examination of cortisol and state anxiety at an academic setting with and without oral presentation. The International Journal on the Biology of Stress, 18 (1), 138–142.

Morreale, S. P., & Pearson, J. C. (2008). Why communication education is important: Centrality of discipline in the 21st century. Communication Education, 57 , 224–240.

Mulder, R. A., Pearce, J. M., & Baik, C. (2014). Peer review in higher education: Student perceptions before and after participation. Active Learning in Higher Education, 15 (2), 157–171.

Murillo-Zamorano, L. R., & Montanero, M. (2018). Oral presentations in higher education: A comparison of the impact of peer and teacher feedback. Assessment & Evaluation in Higher Education, 43 (1), 138–150.

Narciss, S. (2008). Feedback strategies for interactive learning tasks. In J. M. Spector, M. D. Merrill, J. J. G. van Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 125–144). Lawrence Erlbaum Associates.

O’Donovan, B., Price, M., & Rust, C. (2004). Know what I mean? Enhancing student understanding of assessment standards and criteria. Teaching in Higher Education, 9 (3), 325–335.

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9 , 129–144.

Quianthy, R. L. (1990). Communication is life: Essential college sophomore speaking and listening competencies . National Communication Association.

Reddy, Y. M. (2011). Design and development of rubrics to improve assessment outcomes. Quality Assurance in Education, 19 (1), 84–104.

Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35 (4), 435–448.

Reitmeier, C. A., & Vrchota, D. A. (2009). Self-assessment of oral communication presentations in food science and nutrition. Journal of Food Science Education, 8 (4), 88–92.

Ritchie, S. M. (2016). Self-assessment of video-recorded presentations: Does it improve skills? Active Learning in Higher Education, 17 (3), 207–221.

Rohbanfard, H., & Proteau, L. (2013). Live versus video presentation techniques in the observational learning of motor skills. Trends in Neuroscience and Education, 2 , 27–32.

Rusman, E., & Dirkx, K. (2017). Developing rubrics to assess complex (generic) skills in the classroom: How to distinguish skills’ mastery Levels? Practical Assessment, Research & Evaluation. https://doi.org/10.7275/xfp0-8228


Rusman, E., Nadolski, R. J., & Ackermans, K. (2019). Students’ and teachers’ perceptions of the usability and usefulness of the first Viewbrics-prototype: A methodology and online tool to formatively assess complex generic skills with video-enhanced rubrics in Dutch secondary education. In S. Draaijer, D. Joosten-ten Brinke, E. Ras (Eds), Technology enhanced assessment. TEA 2018. Communications in computer and information science (Vol. 1014, pp. 27–41). Springer, Cham.

Schneider, J., Börner, D., Van Rosmalen, P., & Specht, M. (2017). Presentation trainer: What experts and computers can tell about your nonverbal communication. Journal of Computer Assisted Learning, 33 (2), 164–177.

Schreiber, L. M., Paul, G. D., & Shibley, L. R. (2012). The development and test of the public speaking competence rubric. Communication Education, 61 (3), 205–233.

Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78 (1), 153–189.

Sluijsmans, D., Joosten-ten Brinke, D., & Van der Vleuten, C. (2013). Toetsen met leerwaarde [Assessments with value for learning] . The Hague, The Netherlands: NWO. Retrieved from https://sluijsmans.net/wp-content/uploads/2019/02/Toetsen-met-leerwaarde.pdf

Smith, C. M., & Sodano, T. M. (2011). Integrating lecture capture as a teaching strategy to improve student presentation skills through self-assessment. Active Learning in Higher Education, 12 , 151–162.

Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). Wiley.

Topping, K. (2009). Peer assessment. Theory into Practice, 48 (1), 20–27.

Van Ginkel, S., Gulikers, J., Biemans, H., & Mulder, M. (2015). Towards a set of design principles for developing oral presentation competence: A synthesis of research in higher education. Educational Research Review, 14 , 62–80.

Van Ginkel, S., Laurentzen, R., Mulder, M., Mononen, A., Kyttä, J., & Kortelainen, M. J. (2017). Assessing oral presentation performance: Designing a rubric and testing its validity with an expert group. Journal of Applied Research in Higher Education, 9 (3), 474–486.

Van Gog, T., Verveer, I., & Verveer, L. (2014). Learning from video modeling examples: Effects of seeing the human model’s face. Computers and Education, 72 , 323–327.

Van Merriënboer, J. J. G., & Kirschner, P. A. (2013). Ten steps to complex learning (2nd ed.). Lawrence Erlbaum.

Voogt, J., & Roblin, N. P. (2012). A comparative analysis of international frameworks for 21st century competences: Implications for national curriculum policies. Journal of Curriculum Studies, 44 , 299–321.

Westera, W. (2011). On the changing nature of learning context: Anticipating the virtual extensions of the world. Educational Technology and Society, 14 , 201–212.

Wöllenschläger, M., Hattie, J., Machts, N., Möller, J., & Harms, U. (2016). What makes rubrics effective in teacher-feedback? Transparency of learning goals is not enough. Contemporary Educational Psychology, 44–45 , 1–11.


Acknowledgements

Authors would like to thank the reviewers for their constructive comments on our paper and all students and teachers that participated in this study as well as the management from the participating schools.

The Viewbrics-project is funded by the practice-oriented research program of the Netherlands Initiative for Education Research (NRO), part of The Netherlands Organization for Scientific Research (NWO), under Grant Number: 405-15-550.

Author information

Authors and affiliations.

Faculty of Educational Sciences, Open University of the Netherlands, Valkenburgerweg 177, 6419 AT, Heerlen, The Netherlands

Rob J. Nadolski, Hans G. K. Hummel, Ellen Rusman & Kevin Ackermans


Corresponding author

Correspondence to Rob J. Nadolski .

Ethics declarations

Conflict of interest.

The authors declare that they have no conflict of interest.

Ethical approval

This research has been approved by the ethics committee of the author's institution (U2017/05559/HVM).

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Nadolski, R.J., Hummel, H.G.K., Rusman, E. et al. Rubric formats for the formative assessment of oral presentation skills acquisition in secondary education. Education Tech Research Dev 69 , 2663–2682 (2021). https://doi.org/10.1007/s11423-021-10030-7


Accepted : 03 July 2021

Published : 20 July 2021

Issue Date : October 2021

DOI : https://doi.org/10.1007/s11423-021-10030-7


  • Digital rubrics
  • Analytic rubrics
  • Oral presentation skills
  • Formative assessment method


Center for Excellence in Teaching


Group presentation rubric

This is a grading rubric an instructor uses to assess students’ work on this type of assignment. It is a sample rubric that needs to be edited to reflect the specifics of a particular assignment. Students can self-assess using the rubric as a checklist before submitting their assignment.



Opinion: Salt Lake City School District’s restroom presentation harms students. As nonbinary and transgender educators, we see the damage.

The directive to give presentations in every single classroom created widespread harm and put educators in precarious positions to balance their “professionalism” with their humanity.

(Rick Bowmer | AP) Bonneville Elementary School 5th grader Graham Beeton waves to fellow students during a block party supporting trans and nonbinary students and staff, Monday, April 29, 2024, in Salt Lake City. Utah will become the latest state to implement restrictions for transgender people using school bathrooms and locker rooms in public schools and government-owned buildings when key components of a law passed by the Republican-controlled Legislature take effect May 1.

On May 1, HB 257 went into effect in K-12 public schools. But for us, transgender and nonbinary educators in the Salt Lake City School District, the effect has been felt for weeks. We feel compelled to speak as private citizens, not as representatives of the district or the schools we work in, due to the impact on our queer and transgender community in Salt Lake City.

We chose to work in this district specifically because of its commitment to equity. Salt Lake City School District (SLCSD) has a vision statement that reads “Excellence and Equity: every student, every classroom, every day,” in addition to being the first “Dignity District” nationwide, with the commitment “to learning and work environments where everyone is treated with dignity.” Yet, SLCSD’s directive to have teachers give classroom presentations about HB 257 undermines its own commitment to dignity and equity, resulting in direct harm to transgender students, educators and their allies.

SLCSD created presentations for grades K-5 and 6-12 which stated that “new legislation requires us” to tell students they needed to use the bathroom of the gender they were assigned at birth. In the creation of these presentations, SLCSD failed to recognize the impact this would have on teachers, counselors, administrators and psychologists who were left with no support or resources to honor our own dignity. While the district’s recent newsletter stated that it would “remain committed to making sure our schools remain safe, welcoming places for all our students, families, and staff,” we were not provided with adequate means of support to do so.

Instead, we as transgender and nonbinary educators took on the labor to answer questions from coworkers on how to best support our students. With no district acknowledgment of the pain it caused us, transgender and nonbinary educators were also expected to give this presentation to our students, directly harming them and betraying our community when we know exactly what the implications are — 56% of transgender and nonbinary youth in Utah seriously considered suicide in 2022. The district did not recognize this potential for harm in any communications to faculty, staff or families.

We took it upon ourselves to create our own presentations filled with local and national resources, support systems and hotlines, crafted for both secondary and elementary communities — the resources to help fellow educators support impacted students and families that our district should have created. We shared these with the educators we could within our reach. Subsequently, we watched as some of our students’ fiercest allies and our schools’ kindest hearts experienced the chilling effect of the presentation: Pride flags, safe school posters and pronoun pins were fearfully stowed away. We continue to question: How does this make our district safe and welcoming?

In secondary schools, regardless of their gender identity, students questioned why this was the action the district decided to take when their friends in other districts were not subject to the same trauma and harm. Several stated they would rather the district make a presentation about vaping, drug use and truancy in the bathrooms instead — and stop targeting their trans and nonbinary peers. At the elementary level, young trans and nonbinary students already struggle to get peers and teachers to acknowledge and understand their identities. Many aren’t even “out,” choosing to be accepted simply for who they are without a label that could subject them to interrogation. Coupling this reality with the introduction of HB 257 means we have students of all ages, K-12, who had their right to use the bathroom become subject to classroom discussion.

We recognize that SLCSD needs to ensure it complies with state law and the specific directives of HB 257. However, the directive to give presentations in every single classroom created widespread harm and put educators in precarious positions to balance their “professionalism” with their humanity. SLCSD did not provide justification for choosing classroom presentations over the methods other districts in our state used to notify students of HB 257.

We call for SLCSD to recognize the impact that it had on transgender, nonbinary and allied educators, students, staff and administrators. The effects of this erasure by the district were unnecessary, undignified and dehumanizing. The presentations did not give us reason to believe our district’s stated values and vision statement.

Additionally, we need actionable, tangible changes to create safe, equitable and dignified environments for everyone who works and learns in SLCSD. We ask the district to contract a professional development session on LGBTQIA+ inclusion, preferably through a local organization. We also ask that SLCSD conduct a thorough audit of their policies in the wake of new federal Title IX guidance which states that sex-separated programs and activities, including bathrooms, cannot exclude a person from a space consistent with their gender identity.

We have seen displays of love and support for transgender and nonbinary students and staff both in and out of our buildings and recognize their positive impact. We hope that SLCSD can join in these displays and that these actions will ensure that our district’s transgender and nonbinary community never feels this way again.

(Photo courtesy of Rilee Pickle) Rilee Pickle

Mx. Rilee Pickle, MAT (they/she), is a high school teacher in Salt Lake City School District and a nonbinary lesbian. However, they wrote this as a private citizen and do not represent the district or the school they teach at.

(Photo courtesy of Breanna Taylor-Lof) Breanna Taylor-Lof

Mx. Breanna Taylor-Lof, M.Ed (they/them), is a transgender elementary school teacher in Salt Lake City School District. However, they wrote this as a private citizen and do not represent the district or the school they teach at.



Students score $1 million judgment after acne medication mistaken for blackface

by RAY LEWIS | Crisis in the Classroom

One of the photos of the boys using acne medication. (California Superior Court)

MOUNTAIN VIEW, Calif. (CITC) — Two California students accused of wearing blackface received $1 million on Monday in a settlement with their former high school, according to the Dhillon Law Group, which represented them.

Saint Francis High School purportedly expelled the students, referred to as A.H. and H.H., after photos showing them in acne face masks at 14 years old surfaced in 2020. Parents allegedly took issue with the pictures, which the students took three years earlier, and protested Saint Francis’s tolerance of what was deemed racially insensitive behavior.

The school forced the students to withdraw or face expulsion and denied their ability to appeal, according to the attorneys. Saint Francis Principal Katie Teekell allegedly told the students’ parents the expulsions were for “optics,” rather than “intent.”

“The jury’s verdict finally cleared our clients’ names after four long years of repeated personal attacks from St. Francis High School,” attorney Jarin Sweigart said. “Schools are supposed to protect and nurture children, not sacrifice them when it is convenient for public relations purposes.”

The settlement sets a precedent of allowing high school students to have a fair opportunity to respond to disciplinary actions, according to the attorneys. Saint Francis disputed the jury’s conclusion it had unfair disciplinary procedures.

“We respectfully disagree with the jury’s conclusion as to the lesser claim regarding the fairness of our disciplinary review process and are exploring legal options, including appeal as there is no legal precedent applying that claim to a high school,” the school told Crisis in the Classroom. “We look forward to putting this matter behind us.”

The jury rejected the students’ claims Saint Francis defamed and violated their right to free speech, according to the school. The students’ parents alleged their children suffered reputational damage as a result of Saint Francis’s actions.

“We would never wish the pain, humiliation, and suffering St. Francis has inflicted on our families on anyone, but we are thankful that the jury has spoken, and vindicated our boys, and forced St. Francis to finally take responsibility for their repeated personal attacks on the boys,” the parents said.

They argued the school also risked their children’s safety and health while jeopardizing the students’ ability to complete high school, gain admittance to a “suitable college” and compete in sports.

“Twenty percent of our boys’ lives have been spent seeing this process come to fruition,” the parents said. “But the sacrifice is worth it to clear our boys’ names, and to try to make sure that St. Francis can never again assume a child is guilty without giving a child the opportunity to show their innocence. To never again sacrifice any child to protect the school’s reputation like they did our boys.”



  1. PDF Oral Presentation Rubric

    Oral Presentation Rubric with four levels: 4—Excellent, 3—Good, 2—Fair, 1—Needs Improvement. At the Excellent level, Delivery holds the attention of the entire audience with the use of direct eye contact, seldom looking at notes, and speaks with fluctuation in volume and inflection to maintain audience interest and emphasize key points; the next level shows consistent use of direct eye contact with ...

  2. Oral Presentation Rubric

    The rubric allows teachers to assess students in several key areas of oral presentation. Students are scored on a scale of 1-4 in three major areas. The first area is Delivery, which includes eye contact, and voice inflection. The second area, Content/Organization, scores students based on their knowledge and understanding of the topic being ...

  3. PDF Expectation: The Hull High School Student will present information and

    Gestures are mostly visible, and the student's stance is somewhat controlled, with some distracting movements. The student uses a variety of facial expressions and maintains eye contact with the audience. Student is consistently audible and articulate. Intonation is varied.

  4. Rubric for Evaluating Student Presentations

    The rubric for evaluating student presentations is included as a download in this article. In addition, the criteria on the rubric are explained in detail.

  5. 15 Helpful Scoring Rubric Examples for All Grades and Subjects

    High School Rubric Examples. In high school, it's important to include your grading rubrics when you give assignments like presentations, research projects, or essays. ... Presentation Rubric. Analyze a student's presentation both for content and communication skills with a rubric like this one. If needed, create a separate one for content ...

  6. PDF Oral Presentation Evaluation Rubric

    Organization. Logical, interesting, clearly delineated themes and ideas. Generally clear, overall easy for audience to follow. Overall organized but sequence is difficult to follow. Difficult to follow, confusing sequence of information. No clear organization to material, themes and ideas are disjointed. Evaluation.

  7. Rubric Best Practices, Examples, and Templates

    Step 7: Create your rubric. Create your rubric in a table or spreadsheet in Word, Google Docs, Sheets, etc., and then transfer it by typing it into Moodle. You can also use online tools to create the rubric, but you will still have to type the criteria, indicators, levels, etc., into Moodle.

  8. PDF Student Presentation Scoring Guide and Rubric

    Introduction - Effectiveness of your brief initial discussion of the content, scope, and flow of your presentation. The top level reads "Your introduction makes perfectly clear the salient points and scope of your talk"; lower levels read "A bit too brief (or too long), or missing an important item" and "Contains some relevant information but not nearly enough."

  9. PDF Oral Presentation Rubric

    Oral Presentation Rubric. Holds attention of entire audience with the use of direct eye contact, seldom looking at notes. Consistent use of direct eye contact with audience, but still returns to notes. Displayed minimal eye contact with audience, while reading mostly from the notes. No eye contact with audience, as entire report is read from notes.

  10. PDF Rubrics for Assessing Student Writing, Listening, and Speaking

    High School Rubrics for Assessing Student Writing, Listening, & Speaking 3 • Many teachers model good papers, but few students are exposed to papers that contain various flaws that reduce the grade. The Glencoe Literature rubrics can help. Each column of the rubrics is ideal for effective, varied modeling. For example, a lesson on

  11. Creating an Oral Presentation Rubric

    Create a second list to the side of the board, called "Let it slide," asking students what, as a class, they should "let slide" in the oral presentations. Guide and elaborate, choosing whether to reject, accept, or compromise on the students' proposals. Distribute the two lists to students as-is as a checklist-style rubric or flesh ...

  12. How to (Effectively) Use a Presentation Grading Rubric

    To help students better understand the rubric, play a clip of a presentation and have students use the rubric to grade the video. Go over what grade students gave the presentation and why, based on the rubric's standards. Then explain how you would grade the presentation as an instructor. This will help your students internalize the rubric as ...

  13. Oral Presentation Rubric

    Use this FREE rubric to provide students with clear expectations for their next oral presentation or speech! Public speaking is difficult for everyone, especially students. Getting up and speaking in front of the class is the last thing many of our students want to do. Help students prepare for their speech or presentation with a clear set of ...

  14. PDF Oral Presentation Grading Rubric

    At the lowest level, the presenter mumbles, talks very fast, and speaks too quietly for a majority of students to hear and understand. Timing is scored 4 - Exceptional, 3 - Admirable, 2 - Acceptable, 1 - Poor; Length of Presentation earns a 4 within two minutes of the allotted time (+/-), a 3 within four minutes, and a 2 within six minutes.

  15. Rubric formats for the formative assessment of oral presentation skills

    Participants were first-year secondary school students in the Netherlands (n = 158) that acquired oral presentation skills with the support of either a formative assessment method with analytic rubrics offered through a dedicated online tool (experimental groups), or a method using more conventional (rating scales) rubrics (control group). One ...

  16. PDF Speech Rubric

    Clarity. Speaks clearly and distinctly all the time with no mispronounced words. Speaks clearly and distinctly nearly all the time with no more than one mispronounced word. Speaks clearly and distinctly most of the time with no more than two mispronounced words. Often mumbles or can not be understood with more than three mispronounced words.

  17. PDF Sample Rubric for PowerPoint Presentation

    Student was unable to complete presentation before the class. A sample rubric outlining the quality, content, and effectiveness expected of a well-constructed presentation as opposed to a weak one.

  18. Group presentation rubric

    This is a grading rubric an instructor uses to assess students' work on this type of assignment. It is a sample rubric that needs to be edited to reflect the specifics of a particular assignment. Students can self-assess using the rubric as a checklist before submitting their assignment. Download this file.

  19. Download Project Based Learning Rubrics

    This rubric describes beginning, developing, and Gold Standard levels for Project Based Teaching Practices for K-12 teachers and features detailed, concrete indicators that illustrate what it means to teach in a PBL environment. Teachers and school leaders can use this rubric to reflect on their practice and plan for professional growth.

  20. PDF Grading Rubric for PowerPoint Presentation

    Grading Rubric for PowerPoint Presentation, scored by category from 4 down to 1. For Sequencing of Information, level 4 reads "Information is organized in a clear, logical way. It is easy to anticipate the type of material that might be on the next slide"; level 3 reads "Most information is organized in a clear, logical way. One slide or item of information seems out of place."

  21. A Standardized Rubric to Evaluate Student Presentations

    Design. A 20-item rubric was designed and used to evaluate student presentations in a capstone fourth-year course in 2007-2008, and then revised and expanded to 25 items and used to evaluate student presentations for the same course in 2008-2009. Two faculty members evaluated each presentation.

  22. PDF Activity: Student Participation and Teaching / Teaching While Leading A

    In preparation for student participation. o. Assign a problem/a question that affords multiple answers and allows students to employ different reasoning strategies. o. Anticipate students' responses to the task and the difficulties that they may have. Plans appropriate strategies to support students' reasoning through the task accordingly.

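Several of the rubrics listed above share one pattern: a few analytic areas each scored 1-4, plus a timing score taken from +/- bands around the allotted time. A minimal sketch of that scoring logic; the area names and the unweighted sum are illustrative assumptions, not any single source rubric:

```python
# Sketch of the common rubric pattern above: analytic areas scored 1-4,
# plus a timing score from +/- bands around the allotted time.
# Area names and the unweighted sum are illustrative assumptions.

def timing_score(actual_min, allotted_min):
    """Map deviation from the allotted time onto the 4/3/2/1 bands."""
    deviation = abs(actual_min - allotted_min)
    if deviation <= 2:
        return 4  # Exceptional: within two minutes of allotted time
    if deviation <= 4:
        return 3  # Admirable: within four minutes
    if deviation <= 6:
        return 2  # Acceptable: within six minutes
    return 1      # Poor

def total_score(area_scores, actual_min, allotted_min):
    """Sum the 1-4 area scores and add the timing score."""
    for name, score in area_scores.items():
        if score not in (1, 2, 3, 4):
            raise ValueError(f"{name} must be scored 1-4, got {score}")
    return sum(area_scores.values()) + timing_score(actual_min, allotted_min)

areas = {"Delivery": 3, "Content/Organization": 4, "Visual Aids": 2}
print(total_score(areas, actual_min=13, allotted_min=10))  # 3+4+2+3 = 12
```

Because each criterion is scored independently, the same function works whether the rubric has three areas or ten; weighting a criterion (as in the scoring-guide example above) would just multiply its score before summing.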